In Defense of Inclusionism

It may take only a few restrictions before one has inched far enough along the barriers axis that contributions do in fact fall tenfold. One sees Wikipedia slowly adding restrictions:

Each of these steps seems harmless enough, perhaps, because we can’t see the things which do not happen as a result (this is a version of Frédéric Bastiat’s fallacy of the invisible). The legalistic motto that which is not explicitly permitted is forbidden has the virtue of being easy to apply, at least.

Few objected to the banning of anonymous page creation by Jimbo Wales during the Seigenthaler incident (we had to destroy the wiki to save it), and most of those objections were unprincipled ones. The objector was all for a tougher War on Drugs - er, I mean Terror, or was that Vandalism? (maybe Poverty) - but they didn’t want to be stampeded into it by some bad PR. Too, few objected to CAPTCHAs: take that, you scumbag spammers! The ironic thing is, as a fraction of edits, vandalism shrank from 2003-2008 (remaining roughly similar since), and similarly, users specializing in vandal fighting and their workload of edits have shrunk; graphing new contributions by size, one finds that for both registered and anonymous users, the apogee was 2007 and vandalism has been decreasing ever since. (A more ambiguous statistic is the reduced number of actions by new page patrollers.)

Who alive can say,
Thou art no Poet – may’st not tell thy dreams?
Since every man whose soul is not a clod
Hath visions, and would speak, if he had loved,
And been well nurtured in his mother tongue.


But by 2007 the water had become hot enough to be felt by devotees of modern fiction (that is, anime & manga franchises, video games, novels, etc.), and even the great Jimbo could not expect to see his articles go un-AfD’d.

But who really cares about what some nerds like? What matters is Notability with a capital N, and the fact that our feelings were hurt by some Wikigroaning! After all, clearly the proper way to respond to the observation that Lightsaber combat was longer than Sabre is to delete its contents and have people read the short, scrawny - but serious! - Lightsaber article instead.

If it doesn’t appear in Encarta or Encyclopedia Britannica, or isn’t treated at the same (proportional) length, then it must go!

Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.10

Deleting based on notability, fiction articles in particular, doesn’t merely ill-serve our readers (who are numerous; note how many of Wikipedia’s most popular pages are fiction-related, both now and in 2007 or 2011, or how many Internet searches lead to Wikipedia for cultural content11), but it also damages the community.

We can see it indirectly in the global statistics. The analyses (2007, 2008) show it. We are seeing fewer new editors, fewer new articles, fewer new images; less of everything, except tedium & bureaucracy.

Worse, it’s not just that Wikipedia’s growth has stopped accelerating in important metrics: the rate of increase has in some cases not merely leveled off, but started dropping!

“…the size of the active editing community of the English Wikipedia peaked in early 2007 and has declined somewhat since then. Like Wikipedia’s article count, the number of active editors grew exponentially during the early years of the project. The article creation rate (which is tracked at Wikipedia:Size of Wikipedia) peaked around August 2006 at about 2400 net new articles per day and has fallen since then, to around under 1400 in recent months. [The graph is mirrored at Andrew Lih’s Wikipedia Plateau?.]

User:MBisanz has charted the number of new accounts registered per month, which tells a very similar story: March 2007 recorded the largest number of new accounts, and the rate of new account creation has fallen significantly since then. Declines in activity have also been noted, and fretted about, at Wikipedia:Requests for adminship…."

This has been noted in multiple sources, such as Felipe Ortega’s 2009 thesis, Wikipedia: A Quantitative Analysis:

So far, our empirical analysis of the top ten Wikipedias has revealed that the stabilization of the number of contributions from logged authors in Wikipedia during 2007 has influenced the evolution of the project, breaking down the steady growing rate of previous years…

Unfortunately, this results raise several important concerns for the Wikipedia project. Though we do not have empirical data from 2008, the change in the trend of births and deaths [new & inactive editors] will clearly decrease the number of available logged authors in all language versions, thus cutting out the capacity of the project to effectively undertake revisions and improve contents. Even more serious is the slightly decreasing trend that is starting to appear in the monthly number of births of most versions. The rate of deaths, on the contrary, does not seem to leave its ascending tendency. Evaluating the results for 2008 will be a key aspect to validate the hypothesis that this trend has changed indeed, and that the Wikipedia project needs to put in practice more aggressive measures to attract new users, if they do not want to see the monthly effort decrease in due course, as a result of the lack of human authors.12

Ortega notes indications that this is a pathology unique to En:

In the first place, we note the remarkable difference between the English and the German language versions. The first one presents one of the worst survival curves in this series, along with the Portuguese Wikipedia, whereas the German version shows the best results until approximately 800 days. From that point on, the Japanese language version is the best one. In fact, the German, French, Japanese and Polish Wikipedias exhibit some of the best survival curves in the set, and only the English version clearly deviates from this general trend. The most probable explanation for this difference, taking into account that we are considering only logged authors in this analysis, is that the English Wikipedia receives contributions from too many casual users, who never come back again after performing just a few revisions.13

Erik Moeller of the WMF tried to wave away the results in November 2009 by pointing out that The number of people writing Wikipedia peaked about two and a half years ago, declined slightly for a brief period, and has remained stable since then, but he also shoots himself in the foot by pointing out that the number of articles keeps growing. That is not a sustainable disparity. Worse, as the original writers leave, their articles become legacy code - on which later editors must engage in archaeology, trying to retrieve the original references or understand why something was omitted, or must simply remove content because they do not understand the larger context or are ignorant. (I have had considerable difficulty answering some straightforward questions about errors in articles I researched and wrote entirely on my own; how well could a later editor have handled the questions?)

The numbers have been depressing ever since, from the 2010 informal & Foundation study14 on editor demographics to 2011 article contributions; the WSJ’s statistician Carl Bialik wrote in September 2011 that the number of editors is dwindling. Just 35,844 registered editors made five or more edits in June, down 34% from the March 2007 peak. Just a small share of Wikipedia editors - about 3% - account for 85% of the site’s activity, a potential problem, since participation by these heavy users has fallen even more sharply.

Only in 2010 and 2011 has the Foundation seemed to wake up and see what the numbers were saying all along; while Wales says some of the right things, like A lot of editorial guidelines…are impenetrable to new users, he also back-handedly dismisses the problem - We are not replenishing our ranks. It is not a crisis, but I consider it to be important. By December 2011, Sue Gardner seemed to reflect a more realistic view in the WMF, calling it the holy-shit slide; I think she is worth quoting at length to emphasize the issue. From The Gardner interview of 19 December 2011:

Much of the interview concerned the issues she raised in a landmark address in November to the board of Wikimedia UK, in which she said the slide showing a graph of declining editor retention (below) is what the Foundation calls the holy-shit slide. This is a huge, really really bad problem, she told Wikimedia UK, and is worst on the English and German Wikipedias.

A prominent issue on the English Wikipedia is whether attempts to achieve high quality in articles - and perceptions that this is entangled with unfriendly treatment of newbies by the community - are associated with low rates of attracting and retaining new editors. Although Gardner believes that high quality and attracting new editors are both critical goals, her view is that quality has not been the problem, although she didn’t define exactly what article quality is. What we didn’t know in 2007, she said, was that quality was doing fine, whereas participation was in serious trouble. The English Wikipedia was at the tail end of a significant drop in the retention of new editors: people were giving up the editing process more quickly than ever before.

Participation matters because it drives quality. People come and go naturally, and that means we need to continually bring in and successfully orient new people. If we don’t, the community will shrink over time and quality will suffer. That’s why participation is our top priority right now.

…Deletions and reversions might be distasteful to new editors, but how can we, for instance, maintain strict standards about biographies of living people (BLP) without reverting problematic edits and deleting inappropriate articles? Gardner rejected the premise:

I don’t believe that quality and openness are inherently opposed to each other. Openness is what enables and motivates people to show up in the first place. It also means we’ll get some bad faith contributors and some who don’t have the basic competence to contribute well. But that’s a reasonable price to pay for the overall effectiveness of an open system, and it doesn’t invalidate the basic premise of Wikipedia: that openness will lead to quality.

…While staking the Foundation’s claim to the more technical side of the equation, Gardner doesn’t shrink from providing advice on how we can fix the cultural problem.

If you look at new editors’ talk pages, they can be pretty depressing - they’re often an uninterrupted stream of warnings and criticisms. Experienced editors put those warnings there because they want to make Wikipedia better: their intent is good. But the overall effect, we know, is that the new editors get discouraged. They feel like they’re making mistakes, that they’re getting in trouble, people don’t want their help. And so they leave, and who can blame them? We can mitigate some of that by toning down the intimidation factor of the warnings: making them simpler and friendlier. We can also help by adding some praise and thanks into the mix. When the Foundation surveys current editors, they tell us one of the things they enjoy most about editing Wikipedia is when someone they respect tells them they’re doing a good job. Praise and thanks are powerful.

…[Around the time of the Seigenthaler and Essjay controversies] Jimmy went to Wikimedia and said quality … we need to do better, [and through the distortions of the ripple-effect in the projects] there was this moral panic created around quality … what Jimmy said gave a whole lot of people the license to be jerks. … Folks are playing Wikipedia like it’s a video game and their job is to kill vandals … every now and again a nun or a tourist wanders in front of the AK47 and gets murdered …

Many people have complained that Wikipedia patrollers and administrators have become insular and taken on a bunker mentality, driving new contributors away. Do you agree, and if so, how can this attitude be combated without alienating the current core contributors?

I wouldn’t characterize it as bunker mentality at all. It’s just a system that’s currently optimized for combating bad edits, while being insufficiently concerned with the well-being of new editors who are, in good faith, trying to help the projects. That’s understandable, because it’s a lot easier to optimize for one thing (no bad edit should survive for very long) than for many things (good edits should be preserved and built upon, new editors should be welcomed and coached, etc.). So I don’t think it’s an attitudinal problem, but more an issue of focusing energy now on re-balancing to ensure our processes for patrolling edits, deleting content, etc. are also designed to be encouraging and supportive of new people.

How can a culture that has a heavy status quo bias be changed? How can the community be persuaded to become less risk-averse?

My hope is that the community will become less risk-averse as the Foundation makes successful, useful interventions. I believe the Vector usability improvements are generally seen as successful, although they of course haven’t gone far enough yet. Wikilove is a small feature, but it’s been adopted by 13 Wikipedia language-versions, plus Commons. The article feedback tool is on the English Wikipedia and is currently being used in seven other projects. The new-editor feedback dashboard is live on the English and Dutch Wikipedias. New warning templates are being tested on the English and Portuguese Wikipedias. And the first opt-in user-facing prototype of the visual editor will be available within a few weeks. My hope is all this will create a virtuous circle: support for openness will begin to increase openness, which will begin to increase new editor retention, which will begin to relieve the workload of experienced editors, which will enable everyone to relax a little and allow for more experimentation and playfulness.

Regaining our sense of openness will be hard work: it flies in the face of some of our strongest and least healthy instincts as human beings. People find it difficult to assume good faith and to devolve power. We naturally put up walls and our brains fall into us-versus-them patterns. That’s normal. But we need to resist it. The Wikimedia projects are a triumph of human achievement, and they’re built on a belief that human beings are generally well-intentioned and want to help. We need to remember that and to behave consistently with it.

I am skeptical that Gardner’s initiatives will change the curves (although they are not bad ideas); my general belief is that deleting pages, and the omnipresent threat of deletion, are far more harmful than complex markup. (I should note that Gardner has read and praised this essay, but also that much of this essay is based on my feelings and may not generalize.)

Regardless of whether the WMF really understands the issue, it is almost unintentionally hilarious to look at the proposed solutions - for example, one amounts to restoring early Wikipedia culture & practices in private sandboxes, protected from the regulars & their guidelines! Band-aids like Wikilove or article rating buttons are not getting at the core of the problem; a community does not live on high-quality rating tools (Everything2) or die on poor ones (YouTube). The Foundation/developers sometimes do the right thing, like striking down an English Wikipedia consensus to restrict article creation even further, but will it be enough? To quote Carl Bialik again:

Adding more editors is one of our top priorities for the year, says Howie Fung, senior product manager for the Wikimedia Foundation, which aims to increase the number of editors across all languages of Wikipedia to 95,000 from 81,450 by June of next year.

The subsequent research has in some respects vindicated my views: some have tried to argue that the declines are due to picking all the low-hanging fruit in articles or in available editors, or that lower-quality editors merited the additional procedures. But what we see is not that new editors are worse or lower-quality: they are as high-quality and useful as they have been since 2006; nor is the decline due to a shrinking supply of new editors combined with better procedures for winnowing them out. From Kids these days: the quality of new Wikipedia editors over time (Research:Newcomer quality):

What we found was encouraging: the quality of new editors has not substantially changed since 2006. Moreover, both in the early days of Wikipedia and now, the majority of new editors are not out to obviously harm the encyclopedia (~80%), and many of them are leaving valuable contributions to the project in their first editing session (~40%). However, the rate of rejection of all good-faith new editors’ first contributions has been rising steadily, and, accordingly, retention rates have fallen. What this means is that while just as many productive contributors enter the project today as in 2006, they are entering an environment that is increasingly challenging, critical, and/or hostile to their work. These latter findings have also been confirmed through previous research.

(I am struck by the fall in newbie survival rates for the highest-quality - golden - editors in 2006-2007. The Seigenthaler affair was, recollect, November-December 2005.)

I suspected that Fung’s objective would not be reached, as indeed it was not15.

Remember, most measures are directed against casual users. Power users can navigate the endless processes, or call in powerful friends, or simply wait a few years16. The most powerful predictor of whether an editor will stop editing is… how much they are editing.17 User:Resident Mario (joined 2008) points in his December 2011 essay Openness versus quality: why we’re doing it wrong, and how to fix it18 to a dramatic graph of editor counts19:

[Graph: Active Wikipedians: Actual versus Strategy]

And it’s casual users who matter. We lost the credentialed experts years ago, if we ever had them. Surveys asking why are almost otiose; experts will contribute only if they are exceptional or if they are managing PR around a discovery. But Wikipedia is not Long Content; why would they contribute if they can get the traffic they desire just by inserting links20? Why would they build their intellectual houses on sand?21 They get the best of both worlds - gaining traffic and avoiding the toxic deletionists.

And we can see this quite directly: when the general population of editors gets solicited to contribute to AfD, their !votes are different from the AfD regulars’, and in particular, when keep !voters spread the word about an AfD, their recruits are much more likely to !vote keep as well, while would-be deleters do their cause no favor with publicity22. Can there be any more convincing proof that deletionism and its manifestations are a cancer on the Wikipedia corpus?

Having discussed the broad trend of deletionism and problems with editors, let’s look at one specific deletionist practice which has, as far as I know, never been examined before, despite being a classic deletionist practice and, like most deletionist practices, one that by the numbers turns out to badly disserve both editors and readers: the practice of moving links from External Links to the Talk page.

The reason for my interest in this minor deletionist practice is that I no longer edit as much as I used to, so when I find an excellent citation (article, review, interview, etc.) I will often just copy it into the External Links section or (if I am feeling especially energetic) excerpt the important bits onto the article’s Talk page. I realized that this constitutes what one might call a natural experiment: I could go back and see how often the excerpts were copied by another editor into the article. This is better than just looking at how often anime editors edit or how often anime articles are edited, because it is less related to outside events - perhaps anime news was simply boring over that period, or perhaps some new bots or scripts were rolled out. Whereas if there are no anime editors who will edit even when presented with gift-wrapped RSs (links & excerpts specifically called out for their attention, and trivially copy-pasted into the article), then that’s pretty convincing evidence that there is no longer a there there - that the editors are no longer active.

On at least two articles (Talk:Gurren Lagann#Interviews & Talk:Royal Space Force: The Wings of Honnêamise#Sources), I have been strenuously opposed by editors who object to having more than a handful of links in the designated External Links section; they acknowledged the links were (mostly) all undoubted RSs and relevant to the article - but they refused to incorporate the links into the article. This is bad from every angle, yet few other editors were interested in helping me.

So I’ve begun going through my old mainspace Talk edits using Special:Contributions, starting all the way back in April 2007 (>4 years ago, more than enough time for editors to have made use of my gifts!), looking for cases where I’ve dumped such references. I compiled two lists, of 146 anime-related edits, and 102 non-anime-related edits.

Before going any further, it’s worth asking - to avoid hindsight bias and post hoc rationalization - what you expect my results to be.

When asking yourself, remember that these edits, and a larger set of edits we’ll soon examine, are selected edits; they are high-quality edits, ones where I thought the relevant article must cover the material. They are not low-quality dumps of text or links by a passing anonymous editor, nor done out of idle amusement. What percentage would you expect to have been used after a week - enough time that most article-watchlisting editors will have seen the diff and had leisure to deal with a task more complex than reverting vandalism? 50% doesn’t seem like a bad starting point. How about after a year? Or two? Maybe 70% or 90%? After that, if it hasn’t been dealt with, it’s probably not ever going to be dealt with (even assuming the section hasn’t been stuffed into an archive page). Hold onto your estimate.

Once the lists were compiled and weeded, I wrote a Haskell program to do the analysis. The program loads the specified Talk page URLs and extracts all URLs from the Talk diff so it can check whether any of them were linked in the Article (which, incidentally, leads to false positives and an overestimation23).
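The Haskell source is not reproduced here, but the core check - did any URL posted in the Talk diff later appear in the article - can be sketched in shell. Everything below (filenames, sample HTML) is a hypothetical stand-in, not the actual program:

```shell
# Hypothetical stand-ins for a saved Talk-page diff and the current article HTML.
cat > talk-diff.html <<'EOF'
See <a href="http://www.animenewsnetwork.com/review/gurren-lagann">review</a>
and <a href="http://example.com/interview">interview</a>.
EOF
cat > article.html <<'EOF'
<ref>http://www.animenewsnetwork.com/review/gurren-lagann</ref>
EOF

# Pull candidate URLs out of the diff, then test each for presence in the article.
# Plain substring matching is what inflates the count (the overestimation noted above).
urls=$(grep -o 'http[s]*://[^"<> ]*' talk-diff.html | sort -u)
used=0
for u in $urls; do
  grep -qF "$u" article.html && used=$((used+1))
done
echo "$used of $(echo "$urls" | grep -c '') Talk-diff URLs found in the article"
```

The real program works on live page fetches rather than saved files, but the matching logic, and its bias toward over-counting, is the same.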

The results for my edits when run on the two lists:

  • anime: of 146 edits, 11 were used, or <8%
  • non-anime: 102 edits, 3 used, or <3%

For comparison, we can look at an editor who has devoted much of her time to finding references for anime articles - but made the colossal mistake of believing the EL partisans when they said external links should either be incorporated into article text or listed on the talk page. User:KrebMarkt has made perhaps thousands of such edits from impeccable RSs; it is possible that my own contributions are skewed downwards, say, by a congenital inability to select good references. Hence, looking at her reference-edits will provide a cross-check.

I compiled her most recent 1000 edits to the article talk space with a quick download: elinks -dump '' '' | grep '&diff='. Then I manually removed edits which were minor or did not seem to be her usual reference-edits, resulting in the following list of 958 edits from December 2010 to December 2011. (KrebMarkt almost exclusively adds anime-related references, so I did not prepare a non-anime list.) The results:

  • Of the 958 edits adding references, 36 were used in the article, or <4%
  • Combining my anime & non-anime with KrebMarkt’s edits, we have 1206 edits adding references, of which less than 50 were used in the article, or <4.15%
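As an arithmetic check, the quoted rates can be re-derived from the raw counts:

```shell
# Usage rates implied by the raw counts reported above.
awk 'BEGIN { printf "anime:     %.1f%%\n", 11/146*100 }'   # → 7.5%
awk 'BEGIN { printf "non-anime: %.1f%%\n", 3/102*100 }'    # → 2.9%
awk 'BEGIN { printf "KrebMarkt: %.1f%%\n", 36/958*100 }'   # → 3.8%
# Combined: (11+3+36) used, out of (146+102+958) reference-edits.
awk 'BEGIN { printf "combined:  %.2f%%\n", 50/1206*100 }'  # → 4.15%
```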

Beyond the surprise that KrebMarkt (not a particularly committed inclusionist, if she be an inclusionist at all) had a success rate half of mine, <4.15% is shockingly low.

1156 ignored edits represent a staggering waste of editor-time24. This cannot be explained as our fault: we are both experienced editors (I began editing in 2004, and KrebMarkt in 2008), who know what good RSs are. And all of the edits contain good RSs. (The reader is invited to check edits and see for himself whether they are solid and valuable RSs, like reviews by the Anime News Network.) That perhaps 1/10 of our suggested references are included is due solely to the apathy or nonexistence of other editors. (If such a rate is a success, may the Almighty preserve us from a failure!)

Since that will not soon change for the better, this leads to one conclusion: the idea that references hidden on Talk pages will one day be used is false.

Somebody remarked: I can tell by my own reaction to it that this book is harmful. But let him only wait and perhaps one day he will admit to himself that this same book has done him a great service by bringing out the hidden sickness of his heart and making it visible.25

We have looked at what suggesting additions results in: abject failure. The Wikipedia community is failing at incorporating new links. Some attempted to explain away my experiment above: it’s OK because at least the existing External Links sections are quality sections. This is desperate special pleading, but we should test it. How is the editing community at the flip side of the coin - retaining old links? If inclusionists’ suggestions are being ignored, is this at least fairly applied, with deletionists’ edits also futile?

Unfortunately, testing this requires destructive editing. (We can’t simply suggest on talk pages that external links be removed because that is both not how deletionists operate and likely will result in no changes, per the previous experiment demonstrating inaction on the part of editors.)

The procedure: remove random links and record whether they are restored to obtain a restoration rate.

  • Editors might defer to other editors, so I will remove links as an anonymous IP user from multiple proxies; the restoration rate will naturally overestimate how often a registered editor - much less a tendentious deletionist - would be reverted.
  • To avoid issues with cherry-picking or biased selection of links26, I will remove only the final external link on pages selected by Special:Random#External_links which have at least 2 external links in an External links section, and where the final external link is neither an official link nor template-generated. (This avoids issues where pages might have 5 or 10 official external links to various versions or localizations, all of which an editor could confidently and blindly revert the removal of; template-generated links also carry imprimaturs of authority.)
  • The edit summary for each edit will be rm external link per [[WP:EL]] - which has the nice property of being meaningless to anyone capable of critical thought (by definition, a link removal should be per one of WP:EL’s criteria - but which criterion?) while looking as official as many deletionist edit-summaries.

    This point is very important. We are not interested in vandalism in general, nor all possible forms of external link vandalism (like adding spam links, inserting gibberish, breaking syntax), but in bad edits which mimic how a deletionist would edit. A deletionist would avoid certain links, and would be sure to make some allusion to policy. (Shades of Poe’s law: it is impossible to distinguish an actual deletionist’s edits from random deletions accompanied by repetitive jargon.) If our experiment does not mimic these traits, our final measurement of bad-edit reversion rate will simply not be measuring what we hoped to measure.
  • To avoid flooding issues and be less noticeable, no more than 5 or 10 links a day will be removed with at least 1 minute between each edit.
  • To avoid building up credibility, I will not make any real edits with the anonymous IPs.
  • After the last of the 100 links has been removed, I will wait 1 month (long enough for the edits to drop off all watchlists and reversion rates to become close to nonexistent27) and then restore all links. I predict at least half will not be restored, and certainly no more than 90%.

The full list of URL diffs is available as an appendix.

After finishing the link removals, I briefly looked over the IPs’ contribution pages for the (top) marker, which specifies whether an edit is still the latest edit for that page (all reverted removals will by definition no longer be the latest edit, but some non-reverted edits will have unrelated edits stealing that status, so the number gives an upper bound on how many removals were reverted). It looked like <10%.
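Counting the (top) markers can be mechanized with grep; the contributions file below is a hypothetical stand-in for a real Special:Contributions page saved as text:

```shell
# Hypothetical sample of an IP's contributions list; "(top)" marks edits still current.
cat > contribs.txt <<'EOF'
12:01, 1 March 2012 Article-A (rm external link per [[WP:EL]]) (top)
12:03, 1 March 2012 Article-B (rm external link per [[WP:EL]])
12:05, 1 March 2012 Article-C (rm external link per [[WP:EL]]) (top)
EOF
total=$(grep -c '' contribs.txt)        # lines = edits made
top=$(grep -c '(top)$' contribs.txt)    # removals still standing (an upper bound, per above)
echo "$top of $total removals still current"
```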

I was also struck during the process of going through Special:Random by how many External Links sections have been, in wretched subterfuges, renamed Sources, References, Further reading, or the article has a long References section stuffed with external links which are used once; perhaps editors collectively know that putting a link into a section named External Links is painting a cross-hair on its forehead. Too, I was struck by the general quality of the links: of the 100, I would have assented to the removal of no more than 5 (10 at the most). In general, articles err far on the side of including too few external links rather than too many.

How many readers were affected by my experiment over the course of the month of waiting? Feel free to estimate or give a range - 1,000 or 10,000 or maybe 100,000 readers? The articles are randomly picked, so it seems highly unlikely that there is significant overlap. But my best estimate, based on data for the 100 articles’ traffic in March 2012, is that somewhere around 335,000 readers were affected28.
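That figure is simply the sum of the 100 articles’ monthly page views. With per-article monthly counts in a two-column file, the aggregation is a one-liner (file name and numbers below are hypothetical, illustrating the format only):

```shell
# Hypothetical per-article monthly traffic counts: title, views.
cat > traffic.txt <<'EOF'
Article-A 120000
Article-B 95000
Article-C 120500
EOF
awk '{ s += $2 } END { printf "readers affected: ~%d\n", s }' traffic.txt
```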

How many editors were affected? The 100 articles edited were watchlisted by a median of 5 editors each; unfortunately, lacking technologies like Patrolled Revisions, we cannot estimate how many times each edit was checked by a human (as many of those editors no doubt are inactive or do not monitor their watchlists closely).

What was the early reaction when I mentioned this experiment? Ian Woollard said

…if you’d have picked something other than external links, that might, or might not have been a good test.

Last time I checked (which admittedly was a while ago) Wikipedia had a noticeboard whose entire purpose, was essentially to delete as many external links as possible, they’d even added a policy that said they could do that in every single case unless you could get a majority in a poll to keep individual links; oh and in practice they pretty much !vote-stuffed those polls too by announcing the polls on the noticeboard, so the chances of a clear majority was low. Oh, and there was a bunch of shady anonymous IPs involved as well that swing around after the fact to edit war them away anyway if an external link they didn’t favor gets through all that.

Basically, external links are one of the most hated parts of Wikipedia, and if hardly any of them got fixed it wouldn’t surprise me, and wouldn’t prove anything very much.

Exaggeration? Well, consider what the active administrator User:Future Perfect at Sunrise wrote in the WP:AN/I discussion:

Hmm, strange experiment. Given the huge number of inappropriate external links we have, I really wonder: wouldn’t a random removal of a hundred links catch so many bad links objectively worthy of removal that the net effect of the vandalism might be more benefit than harm? If the experiment is meant to measure how good the community is at reverting vandalism, I can’t see how they can do that without having a measure for these random beneficial hits.

None of the commenters rose to my challenge to estimate what the reversion rate should be, with the exception of the administrator User:Horologium (who identifies as a transwiki-ing exclusionist29, which in practice means deletionism), who looked at 19 articles and estimated that ~30% of ELs were bad by his standards (so we can infer that a reversion rate of anything but 70% will highly likely either be allowing good links to be deleted or defending bad links, by his standards).

3% is far worse than I had predicted, and statistically suggests that the true rate is no higher than 7%30. This leads to one conclusion: external links are highly vulnerable to deletionism.
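Footnote 30 presumably gives the exact calculation; as a sketch (not the author's own code), a bound of this size can be reproduced with a one-sided exact binomial (Clopper-Pearson-style) upper confidence limit for 3 reversions out of 100 removals - the `upper_bound` helper below is illustrative:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def upper_bound(successes, n, alpha=0.05):
    """One-sided exact upper confidence bound for a binomial proportion:
    the p at which P(X <= successes | p) drops to alpha, found by bisection."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binom_cdf(successes, n, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return hi

print(f"{upper_bound(3, 100):.1%}")  # one-sided 95% upper bound, in the 7-8% range
```

The precise bound depends on the interval chosen (one-sided vs. two-sided, exact vs. approximate), but any reasonable choice lands near the 7% quoted in the text.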

A month after this experiment, I resurveyed the 100 edits to see how many restorations had been reverted. 4 had been reverted:

Those who think that 3% was the correct reversion rate for the removals are invited to explain how 4% could be the correct reversion rate for the re-adding of the same links - if it was acceptable for 97% of the links to be removed in the first place, how could it also be acceptable for 96% to then be restored?

One might try to defend this wasteful practice by claiming that some editors and readers will go to the Talk page and there might notice and visit the deleted links. This could only ameliorate the problem slightly, but it’s worth investigating just how rarely Talk pages are visited so we can explode this particular instance of the fallacy of the invisible. How many of our readers actually look at the talk page as well? (Do a quick estimate, as before, so you can know if you were right or wrong, and by how much.) I know some writers writing articles about Wikipedia have mentioned or rhapsodized at length about the interest of the talk pages for articles, but they are rare birds and statistically irrelevant.

It might be enough simply to know how much traffic to talk pages there is, period. I doubt editors make up much of Wikipedia’s traffic, given the shriveling of the editing population, which never kept pace with the growth into a top 10/20 website, so that would give a good upper bound. It would seem to be very small; there’s not a single Talk page in the top 1000 most-visited articles. We can look at individual articles; Talk:Anime has 273 hits over one month while the article Anime has 128,657 hits (a factor of 471); or Talk:Barack Obama with 1,800 hits over that month compared to Barack Obama with its 504,827 hits (a factor of 280).

The raw stats are available for download, so we can look at all page hits, sum all article and all Talk hits, and see what the ratio is for the entire English Wikipedia on one day. (Each file seems to be an hour of the day, so I downloaded 24 and gunzipped them all.) We do some quick shell scripting. To find the aggregate hits for just talk pages:

grep -e '^en Talk:' -e '^en talk:' pagecounts-* | cut -d ' ' -f 3 | paste -sd + | bc

To find aggregate hits for non-talk pages:

grep -e '^en ' pagecounts-* | grep -v -e '^en Talk:' -e '^en talk:' | cut -d ' ' -f 3 | paste -sd + | bc
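For reference, the two pipelines can be replicated in one pass with a short Python script; the pagecounts format assumed here (project, title, hourly count, bytes) matches the grep commands above, and the inline sample file is invented purely for demonstration:

```python
def talk_vs_article(paths):
    """Sum English Wikipedia hit counts, split into Talk: vs. other pages."""
    talk = article = 0
    for path in paths:
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) < 3 or fields[0] != "en":
                    continue  # skip other projects and malformed lines
                if fields[1].startswith(("Talk:", "talk:")):
                    talk += int(fields[2])
                else:
                    article += int(fields[2])
    return talk, article

# Tiny invented stand-in for a real hourly dump file:
with open("pagecounts-sample", "w") as f:
    f.write("en Anime 128657 1\nen Talk:Anime 273 1\nde Anime 999 1\n")

print(talk_vs_article(["pagecounts-sample"]))  # → (273, 128657)
```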

The numbers look sane - 582,771 for all talk page hits versus 202,680,742 for all non-talk page hits. A factor of 347 is pretty much around where I was expecting based on those previous 2 pages. The traffic data developer, Domas, says the statistics exclude API hits but include logged-in editor hits, so we can safely say that anonymous users made far fewer than 583k talk page views that day and hence the true ratios are worse than our previous ratios of 471/280/347. To put the relative numbers into proper perspective, we can convert into percentages:

  • If we take the absolutely most favorable ratio, Obama’s at 280, and then further assume it was looked at by 0 logged-in users (yeah right), then that implies something posted on its talk page will be seen by <0.36% of interested readers ($(\frac{1800}{504827} \times 1.0) \times 100$).
  • If we use the aggregate statistic and say, generously, that registered users make up only 90% of the page views, then something on the talk page will be seen by <0.029% of interested readers ($(\frac{582771}{202680742} \times 0.1) \times 100$).
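The arithmetic behind both bullets, using only the figures already quoted:

```python
# Best case: Obama's ratio, assuming every Talk-page view was an anonymous reader.
obama_talk, obama_article = 1800, 504827
print(obama_talk / obama_article * 100)    # ≈0.36% of interested readers

# Aggregate case: assume only 10% of Talk-page views were anonymous readers.
agg_talk, agg_article = 582771, 202680742
print(agg_talk * 0.1 / agg_article * 100)  # ≈0.029%
```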

Page views don’t tell us the most interesting thing, how many people would have clicked on the link if it had been on the article and not the Talk page. It’s impossible to answer this question in general, unfortunately, since Wikipedia does not track clicks.

However, I have approximated the ratio for at least one article: the dual n-back article links to my DNB FAQ. There are a few dozen visitors each day from Wikipedia, Google Analytics tells me. What will happen if the link is moved to the Talk page? The article and general interest in n-back haven’t changed - those variables are still the same. The same sort of people will be visiting the article and (not) visiting the Talk page. The visitor count will dramatically fall, probably to less than 1 a day. The link was in the article for perhaps half a year, since ~14 July 2011; on 9 February 2012, I shifted it to the Talk page with a fake message praising the contents, to mimic how an editor might genuinely post the link on the Talk page (asking the forbearance & cooperation of my fellow editors in hidden comments). I then scheduled a followup for 100 days: 19 May 2012.

It ought to be trivial and pointless - everyone should acknowledge that essentially no readers also read Talk pages, but it’s still worth precommitting: I predict that Talk click-throughs will average <5% of Article click-throughs, and the difference between the 2 datasets will be statistically-significant at p<0.05.

As promised, on 20 May 2012 I restored my FAQ link and began analysis:

  1. Before:

    Between 14 July 2011 and 8 February 2012 (a longer period), the totals were 31,454/23,538 (pageview/unique pageview), with 1,910/1,412 from the English Wikipedia and, as one would expect, a lesser 740/618 from the German Wikipedia31. n=209, so the daily average click count from the English Wikipedia is $\frac{1910}{209} = 9.14$.

    PDF overview, English hits CSV
  2. After:

    Between 10 February and 12:50 PM 20 May 2012, my DNB FAQ received from all sources 21,803/16,899 page views (raw/unique). 327/164 page views were from the German Wikipedia, and there were 161/155 page views from the English Wikipedia. n=100, so the daily average is $\frac{161}{100} = 1.61$.

    PDF overview, English hits CSV

Dividing the two averages shows that the average daily clicks in this period were ~17.6% of the earlier rate, not <5% as I had predicted. This difference between the two groups is statistically-significant at p<0.001, needless to say32.
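Footnote 32 presumably details the test actually used; as a quick sketch, a standard normal-approximation (Wald) comparison of the two Poisson click rates - not necessarily the author's own test - yields an enormous z-statistic:

```python
from math import sqrt, erf

before_clicks, before_days = 1910, 209  # link in the article
after_clicks, after_days = 161, 100     # link on the Talk page

r1 = before_clicks / before_days        # ≈9.14 clicks/day
r2 = after_clicks / after_days          # 1.61 clicks/day

# Wald z for a difference of Poisson rates: Var(count/T) = count/T^2
se = sqrt(before_clicks / before_days**2 + after_clicks / after_days**2)
z = (r1 - r2) / se
p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # two-sided normal p-value

print(round(z, 1), p < 0.001)
```

With a z-statistic this far out in the tail, the p-value underflows to effectively zero, comfortably below the 0.001 threshold.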

So, Talk page click-throughs are indeed lower than Article click-throughs, but more than 3 times larger than I expected. What happened? We know this can’t be the general case from looking at the data - there just isn’t enough traffic to Talk pages for any reason.

My best guess is that the dual n-back article is simply a bad example. If we look at the April 2012 data as an example, we see that it gets something like 15 page views a day with occasional spikes and troughs, 568 visits over 30 days averaging 19 visits a day. There were 9 click-throughs on average during the previous sample - suggesting that something like half the readers are clicking through to one external link! This does not sound like normal article behavior, and suggests to me that the very short and incomplete nature of the dual n-back Wikipedia article is causing readers to look for further, better information like my FAQ, which might cause readers to also resort to checking the talk page for information (where they would run into my glowing fake blurb, visible on the first screen). Unfortunately, I cannot check this theory, because currently only one article links to my site where I can gather Google Analytics information.

More instructive is estimating how many readers have been deprived of the chance to use the references for just the subset of 1206 edits we have already looked at above. We can reuse the traffic data with a little more programming; we will ask how many hits/page-views, in total, there were in November 2011 for the 472 unique articles covered by those 1206 edits.
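A sketch of that lookup (the helper name and the sample data are invented for illustration; the real computation would run over the full month of November 2011 dump files):

```python
def total_hits(paths, titles):
    """Sum English Wikipedia hit counts for a given set of article titles."""
    wanted = set(titles)
    total = 0
    for path in paths:
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) >= 3 and fields[0] == "en" and fields[1] in wanted:
                    total += int(fields[2])
    return total

# Invented sample: two articles of interest, plus one irrelevant line.
with open("pagecounts-demo", "w") as f:
    f.write("en Anime 128657 1\nen Dual_n-back 568 1\nen Other 5 1\n")

print(total_hits(["pagecounts-demo"], ["Anime", "Dual_n-back"]))  # → 129225
```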

The total: 8,480,394.

Extrapolating backwards to 2007/2008 is left as an exercise for the reader.

When we consider how false the idea is that this practice serves future editors, and how many readers are ill-served, the common practice of moving a reference/link to the Talk page should be named for what it is: a subtle form of deletion.

It would be a service to our readers to end this practice entirely: if a link is good enough to be hidden on a Talk page (supposedly in the interests of incorporating it in the future, which we have seen is an empty promissory note), then it is good enough to put at the end of External Links or a Further Reading section, and the literally millions of affected readers will not be deprived of the chance to make use of them.

I fully expect to see this practice for years to come.

Elaborate euphemisms may conceal your intent to kill, but behind any use of power over another the ultimate assumption remains: I feed on your energy.33

This result will come as no surprise to longtime inclusionists. The deletion process deletes most articles which enter it, and has long been complained about by outsiders. Entire communities (such as the web comics34 or MUD online communities35) have been alienated by purges of articles - purges which not infrequently result in abuse of process, much newbie biting, and comical spectacles like AfD regulars (usually deletionists) insisting a given article is absolutely non-notable and experts in the relevant field demurring; a particularly good AfD may see statements of experts dismissed on speciously procedural grounds such as having been made in the expert’s blog (and so failing WP:RS, or perhaps simply being dismissed as WP:OR) and not a traditional medium (despite the accelerating abandonment of traditional RSs by experts in many fields36). The trend has been clear. Andrew Lih, who has been editing Wikipedia even longer than myself (since 2003) and who wrote a book on Wikipedia, writes in Unwanted: New articles in Wikipedia:

’It’s incredible to me that the community in Wikipedia has come to this, that articles so obviously keep just a year ago, are being challenged and locked out. When I was active back on the mailing lists in 2004, I was a well known deletionist. Wiki isn’t paper, but it isn’t an attic, I would say. Selectivity matters for a quality encyclopedia. But it’s a whole different mood in 2007. Today, I’d be labeled a wild eyed inclusionist. I suspect most veteran Wikipedians would be labeled a bleeding heart inclusionist too. How did we raise a new generation of folks who want to wipe out so much, who would shoot first, and not ask questions whatsoever? [If Lih can write this in 2007, you can imagine how people who identified as inclusionists in 2004, such as myself or The Cunctator, look to Wikipedians who recently joined.]

It’s as if there is a Soup Nazi culture now in Wikipedia. There are throngs of deletion happy users, like grumpy old gatekeepers, tossing out customers and articles if they don’t comply to some new prickly hard-nosed standard. It used to be if an article was short, someone would add to it. If there was spam, someone would remove it. If facts were questionable, someone would research it. The beauty of Wikipedia was the human factor - reasonable people interacting and collaborating, building off each other’s work. It was important to start stuff, even if it wasn’t complete. Assume good faith, neutral point of view and if it’s not right, {{sofixit}}. Things would grow.’

I was particularly depressed to read in the comments things from administrators whose names I recognize due to their long tenure on Wikipedia, like Llywrch (joined 2002):

I’m sorry that you encountered that, Andrew - but not surprised. I had my own encounter with the new generation of “quote policy, not reasoning” deletionists; I feel as if I encountered (to quote from the song) the forces of evil from a bozo nightmare. No one - including me - looked good after that exchange. (I keep thinking that I should have said something different, but the surrealism of the situation multiplied with the square of my frustration kept me from my best.)

Or Stbalbach:

I’m a long time editor, since 2003, ranked in the top 300 by number of edits (most in article space). On May 11th 2007 I mostly gave up on Wikipedia - there is something wrong with the community, in particular people deleting content. I’d never seen anything like it prior to late 2006 and 2007. Further, the use of “nag tags” at the top of articles is out of hand. It’s easier to nag and delete than it is to research and fix. Too many know-nothings who want to help have found a powerful niche by nagging and deleting without engaging in dialog and simply citing 3-letter rules. If a user is unwilling or incapable of working to improve an article, they should not be placing nag tags or deleting content.

Also interesting is Ta bu shi da yu’s comment, inasmuch as Ta bu invented the infamous {{fact}}:

I have also seen this happening. It’s incredible that those who are so incredibly stupid can get away with misusing the speedy deletion tag! As for DRV… don’t make me laugh. It seems to be slanted to keep articles deleted. I can’t agree more with your sentiments that if you know all the codes to WP:AFD, then you are a menace to Wikipedia.

Why is this culture changing? In part because article writing seems to get no more respect. A review article summarizes the findings of Burke and Kraut 200837:

…it is proving increasingly hard to become a Wikipedia administrator: 2,700 candidates were nominated between 2001 and 2008, with a success rate of 53%. The rate has dropped from 75.5% until 2005 to 42% in 2006 and 2007. Article contribution was not a strong predictor of success. The most successful candidates were those who edited the Wikipedia policy or project space; such an edit is worth ten article edits.

What sort of editor, with a universe of fascinating topics to write upon, would choose to spend most of his time on the policy namespace? What sort of editor would choose to stop writing articles?38 Administrators with minimal experience in creating content - and much experience in destroying it and rewriting the rules to permit the destruction of even more. Is this not almost the opposite of what one wants? And imagine how the authors must feel! An article is not a trivial undertaking; sometime, sit down, select a random subject, and try to write a well-organized, fluent, comprehensive, and accurate encyclopedia article on it. It’s not as easy as it looks, and it’s even harder to write a well-referenced and correctly formatted one. To have an article deleted is bad enough; I can’t imagine any neophyte editors wanting to have anything to do with Wikipedia if an article of theirs got railroaded through AfD. It is easier to destroy than to create, and destruction is infectious. (In Thurner et al 2012’s study of 3.3 years of the online SF game Pardus, players were found to pay negative actions forward when on the receiving end of them; the community was only saved from an epidemic of attacks by the high mortality & quitting rate of negative editors - I mean, negative players39.)

Deleting articles and piling on policy after guideline after policy are both directly opposed to why Wikipedians contribute! When surveyed in 2011:

The two most frequently selected reasons for continuing to edit Wikipedia were I like the idea of volunteering to share knowledge (71%) and I believe that information should be freely available to everyone (69%), followed by I like to contribute to subject matters in which I have expertise (63%) and It’s fun (60%).

And ironically, the more effort an editor pours into a topic and the longer & more detailed the article becomes, the more blind hatred it inspires in deletionists. If you look at AfDs for small articles or stubs, the deletionists seem positively lucid & rational; but make the article 50kB long, and watch the rhetoric fly. I call this the fancruft effect: deletionists are mentally allergic to information they do not care about or like.

If a deletionist sees an article on Lightsaber combat40 and it’s just a page long, then he has little problem with it. It may strike him as too big, but reasonable. But if the article dares to be comprehensive, if it is clearly the product of many hours’ labor on the part of multiple editors, if there are touches like references and quotes - then something is wrong on the Internet, the very universe is out of joint that this article has been so well-developed when so many more deserving topics languish, it is a cosmic injustice. A dirty beggar is parading around acting like an emperor. The article does not know its place. It needs to be smacked down and hard. And who better than the deletionist?

What is the ultimate status-lowering action which one can do to an editor, short of actually banning or blocking them? Deleting their articles.

In a particular subject area, who is most likely to work on obscurer articles? The experts and high-value editors - they have the resources, they have the interest, they have the competency. Anyone who grew up in America post-1980 can work on [[Darth Vader]]; many fewer can work on [[Grand Admiral Thrawn]]. Anyone can work on [[Basho]]; few can work on [[Fujiwara no Teika]].

What has Wikipedia been most likely to delete in its deletionist shift over the years? Those obscurer articles.

The proof is in the pudding: all the high-value/status Star Wars editors have decamped for somewhere they are valued; all the high-value/status Star Trek editors, the Lost editors… the list goes on. They left for a community that respected them and their work more; these specific examples are striking because the editors had to make a community, but one should not suppose such departures are limited to fiction-related articles. There may be evaporative cooling of the community but it’s not towards the obsessive fans.

The greatest pleasure is to vanquish your enemies and chase them before you, to rob them of their wealth and see those dear to them bathed in tears, to ride their horses and clasp to your bosom their wives and daughters.41

Outsiders! I realize it might sound like a stretch that anyone enjoys the power of nominating articles, that being a deletionist could be a joyful role. You say you understand how administrators (with their ability to directly delete, to ban, to rollback etc.) could grow drunk on power, but how could AfD nominations lead to such a feeling?

But I know from personal experience that there is power exercised in nominating for deletion. Well do I know the dark arts of gaming the system: of the clever use of templates, of the process of deleting the article by carefully challenging and removing piece after piece, of invoking the appropriate guidelines and policies to demolish arguments and references.

I have seen the wails and groans in the edit summaries & comments of my opponents, and exulted in their defeat. It’s very real, the temptation of exercising this power. It’s easy to convince yourself that you are doing the right thing, and merely enforcing the policies/guidelines as the larger community set them down. (Were all my nominations just? No, but I have succeeded in fooling myself so well that I can no longer tell which ones truly did deserve deletion and which ones were deleted just because I disliked them or their authors.)

Who can say how many authors take it personally? The deletion process is inherently insulting: Out of 2.5 million articles, yours stands out as sucking so badly that it is irredeemable and must be obliterated. And it is ultimately sad42 - life is short but must that be true of articles as well as men?