The Bots that Work Behind the Scenes to Make Wikipedia Possible

digitaltrends.com | January 20, 2020 | Luke Dormehl

The idea behind Wikipedia is, let's face it, crazy. An online encyclopedia full of verifiable information, ideally with minimal bias, that can be freely edited by anyone with an internet connection is a ridiculous idea that was never going to work. Yet somehow it has.

Nineteen years old this month (it was launched in January 2001, the same month President George W. Bush took office), Wikipedia's promise of a collaborative encyclopedia has, today, resulted in a resource consisting of more than 40 million articles in 300 different languages, catering to an audience of 500 million monthly users. The English-language Wikipedia alone adds some 572 new articles per day.

For anyone who has ever browsed the comments section on a YouTube video, the fact that Wikipedia's utopian vision of crowdsourced collaboration has been even remotely successful is kind of mind-boggling. It's a towering achievement, showing how humans from around the globe can come together to create something that, despite its flaws, is still impressively great.

What do we have to thank for the fact that this human-centric dream of collective knowledge works? Well, as it turns out, the answer is bots. Lots and lots of bots.

Bots to the rescue

Bots emerged on Wikipedia out of necessity. The term, shorthand for "software robot," refers to an automated tool designed to carry out specific tasks. In the early days of Wikipedia, this largely involved sorting out vandalism. The problem could be handled manually when the total number of active contributors on Wikipedia numbered in the dozens or even hundreds. But as the website experienced its first boom in popularity, this was no longer so easy to do. By 2007, for example, Wikipedia was receiving upward of 180 edits every minute. That was far too much for human editors to cope with.

"A very important thing that [Wikipedia bots were created to do] is to protect against vandalism," Dr. Jeff Nickerson, a professor at the Stevens Institute of Technology in Hoboken, N.J., who has studied Wikipedia bots, told Digital Trends. "There's a lot of instances where someone goes into a Wikipedia page and defaces it. It's like graffiti. That became very annoying for the people who maintain those pages to have to go in by hand and revert the edits. So one logical kind of protection [was] to have a bot that can detect these attacks."
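The article doesn't describe any bot's internals, but early anti-vandalism bots largely worked by rule: score an incoming edit against a handful of heuristics and flag it for reversion if the score crosses a threshold. The Python sketch below illustrates only that general shape; the rules, weights, and threshold are invented for the example, and modern tools (such as the real ClueBot NG) rely on far richer signals, including machine learning.

```python
import re

# Hypothetical heuristics and weights, invented for illustration.
# Real anti-vandalism bots use much richer features to classify edits.
RULES = [
    (re.compile(r"(.)\1{9,}"), 3),       # a character repeated 10+ times, e.g. "aaaaaaaaaa"
    (re.compile(r"\b[A-Z]{12,}\b"), 2),  # long all-caps "shouting"
    (re.compile(r"!{4,}"), 1),           # runs of exclamation marks
]

def vandalism_score(added_text: str) -> int:
    """Sum the weights of every heuristic the added text triggers."""
    return sum(weight for pattern, weight in RULES if pattern.search(added_text))

def looks_like_vandalism(added_text: str, threshold: int = 3) -> bool:
    """Flag the edit for reversion if its score reaches the threshold."""
    return vandalism_score(added_text) >= threshold
```

A real bot would also weigh edit metadata (anonymous versus registered editor, edit size, the page's revert history), which is often more telling than the inserted text itself.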

Along with other researchers from the Stevens Institute of Technology, Nickerson recently carried out the first comprehensive analysis of all 1,601 of Wikipedia's bots. According to that study, published in the journal Proceedings of the ACM on Human-Computer Interaction, bots account for around 10% of all activity on Wikipedia. This rises to a massive 88% of activity on Wikidata, the central storage platform for structured data used across the various Wikimedia websites.

Wikipedia Bot Roles and Associated Functions

Generator: generates redirect pages; generates pages based on other sources.

Fixer: fixes links, content, and files; fixes parameters in templates, categories, and infoboxes.

Connector: connects Wikipedia with other wikis and with other sites.

Tagger: tags article status, article assessment, WikiProjects, and multimedia status.

Clerk: updates statistics; documents user data; updates maintenance pages; delivers article alerts.

Archiver: archives content; cleans up the sandbox.

Protector: identifies policy violations, spam, and vandals.

Advisor: provides suggestions for WikiProjects and for users; greets newcomers.

Notifier: sends user notifications.

The research conducted by Nickerson and colleagues divided bot activity on Wikipedia into nine different categories. There are, as noted, protectors, dedicated to identifying policy violations, spam, and vandals. Then there are fixers, who live virtual lives revolving around the fixing of links, content, files, and anything else in need of a good tweaking. There are taggers, for tagging article statuses and assessments; clerks, for updating statistics and maintenance pages; archivers, for archiving content; advisors, for greeting newcomers and providing suggestions for users; notifiers, for sending user notifications; and generators, for creating redirect pages or generating new content based on other sources.

"Their complexity varies a lot," said Morten Warncke-Wang, the current controller of SuggestBot, a bot which, well, suggests articles for editors to edit, based on their previous edit history. "It depends on the task that they're sent to carry out."

A certain degree of autonomy

Nickerson agreed. A bot, he suggested, can be anything from a relatively simple algorithm to a more complex machine-learning A.I. What they have in common, he said, is a degree of autonomy. A bot is something that is created and then deployed to act on its orders, a little like a mission objective delegated to an employee. "[A bot] can go off and make hundreds, thousands, sometimes millions of edits on its own," Nickerson said. "This is not something that [a human editor is] just running once while you're sitting there." The 24 top bots on Wikipedia have each made more than 1 million edits in their lifetimes: far in excess of virtually every human editor on the website.

If the range of bot categories sounds, frankly, a bit like a medieval colony of monks, all pursuing the unified goal of dogmatic enlightenment through an assortment of seemingly menial tasks, you're not entirely wrong. The fact that the bot world is reminiscent of a community of sorts is not at all accidental.

Anyone can develop a bot, just like anyone can edit an article.

Despite the fact that most casual Wikipedia users will never interact with a bot, their creation is every bit as collaborative as anything on the Wikipedia front end. Bots are not implemented by Wikimedia in a top-down manner. Anyone can develop a bot, just like anyone can edit an article, and developers typically build bots in response to problems they believe automation could help solve. To get their bot rubber-stamped, they must submit an approval request to the Bot Approvals Group (BAG). If BAG deems the bot a valuable addition to the collective, it is approved for a short trial period to ensure that it operates as designed. Only after this will it be unleashed on Wikipedia as a whole.

"There's a prosocial nature to a lot of the editors on Wikipedia," Nickerson said. "A lot of the time, people might write these bots for themselves and then make them available to the community. That's often the way these bots emerge. Some editors, doing a task, realize it could be handled with a fairly simple bot. They've got the skill to build it, and then that bot gets deployed and used by everyone."

Like an algorithmic bring-your-dog-to-work day, the owner of each bot is responsible for its behavior. Fail to respond to behavioral concerns, and your bot's approval will be revoked.

Make bots great again

Here in 2020, bots have a popular reputation that's somewhere between venereal disease and John Wilkes Booth. They are frequently cast as human job-replacing, election-swaying tools designed to do far more bad than good. The Wikipedia example shows the flip side of this picture. Wikipedia's bots are the site's immune system: near-invisible tools that help provide resistance to (metaphorical) infection and toxins, while strengthening the system in the process.

As Nickerson points out, however, the bots are not entirely invisible. And that's to their betterment. "When people don't think they've received a good recommendation, they'll regularly post about that on the bot page," he said, describing the advisors and notifiers intended to coax human contributors to do better. "To me, that's very interesting. I'd love to be able to affect news feeds I get [elsewhere], but I can't. I don't have a way of going to the companies that are selecting news for me and saying, 'I think you're giving me too much of this; I'd rather get more of that.' Having control over the algorithms that are communicating with you is an important thing. And it seems to really work with Wikipedia."


Some Wikipedia bots carry out simple text generation. The first-ever Wikipedia bot, which appeared in late 2002, was designed to add and maintain pages for every U.S. county and city. But both Nickerson and Warncke-Wang said that they couldn't foresee Wikipedia ever handing control of the website over entirely to text-generating algorithms. "They're rarely used to create the content," Warncke-Wang said. "They're much more used as tools to manage the content development."
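That first generator bot worked from structured U.S. census data, filling in a boilerplate article template once per county and city. As a rough illustration of that template-filling approach (the data record and the wording here are invented for the example, not taken from the actual 2002 bot), the idea fits in a few lines of Python:

```python
# Hypothetical article template; the real bot's boilerplate was far longer
# and drew on many more census fields.
COUNTY_TEMPLATE = (
    "{name} is a county in the U.S. state of {state}. "
    "As of the {census_year} census, its population was {population:,}."
)

def generate_county_stub(record: dict) -> str:
    """Fill the boilerplate template from one structured data record."""
    return COUNTY_TEMPLATE.format(**record)

# Made-up example record, standing in for one row of census data.
stub = generate_county_stub({
    "name": "Example County",
    "state": "Ohio",
    "census_year": 2000,
    "population": 123456,
})
```

Run against thousands of records, a loop like this is how a single bot can "write" thousands of stub articles while humans remain responsible for everything beyond the boilerplate.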

At its core, Wikipedia is a deeply human effort, and the bots are there to help, not hinder. As Manfred E. Clynes and Nathan S. Kline, the two researchers who coined the term "cyborg," wrote in an influential 1960 essay: "The purpose of the [ideal collaboration between humans and machine] is to provide an organizational system in which such robot-like problems are taken care of automatically and unconsciously; leaving man free to explore, to create, to think, and to feel."

Wikipedia bots follow in that spirit. As long as that relationship continues, long may they carry on helping us find the information we want. And stop bad actors from defacing the pages of celebrities they don’t like.
