YouTube Enlists Wikipedia in Its Conspiracy Theory Crackdown. But That Might Not Be Enough.
David Meyer | March 14, 2018

YouTube has a notorious problem with conspiracy-theory videos, most recently evidenced by a video featuring InfoWars’ Alex Jones suggesting that the Parkland, Florida, school shooting was a “deep state false flag operation” and that the surviving students who criticized the gun lobby were actors.

YouTube pulled that video and others, but there are many more ludicrous videos out there detailing conspiracies about everything from the Moon landing and 9/11 to, uh, the N.B.A.

To fix the problem, YouTube is turning to Wikipedia and other third-party, “fact-based” sources, which it will link to beneath conspiracy videos.

“When there are videos that are focused around something that’s a conspiracy—and we’re using a list of well-known internet conspiracies from Wikipedia—then we will show a companion unit of information from Wikipedia showing that here is information about the event,” YouTube CEO Susan Wojcicki told an audience at the SXSW Interactive festival in Austin, according to The Verge.
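Wojcicki didn’t detail the mechanics, and YouTube’s implementation isn’t public, but the general shape of such a “companion unit” is easy to sketch: match a video’s topic against a list of known conspiracies, then pull a neutral summary from Wikipedia. Below is a minimal illustration in Python using Wikipedia’s public REST summary endpoint; the `FLAGGED_TOPICS` list and function names are hypothetical stand-ins, not YouTube’s actual code.

```python
import requests

WIKIPEDIA_SUMMARY_API = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

def fetch_companion_unit(topic):
    """Fetch a short, neutral summary of a topic from Wikipedia's public
    REST API -- the kind of snippet a video page could display alongside
    a flagged video."""
    url = WIKIPEDIA_SUMMARY_API.format(title=topic.replace(" ", "_"))
    resp = requests.get(
        url, headers={"User-Agent": "companion-unit-demo/0.1"}, timeout=10
    )
    if resp.status_code != 200:
        return None  # No article found: show nothing rather than guess.
    data = resp.json()
    return {
        "title": data.get("title"),
        "summary": data.get("extract"),
        "link": data.get("content_urls", {}).get("desktop", {}).get("page"),
    }

# Hypothetical watchlist; YouTube's actual list of flagged topics is not public.
FLAGGED_TOPICS = [
    "Moon landing conspiracy theories",
    "9/11 conspiracy theories",
]

for topic in FLAGGED_TOPICS:
    unit = fetch_companion_unit(topic)
    if unit:
        print(f"{unit['title']}: {unit['summary'][:80]}... ({unit['link']})")
```

Keying the lookup to a fixed list of well-established conspiracy topics, as Wojcicki described, sidesteps the hard problem of classifying videos automatically, but it also means the system can only respond to conspiracies someone has already cataloged.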

Wikipedia may be good for helping people discover the truth when it comes to established conspiracy theories. However, as Olivia Solon noted in the Guardian’s coverage of Wojcicki’s speech, information from the collaboratively edited online encyclopedia may not prove so helpful when a conspiracy theory is circulating around breaking news, as was the case with the Parkland shooting.

Indeed, Wikipedia itself notes that its processes are not suited to “sorting out the oft-conflicting and mistaken reporting common during disaster and other breaking news events.” And, as BuzzFeed’s Tom Gara noted, conspiracy theorists might increasingly take it upon themselves to try to edit what Wikipedia says on the relevant topics.

That said, YouTube claims Wikipedia will only be one of its nonsense-debunking sources.

YouTube certainly has a lot to contend with—not just because regulators around the world are increasingly calling for a crackdown on online misinformation, but because the problem is getting worse. And that’s down to YouTube’s own fundamental workings.

After the Parkland massacre, researchers found a self-reinforcing ecosystem of conspiracy theorists on the platform. “Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” wrote Jonathan Albright, research director at Columbia University’s Tow Center for Digital Journalism. “The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.”

Much as Facebook’s utility as a disinformation channel is largely a function of how that network rewards and promotes outrageous content, YouTube’s conspiracy problem is rooted in its own mechanics, so fixing it will take more than linking to reputable sources of facts. “Demonetizing” offensive videos—removing their ability to generate revenue from views—is another key piece of the puzzle. But what a complex puzzle it is.
