Opinion

YouTube’s plan to fight conspiracy theories only exposes a crisis of authority

YouTube has tapped Wikipedia to combat dubious videos with facts—revealing how hard it’s become to determine truth amid an ocean of information online

YouTube has a credibility problem.

On Google’s video-sharing service, amid the music videos, make-up tutorials, album reviews and cooking instructions, people peddle half-truths about climate change, “chemtrails,” vaccines, shootings, and historical events. And now, a mere beat after some horrible tragedy befalls a community, conspiracy theorists take to YouTube to question whether the lifeless bodies or blood-stained sidewalks are even real, tossing up speculation and falsehoods in the fog of a sometimes still-active event. After the recent school shooting in Parkland, Fla., videos accusing student David Hogg of being a “crisis actor”—someone trained to portray a victim during emergency drills—quickly gained popularity. And within hours of this week’s shooting at YouTube’s own headquarters in San Bruno, Calif., videos speculating about the shooter’s identity were already circulating.

Making matters worse, YouTube feeds users videos similar to those they have already watched. Watch one conspiracy theory video, and you may end up down a rabbit hole before you know it. Couple that with recent reports of YouTube serving kids bizarre, violent videos designed to game the site’s algorithms, and it’s clear the video service has become home to some of the Web’s worst tendencies toward misinformation.

As consternation about the phenomenon has grown, YouTube’s parent company responded by announcing that it would append Wikipedia links and captions to certain videos. The plan, as it stands, is to pick a list of well-worn conspiracy theories—that the moon landing was faked, for example—and add in-text links to Wikipedia in an attempt to counteract the misinformation. It is, at least at first glance, a reasonable-seeming move. Spend some time on YouTube, and it’s easy to find videos asserting things that are highly dubious: that climate change is a hoax, that the Freemasons killed Kennedy, and, inevitably, that 9/11 was an inside job—to say nothing of even more troubling ideas, like the Islamization of Europe or Holocaust denial. Google hopes that, presented with countervailing facts, people will correct their ways.

But the company’s reaction misunderstands the problem of online culture: It is not that there is too little information, it’s that there is too much. That has produced a crisis in authority—and while solving it will be no small task that may well fundamentally alter the nature of the Web, Google’s Wikipedia collaboration may only further expose it.

Whether or not conspiracy theories are more pervasive today is unclear; according to Joseph Uscinski, a professor at the University of Miami College of Arts & Sciences and the author of American Conspiracy Theories, social scientists have only been keeping track of polling regarding the issue for a decade. What is clear, however, is that the context has changed thanks to the new ways technology and the media intersect. After the shooting at Sandy Hook, Uscinski says, conspiracy theorists took to YouTube to say that the shooting was a false flag, “and these people were given a very big platform to spread their ideas, specifically CNN—and they should not have been.” And significantly, President Donald Trump—who himself plays fast and loose with the facts—has blindly amplified theories on the campaign trail and in the world’s highest office, and in so doing, forced our focus toward falsehoods. “What has changed is our attention to conspiracy theories, and a lot of that is coming from political elites,” says Uscinski. “As a result, our media is paying a lot more attention to them, too.”

But the digital age has also produced a broader problem. Because so many of us now get news through social media or online news sites, the cues by which we might once have judged truth—professional design or established brands and names—have been flattened. The traditional standard-bearers of knowledge—academics, journalists, and experts of all kinds—simply don’t have the sway they once did, in part because there’s so much information out there, and in part because that glut has bred doubt that the experts actually have it right. And when a fake news site looks all but indistinguishable from the New York Times, it can be hard even for reasonable-minded people to verify what’s legitimate.

So handing over authority to Wikipedia—itself a publicly editable repository of knowledge, updated and maintained by a small cadre of volunteers—only reveals the softness of the ground on which Google wants to build a foundation. “Misinformation campaigns are crowdsourced, and Wikipedia is crowdsourced,” says Whitney Phillips, an associate professor at Mercer University with a specialization in digital culture. “It’s odd to see precisely the kind of collaborative participation that fuels conspiracy theorizing, the kind Wikipedia is renowned for, being proposed as a solution.”

Google’s attempt to “fix” conspiracy theories may be further doomed to fail thanks to the intentional efforts of bad-faith actors, she says. “The first thing conspiracy theorists are going to do is to try and go to the Wikipedia article and disrupt it,” says Phillips.

Making matters worse, YouTube never consulted Wikipedia about its plans. That means Wikimedia may face new problems of its own: articles linked on YouTube could be bombarded with new traffic, further taxing Wikipedia’s already overstretched volunteer editors. Responding diplomatically to Google’s move, Wikimedia Foundation executive director Katherine Maher suggested that while everyone was free to use the information on the site, “we encourage companies who use Wikimedia’s content to give back in the spirit of sustainability.”

The problem is one of scale and resourcing, a major reason why traditional media’s credibility has been so damaged: how do you deal with a sudden ratcheting-up in activity that can overwhelm the capacity of human teams to respond? “It’s a nice idea that YouTube is trying to deal with the idea of misinformation and disinformation,” Phillips says, “but I can see this particular strategy being a candyland for people looking to engage in disruption.”

Then there are the unintended consequences, like the fact that presenting more information can, in a kind of vicious circle, counterintuitively make things worse online. Indeed, when it comes to conspiracy theories like Pizzagate—which claimed that Hillary Clinton was connected to a sex-trafficking ring run out of a D.C. pizza restaurant that deployed secret code words—the very fact that institutions like newspapers and broadcasters would debunk it just “proves” the theory. People whose worldviews are already invested in conspiracy theories and mistrust of authority are more likely to take any corrective measure as proof of the rightness of their view. “There would be a certain percentage of people who might have accidentally found themselves in front of these videos but don’t have conspiratorial leanings. Those are the people who might be amenable to fact-checking,” says Phillips. “If you are a conspiracy theorist, however, and you are convinced that something is a secret truth, a red flag that would confirm your belief is YouTube trying to convince you otherwise. It may backfire, in a boomerang effect, and make people believe the falsehood even more.”

And while traditional structures of authority have been undermined, nothing has moved in to fill the vacuum left behind—except the cacophony of a lawless internet. It’s the very open nature of our digital platforms coming back to bite us: we created arenas in which anyone can say anything, and now we’ve discovered that we don’t really like what everyone has to say, and that we have no real way to stop them from breaking fundamental rules.

So what’s the answer, when disinformation seems to be getting more and more play? According to Phillips, the key—at least in theory—lies in filling “coherency gaps” by engaging in one-on-one dialogue to understand why someone believes what they do, and reaching beyond divisions to find consensus, particularly because conspiracy theorists tend to occupy both sides of the political spectrum. “We all have a responsibility to try and do that,” she says. Unfortunately, she adds, that simply isn’t scalable for YouTube; when there are hundreds of thousands of YouTube conspiracy videos and influential personalities who command millions of viewers, that kind of empathetic practice is impossible.

Perhaps it makes sense that our digital age is being defined by feeling, loyalty, and a sense that the truth is being kept hidden from all but the few true believers who are brilliant enough to see through the smokescreen. After all, in the face of an overwhelming array of information and viewpoints, an understandable impulse is to try and simplify the world into something manageable and almost childish—that a small cabal of people run the world, or that one’s political opponents are the source of all rot—and find peace from all that noise in such a reductive view.

As understandable as that may be, the consequences may well be devastating. History suggests that we need structures of authority to rally around and form a fixed point of orientation, so that we are at least mostly talking about the same thing. In the West, for example, as the age of religion began to wane, it was replaced by the print era’s focus on science, rationality, and expertise. But when people start to believe that even innocuous news stories are fake, that science is nonsense, or that the world is governed by elaborate conspiracies, what are we to do? How do you debate politics when you can’t even agree upon what’s real?

While giving up is not an option, solutions have thus far been difficult to come by. But in the midst of what may well be a cultural revolution akin to the rise of the printing press, the long-term answer may well be education—that in the 21st century and beyond, media and information literacy could become as fundamental to a functioning democracy as reading and arithmetic. The internet seems to have manifested the truth of that old idiom that “a little knowledge is a dangerous thing,” and learning to distinguish between the reasonable and the conspiratorial, the true and the merely true-seeming, will increasingly become a basic life skill.

In the meantime, the remaining options aren’t good. If the Wikipedia collaboration signals the beginning of a more formal and aggressive effort to censor the internet, or to stratify it into legitimate and illegitimate corners, it will be an immensely difficult, politically charged process, and a serious loss. After all, the line between saving the democratic web and a creep toward authoritarianism is fuzzy at best. In the absence of any solution, however, conspiracy theories and the division they foster will only continue to get worse. In that case, it may not just be sickening ideas we are exposed to—but a sick and impoverished version of democratic society.
