Wall of old TV screens (Getty Images/iStockphoto)

Fake news is about to evolve into something even more dangerous

Opinion: As technology advances, we will need a rigorous knowledge of history more than ever. But that’s now under assault—and things will get worse in 2018
By Colin Horgan

In October, the New York Times published a piece that quickly ricocheted around Twitter and Facebook: “‘Allah’ Is Found on Viking Funeral Clothes.” The story claimed that Arabic characters spelling “Allah” and “Ali” on Viking funeral costumes could suggest Islam had deeply influenced those early Scandinavians. “My opinion is that those who wore the fabrics must have understood the symbolism,” Dr. Annika Larsson, the researcher who made the discovery, told the Times. “But certainly, the person who wove the fabrics could read and write and knew what the characters meant.”

The Times wasn’t the only outlet with the story; the BBC, the Independent, and the Guardian all carried it, too. Each time one of them posted the story to Facebook or Twitter, it was shared hundreds or thousands of times, then picked up and re-posted by aggregating pages like IFLScience, whose version of the story was shared more than 34,000 times.

It struck such a nerve because many of today’s far-right groups—the same sort who push for anything from closed borders to an all-white ethno-state, and who took to the streets of Charlottesville, Va.—regularly invoke Viking, or at least Nordic, imagery. The Othala rune, for instance, is an ancient Germanic symbol that was appropriated by the Nazis and has become a common sight at far-right gatherings. The Soldiers of Odin—a racist, anti-immigrant group founded in Finland that has since seen chapters pop up around the world, including in Canada—are named after the Norse god.

So new evidence that Vikings not only had contact with Islamic nations and people, but had apparently adopted some of their deepest belief systems, made this a story worth sharing.

Just one problem: the finding wasn’t as conclusive as it seemed. In the days that followed, other experts in the field weighed in to point out that the embroidery Larsson examined might not actually have spelled “Allah” at all, and might not even match any known form of Arabic script from the period. The sample of cloth was incomplete, for one thing, and, as Stephennie Mulder, a professor of medieval Islamic art and archeology at the University of Texas at Austin, pointed out, Larsson’s reading relied on extrapolation.

History has always been political. It has also always suffered from—or been enhanced by—biased or slanted presentation. It is constantly examined, frequently rewritten, and nearly always controversial. But technology is about to shift these known tendencies into a new realm. History is about to undergo a radical attack the likes of which it has never seen: Much as news has become a tool for warping worldviews in the present, history will soon be used to alter our perception of reality in ways that could go far beyond a simple, easily debunked, politically charged viral hit like the Viking-“Allah” story.

In May, a Twitter user named Olly Gibbs visited the Rijksmuseum in Amsterdam and snapped photos of portraits and sculptures of unhappy subjects on his phone. Then, using FaceApp, a mega-hit mobile app that allows users to alter their faces to look older, younger, happier, or like someone of a different gender, he doctored the pictures; instead of frowning, the canvas and stone faces he captured were suddenly smiling comically. He tweeted them, and his first four examples were retweeted more than 10,000 times and liked a further 20,000 times. News outlets around the world posted stories about the gag.

And it was just that—a joke. But the ease with which Gibbs, or anyone else using the app (which security experts warned might be an insecure data-collection tool), could literally change the face of history should give us pause—especially since it offers a preview of how confusing things may get in the near future.

In the winter of 2016, late-night talk show host Jimmy Kimmel welcomed to his show a researcher who, together with his colleagues, had found a way to let someone sitting in front of a different camera alter the dialogue in a video, almost seamlessly and in real time. Kimmel gave it a try and, sure enough, his mouth movements were tracked and the video, which featured a celebrity or politician saying something completely different, changed simultaneously to mimic what Kimmel was doing with his mouth. The demonstration was somewhat crude, and the alteration was plainly visible. But the technology, as technology usually does, will likely only improve.

In an age when the information we receive about the current affairs of our world is increasingly confusing, or presented in a way that makes us doubt its veracity, the only thing we still have to keep our perception of reality grounded is a solid grasp of events that have already happened and are part of the historical record. The only way, for instance, to know that Donald Trump breaks convention is that we have recent evidence that the presidents who came before him acted and spoke differently than he does. To back up such claims—as well as many others on myriad topics—we refer to written and, more and more, video evidence.

What happens if that historical evidence is no longer a sure thing?

Video after video after video of archive footage. Which one is fake?

Assuming the current state of affairs remains generally indicative of what will come in the next year—that is, that tech platforms continue to find it impossible to fully clamp down on user-generated misinformation, and face-altering technology becomes even more sophisticated and user-friendly—we could end up in a strange place.

Sometime in the next year, surely, a video app will appear on the App Store that combines FaceApp-style face manipulation with voice dubbing and video editing in a single, sophisticated doctoring tool. Rather than superimposing a clearly fake mouth over the static face of a celebrity or animal, as some apps do now, this one would allow users to do what Kimmel did—replace movements entirely and seamlessly, while the audio would be doctored to mimic the subject’s original voice.

App users would pick their favourite (or least favourite) historical clip from YouTube, transfer the file and begin making changes. Want John F. Kennedy’s inaugural address to have gone differently? Easy. Could Dr. Martin Luther King Jr. have had something other than a dream? Sure. Did Adolf Hitler really say those horrible things about Jews? Maybe not! The possibilities would be endless, so long as video exists.

And once the doctored videos are uploaded back to YouTube or other online platforms, the only thing standing between them and legitimacy as an alternative to the real thing would be views. As with many other viral mistakes, the crucial thing about the flawed Vikings story is that most of those who originally read it will never have seen the corrections. The story’s ideologically reinforcing narrative and the scale of its amplification across social media ensured that the follow-up clarifications, which told people the opposite of what they wanted to believe, were simply never going to have the same reach. The story is not true—but to many it will remain as good as true, thanks in part to its virality.

Fraudulent historical videos could follow the same path, exploiting current tensions and issues to ricochet around the web and lodge themselves in the minds of those already primed to believe what they are seeing. From there, as we have seen with news countless times, it is virtually impossible to rein them back in. No one, after all, cares about the corrections.

Just as fake news has found a home within an ecosystem of self-reinforcing, interlinked websites and news outlets, so too would fake history be shared back and forth, referenced in enough places to begin to seem legitimate. Doctored historical videos would become part of countless homemade conspiracy videos, used as proof of whatever wild-eyed theory anyone pleases—one step beyond the current trend of scouring footage of events like the 9/11 attacks for clues supposedly missed by legitimate experts and investigators. Click on one fraudulent historical clip and the algorithm would keep feeding you more, until it becomes difficult to tell, without any foreknowledge, which one was the original. It’s not a big step from there to the mainstream.

In 2018, the debate about the past could move away from the merely academic and toward the kinds of conversations we are now having about the present: purely and terrifyingly epistemic, where the truths and facts that we were once comfortable believing about society, and its place in history, can be easily changed with the swipe of a finger. We will descend further and further into the abyss of pure, decontextualized information.

In 2017, we began to suspect that anything could be happening. In 2018, we will begin to believe anything could have happened.