Is the world falling out of love with Facebook?

Facebook once promised to bring the world closer together. It is indeed ironic that Facebook is now at risk of driving us even further apart.

REUTERS/Dado Ruvic/Illustration/File

Mark Zuckerberg’s efforts to grow Facebook’s already massive 1.7 billion user base hit an unexpected snag in the wake of Donald Trump’s surprise U.S. election win—and, for the first time in recent memory, the challenges facing the social media giant’s co-founder and CEO have nothing to do with technology or even privacy. They’re existential.

Ever since Trump’s victory speech in the wee hours of Nov. 9, Facebook has faced a firestorm of controversy for potentially skewing the vote’s outcome by allowing “fake” or misleading news stories to run rampant on its platform throughout the campaign. “Pope shocks the world, endorses Donald Trump,” screamed one false but widely shared story. Blared another apparent barn-burner: “WikiLeaks CONFIRMS Hillary sold weapons to ISIS.”

The articles in question were apparently written by people who sought to make both mischief and money. “I think Trump is in the White House because of me,” Paul Horner, a 38-year-old Arizona man who makes a living off the ad money associated with viral spoof stories, told the Washington Post. “His followers don’t fact-check anything—they’ll post everything, believe anything.”

Horner’s probably giving himself too much credit. But there’s no denying that fake news poses a serious problem for Facebook’s credibility and, potentially, our democracy if allowed to continue unchecked. By some counts, nearly 40 per cent of Americans cite Facebook as their primary source for news. But unlike newspapers or even news websites, Facebook doesn’t purport to be in the truth-telling business. Rather, it strives to keep users engaged for as long as possible so it can target them with advertising. That means using complex algorithms to show people content they want to see, from their friends and third-party sources, regardless of its quality or veracity. It’s hardly a recipe for a well-informed electorate.

For Facebook, the questions it suddenly finds itself facing go to the very core of its existence. Is it merely an inert technology platform that can’t be held responsible for the actions of its nearly two billion users? Or is it a new type of online publishing and broadcasting company, with all the expectations and responsibilities that come along with it? Judging by Zuckerberg’s tepid response to the “fake news” fiasco, he clearly has no desire to play the role of the planet’s editor-in-chief (although he reportedly has no problem developing tools for the Chinese government to censor Facebook data for its citizens). But he may ultimately find he has little choice. “If they don’t do something about it, they’re going to start losing users,” warns Anatoliy Gruzd, an associate professor at Ryerson University’s Ted Rogers School of Management and director of research at the school’s Social Media Lab. “Each election cycle it’s going to become a bigger and bigger issue.”

For a Silicon Valley firm that once promised to bring the world closer together, it’s indeed ironic Facebook is suddenly at risk of driving us even further apart.

In the final three months of the U.S. election campaign, so-called “hyper-partisan” blog posts and blatantly false news stories shared on Facebook outperformed actual news stories from reputable organizations like the New York Times or CNN, according to an analysis by the social news and entertainment company BuzzFeed. Nor is it just a Facebook problem. Google also faced criticism after its top result for the search term “final election count” was a WordPress blog called 70 News that said Trump won the popular vote by a margin of 700,000, which was demonstrably untrue.

Where is all this fake news coming from? At least 140 U.S. politics websites were traced by BuzzFeed to a handful of Macedonian teens trying to make a quick buck off the associated online ads. The Washington Post, meanwhile, profiled a couple of twentysomethings in Long Beach, Calif., who called themselves “the new yellow journalists” and were sheepish about how much money they were making by writing tarted-up Facebook fodder. One sample: “BREAKING: Top official set to testify against Hillary Clinton found DEAD!” Some researchers argue there’s even evidence of Russian involvement in social media misinformation campaigns—part of an overall effort to discredit America’s legitimacy and make the country too divided to govern.

Such concerns are several orders of magnitude more serious than the privacy-related ones levelled at Facebook in its early days—and they can’t simply be waved away as the inevitable missteps of a fresh-faced Silicon Valley start-up. Facebook is now one of the most valuable companies in the world, with a market capitalization of more than US$345 billion. Users spend more time on the site—about an hour a day, on average—than the surfing public does anywhere else on the Internet, and advertisers are spending billions to reach them. Facebook is also doing whatever it can to sign up millions more people through its connectivity program, which seeks to use “drones, satellites and lasers” to beam Internet connectivity to all four corners of the Earth.

Zuckerberg, for his part, initially scoffed at the idea that Facebook could have played a role in Trump’s narrow electoral college victory, calling it a “pretty crazy idea.” But others point to studies that suggest as many as one out of every five voters say they’ve changed or modified their stance on a political issue because of material they’ve seen on social media, and that Facebook itself pitches political campaigns on its ability to sway voters through the right type of marketing on its website. There have even been reports of a renegade band of Facebook employees, upset by Zuckerberg’s lack of action, who launched an unofficial task force to investigate the issue.

It goes without saying that the stakes are high for Facebook and its inclusive, forward-looking reputation. Some believe the fallout from the “fake news” scandal, if it continues, could ultimately spook advertisers, which spent nearly US$7 billion on Facebook’s platform in the most recent quarter alone. Brian Wieser, an analyst at Pivotal Research Group, wrote in a recent note that, prior to the U.S. election, most had tended to view Facebook as “an entirely ‘brand safe’ environment”—yet another piece of conventional wisdom obliterated by Trump’s divisive and unpredictable campaign.

To be sure, there’s no way to conclusively prove whether Facebook is actually subverting Western democracy. But spare a thought for those who try to warn people about the existence of the threat. Melissa Zimdars, an associate professor of communications at Massachusetts’ Merrimack College, got an earful when, shortly after the U.S. election, she compiled a rudimentary list of about 80 fake and suspect news sites for her students and shared it in a Google Doc. It wasn’t long before the online spittle began to fly. Several of the sites on Zimdars’s list published stories about the “unhinged feminist” behind the “defamatory hit list.” Says Zimdars, “I’ve received hundreds of email messages that are really vile—some even said I should be shot or raped.”

Yet, despite the attacks, Zimdars says the exercise helped to further highlight the growing challenge of separating fact from fiction in the online world. She says struggling mainstream news outlets are just as guilty of torquing headlines and stories to drive social media traffic, adding “it leads to distrust and it’s one of the many reasons why people started seeking out some of these other organizations in the first place.” In fact, she says many of the angry people who emailed her argued “the real fake news is from the New York Times, so we have to go to these other sites for honesty and truth.”

In some respects, this is precisely what the Internet originally promised: a diversity of voices and opinions, and an overall democratization of information. But what many didn’t anticipate is that a handful of companies would effectively control our access to all those divergent viewpoints. “What’s new here is you have one or two giants—Facebook or Google—that, by changing one line of code, can influence millions or even a billion people,” says Gruzd. “The stakes are much higher now to get it right.”

Facebook clearly hasn’t figured out the formula yet. In a study awaiting peer review, Gruzd and other researchers examined the Facebook posts of the three front-running candidates during the 2016 primary season—Donald Trump, Hillary Clinton and Bernie Sanders—to see if they could predict which ones would gain the most traction among users. They based their assumptions on existing communications research that suggests photos, videos and positive headlines tend to attract the most attention. But, to their surprise, they found scant evidence the same is true on Facebook—particularly when it came to Donald Trump, whose followers seemingly “like” anything about the reality-TV star. Gruzd suspects it’s evidence of Facebook’s hidden algorithms at work, directing content to users who are most likely to engage with it. “It’s a very influential tool,” he says.

Zuckerberg, ever the tech geek, appears to be hoping the nebulous problem can be solved with a few coding tweaks. In a recent blog post, he said Facebook was looking at stronger detection, better user-based reporting systems, third-party verification and ways to make it more difficult for hoaxsters to earn ad money. While others have suggested Facebook should employ a team of trained journalists to monitor content, recall that earlier this year Facebook ditched the 15 to 18 members of its editorial team after conservatives complained that items appearing in its Trending section appeared to be written with a left-leaning bias. Human oversight, in other words, is hardly a panacea.

The pressure on Facebook to deal with its fake news problem is only going to increase alongside the rise of populist and extremist political movements around the world. Already, political leaders are calling on Facebook to take steps to prevent a repeat of the U.S. election in their countries. German Chancellor Angela Merkel warned that political opinions were being “manipulated” in a new media environment; some members of the European Parliament want Facebook to be treated like a media company so that it can be held liable for hate speech that appears on its platform. “In the early years, we were a bit naïve about society overall and felt a bit romantic about social media’s ability to democratize free speech,” says Gruzd. “Now we’re being more pragmatic and realistic. We’re realizing social media isn’t perfect.”