SOCIETY

The problem with Facebook’s plan to teach you how to read news

Facebook is rolling out a new feature to help Canadian users think critically about news, but it’s not clear the company can fix the mess of ‘fake news’ it helped create

The Facebook logo displayed on an iPad in Philadelphia, May 16, 2012. (Matt Rourke/AP)

Facebook thinks you should be better at reading the news. In an effort to help, starting Friday, you’ll notice a post appear at the top of your Facebook news feed, prompting you to click through to see tips on how to spot “false news.”

“Our purpose here is just to raise awareness about how to think about information critically online,” says Kevin Chan, head of public policy at Facebook Canada. “This is a first step in our efforts to deal with this challenge—it is on the news literacy side.”

To that end, Facebook has partnered with MediaSmarts, a Canadian media literacy not-for-profit that has developed a new list (available via the Facebook tips page) modelled on the well-known journalistic “Five Ws.” MediaSmarts’ Five Ws suggest readers of online news ask questions like why a certain post is being spread around, who posted it—and whether they have an agenda—or where else they might be able to verify information they’ve seen.

RELATED: Is the world falling out of love with Facebook?

“Of course, there’s no way that we can authenticate everything that comes to us through social media, so the first question is when we should authenticate. When do we double-check?” says Matthew Johnson, director of education at MediaSmarts. One time we should double-check, he says, is “when something seems too good to be true.”

Facebook seems very aware of the position it currently occupies in the greater cultural discussion about news—and, to some extent, facts. It’s not a flattering one.

When the immediate fallout from the U.S. election in November was examined, the most radioactive particles were broadly determined to be so-called ‘fake news’ stories—shocking headlines so dripping with partisan outrage that well-meaning people across the ideological spectrum apparently couldn’t help believing them and sharing them with their social media networks again and again.

Whether ‘fake news’ really did swing the election toward Donald Trump—or simply away from Hillary Clinton—has yet to be conclusively determined. But, in the weeks following the election, Facebook took seriously the criticism it garnered from having been the primary distribution tool for these posts replete with misinformation or quasi-information.

In an open letter posted in February, Facebook CEO Mark Zuckerberg wrote that, in its quest to weed out misinformation, Facebook noted that “in general, if you become less likely to share a story after reading it, that’s a good sign the headline was sensational,” but that “if you’re more likely to share a story after reading it, that’s often a sign of good in-depth content.”

RELATED: The dark irony behind Facebook’s fake news problem

This very well might be true. We don’t know for sure, as Facebook didn’t release any data publicly to support Zuckerberg’s observation. (When asked for it, Facebook pointed to a blog post.) But really, all that matters is that Facebook has determined it to be true. Chan repeated the same thing, nearly verbatim. And he added that “over time what we’ll want to do as we understand this stuff, is to make sure that where there is something that’s going viral and people are sharing it without having engaged with the content, then those things get severely down-ranked on News Feed.”

On the surface, this seems antithetical to both Facebook’s raison d’être—as a place to share things with people—and its bottom line. But Chan rejects the idea that Facebook’s ultimate goal is to create a lot of activity around a post, without ever worrying whether people click through to see the story.

“I think that would be the opposite of what we want. What we want is for people to have good content, reliable authentic content that they can engage with on News Feed,” Chan says. “We very much value good engagement and good content on Facebook, so definitely one of our priorities is to make sure that where there is false information, misinformation on our platform, that we understand how it behaves and that we are able to take appropriate enforcement action.”

Missing from this conversation—about how to eradicate misleading or false information posing as news on Facebook, or to reduce sensationalist, clickbait-y headlines from reputable news outlets—is, of course, the fact that much of it exists in the first place because of Facebook. Its longstanding ability to make something go “viral” incentivized the very thing it now hopes to squash.

Not that long ago, it thought the full-on democratization of ideas was pretty good.

Facebook CEO Mark Zuckerberg on stage during a town hall at Facebook’s headquarters in Menlo Park, California, September 27, 2015. (Stephen Lam/Reuters)

Back in 2012, as Facebook prepared to go public, Zuckerberg wrote another letter—this one to potential investors. He highlighted what kind of world we were living in at that time: one in which a majority of people, via the internet or their mobile phones, had “the raw tools necessary to start sharing what they’re thinking, feeling and doing with whomever they want.”

Back then, Facebook wanted to help people form connections in the hopes that it could “rewire the way people spread and consume information.” The world’s information infrastructure, Zuckerberg wrote, “should resemble the social graph—a network built from the bottom up or peer-to-peer, rather than the monolithic, top-down structure that has existed to date.”

By giving people “the power to share,” he wrote, “we are starting to see people make their voices heard on a number of different scale[s] from what has historically been possible.” Those voices, Zuckerberg predicted, would only increase in number and volume: “They cannot be ignored.” Over time, he continued, “we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few.”

RELATED: Is Facebook a media company?

It is possible that Zuckerberg’s vision has been realized. A massive, global sharing of ideas has indeed happened. But, being a sharing of ideas between humans, it was naturally going to be subject to human conversational failings: hearsay, conjecture, specious arguments, baseless proclamations, to name just a few. In other words, not good or reliable content.

Is it any wonder that we are where we are? It was essentially all part of the plan, in that the plan encouraged people to speak their minds. It just turns out that a lot of the time, people don’t know what they’re talking about.

So what now?

This latest effort by Facebook to change direction—in effect to reverse the tide—is interesting. But there are two things to note.

First, Facebook’s plan does nothing to change how much weight its own algorithm gives to quality content. “Good reporting,” as a report from the Tow Center for Digital Journalism put it in March, is still “not currently algorithmically privileged.”

That leads to the second point, which is that this particular move puts the onus on users to figure things out. Facebook might have made a mess of things, it might have—in its design and in what posts it has naturally promoted for years—rewired information consumption, but it’s left up to us to set things right again. Whatever that might mean.

RELATED: Why Canadians should care about Facebook’s fake news problem

Yet, perhaps that’s the way it should be, for other moves Facebook is making to combat “fake news” could lead us to even weirder territory than we’re in now.

Recently in both the U.S. and Germany, Facebook began testing a flagging system that alerts users to content that might be misinformation. As The Verge reported in December, if at least two fact-checking organizations take issue with a story, users will see a banner reading “Disputed by 3rd Party Fact Checkers,” along with links below it to debunking articles.

With this fact- and news-checking feature in place, and thanks to its incredible size and clout, Facebook could become the opposite of what Zuckerberg once said it was. We might see things swing entirely the other way. Rather than the disrupter of top-down information, Facebook would become the enforcer of it: the de facto portal through which people feel they must consume the news. For where else might they be told what information should be read and what should be ignored? Where else in this world will news reading be safe?

When, and if, that tool comes to Canada, it may be trumpeted by Facebook as a thing that will rewire information dissemination again. As a thing that will save us. As a thing, maybe, that seems too good to be true.

But in that case, at least Facebook’s media literacy push will have taught us to double-check it.

An earlier version of this piece contained the suggestion that Facebook is promoting its media literacy effort as a cure-all for ‘fake news’. This piece has been amended to clarify that Facebook is not promoting its current media literacy program as such.
