Facebook Purges Journalists, Immediately Promotes a Fake Story for 8 Hours

Why did the company trend a false article about Megyn Kelly?

Mark Zuckerberg stands in front of a slide that says “Give everyone the power to share anything with anyone.”
Stephen Lam / Reuters

Oh, Facebook. Just when the company seems to have avoided the responsibility of being a news organization (and all the attendant controversy), it finds itself back in the editorial muck.

Last week, Facebook made a surprise overhaul of its “Trending Stories” feature, the sidebar that highlights some of the most popular news stories on Facebook. Where the company had previously provided a short, human-written summary of the news at hand, it would now describe the story with only a one- or two-word phrase: “#Tokyo2020: Japanese Prime Minister Appears in Surprise Performance During Rio Ceremony” became just “#Tokyo2020.”

A graphic from Facebook explaining the new and old feature

Facebook’s decision to simplify the feature seemed like an attempt to wriggle out of editorial responsibility: What had been a messy human-led process would now become an algorithm-guided one. The company also laid off the 26 employees who had run the feature—19 curators and seven copyeditors—with little warning on Friday, according to Quartz.

Yet the company assured users that it would still remain discerning. “There are still people involved in this process to ensure that the topics that appear in Trending remain high-quality—for example, confirming that a topic is tied to a current news event in the real world,” said a release from the company on Friday.

If so, they’re not doing their jobs very well. From Sunday evening to early Monday morning, Facebook allowed the topic “Megyn Kelly” to trend. Driving the trend was an article claiming that Kelly had been fired by Fox News for supporting Hillary Clinton. The story, hosted by endingthefed.com, was completely inaccurate: Kelly has not endorsed Clinton, and she has not been fired by Fox. Yet with the assistance of Facebook’s algorithmic editors, it garnered 200,000 likes.

On Sunday night, I asked Facebook whether a human editor approved the topic before it trended, and how it plans to keep this from happening in the future; it had not responded by press time.

For Facebook’s now out-of-work contractors, the irony must be painful. Earlier this year, Gizmodo reported that the Facebook Trending Topics staff were biased against conservatives in their story selection. Among the allegations: the team didn’t let a story trend if it was reported only by a right-wing site like Breitbart; the news in question also had to be covered by a reputable source like the AP, The New York Times, or The Washington Post. As Kashmir Hill wrote at Fusion when the news broke, this didn’t reveal anti-right bias so much as good journalistic instinct. Had the wise algorithm obeyed this tenet, this whole mess could have been avoided.
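The rule is simple enough to state in code. Here is a minimal sketch, in Python, of what such a corroboration check might look like; the Story class, the outlet list, and the domains are hypothetical, invented for illustration, not Facebook’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical whitelist of corroborating outlets (illustration only).
REPUTABLE_OUTLETS = {"apnews.com", "nytimes.com", "washingtonpost.com"}

@dataclass
class Story:
    topic: str
    sources: set  # domains of the outlets reporting the story

def may_trend(story: Story) -> bool:
    """The curators' reported rule: a story may trend only if at least
    one reputable outlet is also covering it."""
    return bool(story.sources & REPUTABLE_OUTLETS)

# The Megyn Kelly hoax, sourced only to endingthefed.com, fails the check:
hoax = Story(topic="Megyn Kelly", sources={"endingthefed.com"})
assert not may_trend(hoax)
```

A check like this is only as good as its whitelist, but it would have caught this particular hoax: no reputable outlet was reporting that Kelly had been fired.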

This scandal isn’t embarrassing for Facebook merely because the Megyn Kelly story was fake. Rather, it stings because the company has primed users to expect (and told them outright, just last week) that the stories that trend will be accurate. For an inaccurate story to trend so soon after all the experienced workers were fired? It prompts anyone who’s been on the wrong side of a corporate consultant’s Excel-driven downsizing to go: har, har, sob.

But maybe the answer isn’t just that Facebook should rehire its contractors. Jonathan Zittrain, a Harvard professor of law and computer science, said he thinks Facebook is too important, and too powerful, for any one editorial approach to hold sway. Describing the sum of all conversation on the world’s biggest message board (which is, after all, what Trending Topics does) might be too weighty a task for any single decision-making schema, he said. What if there were a blank spot on the Facebook homepage, and users could fill it with their own favorite “Trending Topics” algorithm?

“Facebook could say, ‘Trending Topics go here, and we’re going to fill it in with a default—but now you should choose your disk jockey,’” he said. A user could swipe between different algorithmic approaches or even add their own. So one option might be a generic “most popular” feed, another might be a human-edited approach, and two more could focus only on sports or business news. This diversity of approaches would let Facebook off the hook for any individual trending decision, and it could create a better survey of the whole conversation.

“They really can make Trending Topics a spectrum, instead of a universal broadcasting corporation. Why wouldn’t they want to do that?” he asked.
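In software terms, Zittrain’s disk-jockey idea is a pluggable strategy interface: the page defines the slot, and interchangeable ranking functions fill it. A minimal sketch in Python, with hypothetical names throughout, might look like this:

```python
from typing import Protocol

class TrendingStrategy(Protocol):
    """Anything that can rank candidate topics for the trending slot."""
    def rank(self, topics: list, engagement: dict) -> list: ...

class MostPopular:
    """The generic default: raw engagement counts, nothing else."""
    def rank(self, topics, engagement):
        return sorted(topics, key=lambda t: engagement.get(t, 0), reverse=True)

class SportsOnly:
    """A themed disk jockey: same counts, restricted to sports topics."""
    def __init__(self, sports_topics):
        self.sports_topics = set(sports_topics)
    def rank(self, topics, engagement):
        filtered = [t for t in topics if t in self.sports_topics]
        return MostPopular().rank(filtered, engagement)

def render_trending(strategy, topics, engagement, n=5):
    # Facebook fills the slot with a default; the user may swap it out.
    return strategy.rank(topics, engagement)[:n]
```

Because every strategy answers the same rank() call, swiping between disk jockeys is just swapping which object fills the slot; a human-edited feed would simply be one more implementation of the same interface.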

Another option for Facebook would be to forswear any promise of accuracy whatsoever. Watching this episode unfold, I’ve been reminded that if the company in question here weren’t Facebook but Twitter, none of this would even be a news story. Twitter has never promised truth in its own “Trending” feature; it describes that list of stories as nothing more than a simple programmatic accounting of the most popular topics of the day. As a consequence, false stories trend on Twitter roughly once a week. As long as a plurality of Twitter users are talking about something, Twitter will tell you about it: call it the “many people are saying” theory of editorial judgment.
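Note the contrast in mechanism. Twitter-style trending, as described here, is the simplest possible policy: count mentions and surface the winners, with accuracy never consulted. A toy sketch, assuming hypothetical post and topic inputs:

```python
from collections import Counter

def twitter_style_trends(posts, topics, n=10):
    """Surface whatever a plurality of users are talking about.
    Popularity is the only signal; there is no accuracy check anywhere."""
    counts = Counter()
    for post in posts:
        for topic in topics:
            if topic.lower() in post.lower():
                counts[topic] += 1
    return [topic for topic, count in counts.most_common(n)]
```

Under this policy the Megyn Kelly hoax trends the moment enough people share it, which is exactly the point: the feature reports the conversation and promises nothing about the world.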

Forswearing accuracy may be an even-handed way of dealing with the problem of false news, but in today’s news environment it would have an especially deleterious effect. Fake news sites have prospered over the past few years, partly thanks to a guzzle of free attention from Facebook. A former head of one of those sites is now CEO of the Republican presidential campaign. Many people are saying that to give those conspiracy sites the comfort of popularity, via the “trending” module, would gravely harm the public conversation.

Robinson Meyer is a former staff writer at The Atlantic and the former author of the newsletter The Weekly Planet.