Mark Zuckerberg Says He’s Not Resigning

In an interview, the Facebook CEO tells The Atlantic he’s not walking away from the company, but he is looking for outside expertise.


Mark Zuckerberg’s story doesn’t quite line up.

For months, the Facebook chief executive has described the 2016 election as a turning point both for him and for the company over which he holds enormous power.

The cavalcade of scandals that followed that November—disputes over user data, fake news, and Russia’s manipulation of the platform—has led to a “very basic shift in how we view our responsibility,” he said in an interview with The Atlantic on Friday. Now, Zuckerberg is transforming the company, opening it up to public scrutiny in unprecedented ways. “A big theme” going forward, he said, will be getting “independent expertise and assessment of the work that we’re doing.”

Yet Zuckerberg—who is not only Facebook’s CEO, but also the chairman of its board and its majority voting shareholder—struggled to describe when his personal thinking about the company and its philosophy shifted. He could not articulate what changed his mind or drove him to adopt the new approach.

“Well, I certainly feel very bad, and I’m sorry that we did not do a better job of finding the Russian interference during the 2016 election,” Zuckerberg told me. “I mean, that was a huge miss.”

Almost unique among American companies, Facebook is the outgrowth of one man’s sensibility. As that firm now changes its approach to the public, that man hasn’t articulated why. In the days since I spoke to the famously private Zuckerberg, I have wondered whether it matters. Though he runs a company that constantly exhorts people to share how they feel, Zuckerberg himself seems uncomfortable with reflection. He has never, in my memory, appeared vulnerable in public. He has ignored the CEO playbook for a company that faces a crisis of public trust: He does not grovel, and he does not evince embarrassment at the size of the lapse. He does not tell users: I feel your pain.

Should we care? Does Zuckerberg’s reticence only affect the public appearance of the firm—is it just content-free pablum that we’ve been raised to expect as American consumers?—or does it reveal something of Zuckerberg’s actual capacity for leadership? Americans don’t necessarily need Zuckerberg to act like a normal CEO, personifying his company and hanging his head in shame. But they do need him to be commensurate with the institution he leads.

Or perhaps actions speak louder than words—in which case, there are plenty of actions to talk about. Today, Zuckerberg unveils the newest attempt at reform. Facebook will give a committee of senior academics independent access to its data, allowing researchers to study the social network’s effect on democracy and elections. The work will be paid for by foundations spanning the ideological spectrum, and—most importantly—Facebook says it will not be able to veto studies before their publication.

The research program follows the news of a transparency program for political ads, announced last week. When Facebook users in the United States encounter an ad for a candidate or issue, they’ll soon be able to see the identity of the advertiser, the ad’s cost, how it was targeted, and what other ads the advertiser ran.

Taken together, the two measures constitute one of the company’s more significant attempts at self-regulation in its 14-year history. The timing is no happenstance. On Tuesday, Zuckerberg will testify before the Senate Judiciary and Commerce committees; on Wednesday, he will face the House Energy and Commerce Committee.

He has never testified before Congress. The New York Times reports that he has received a “crash course” in warmth and charisma, training with a team of all-star lawyers and a former aide to President George W. Bush. He will need it: While neither he nor the politicians are particularly popular, many of the members of Congress he will face are former federal prosecutors who have spent hundreds of hours of their lives taking depositions. They love to grandstand; Zuckerberg loathes speaking in public at all.

It must be a surreal moment for the 33-year-old. Decisions he made two years ago are suddenly up for debate in the public sphere. Americans are grappling with his enormous personal power, and he appears to be taking some blame for the country’s current political state. After he told me he felt “very bad” about Russian interference on his platform, I asked: Was there an especially dark moment when he came to realize the role that Facebook played in the 2016 election?

Zuckerberg paused for eight long seconds. “I need to think about that,” he said eventually.

Was there any one moment that stood out?

“I’m not sure if there’s one in particular,” he said.

Instead, he told me, a broader evolution had taken place in the company “across a number of these issues, whether it’s foreign interference or fake news or the Cambridge Analytica data-privacy issue recently.”

“I think there’s just been a very basic shift in how we view our responsibility,” he said. “We used to view our role as building tools for people and saying, ‘Hey, we’re going to put this power in your hands.’ And we think people are basically good, and we think that that can have a net positive effect.”

“Now I just think we understand—both because of the ability for us to develop these things and because of the scale at which we operate—that it’s also our responsibility to make sure that all these tools are used well, not just to put them in people’s hands,” Zuckerberg said.

“You know, you can’t just give people a voice,” he told me. “You need to also make sure that that voice is not used for foreign interference in elections or disseminating fake news.”


The most important aspect of the new Facebook reforms—at least according to Zuckerberg—is that they are not just reactive. If Facebook is a castle, it must build its walls, maintain them, and make sure its more than 25,000 employees work to protect them—all while constantly imagining new ways of coming under attack. But Facebook’s adversaries, be they Russian hackers or a team of enterprising bigots, need only find one gap in the ramparts to wage a successful assault.

What went wrong in 2016, Zuckerberg said, was in part a failure of imagination. His team caught Russia trying to interfere with campaigns—for instance by phishing staff—but it didn’t understand the scale of the country’s overall effort. “We were looking for different issues,” he said. “But we were slow in identifying this new kind of attack, and we need to get ahead of that in the future.”

Anticipating different kinds of attacks is one reason why Facebook is now trying to show users more information. Take the company’s new political-ad transparency program.

“Generally, you can come to Facebook, you can post what you want, you can run an ad,” said Zuckerberg. “You post something and then if it violates the community standards, someone flags it to us and then we review it. Increasingly we’re building AI tools to do more of that proactively, but still a lot of it is reactive.”

“Given the importance of these elections, especially 2018—with major elections in the United States, India, Brazil, Mexico, Pakistan, Hungary coming up soon—we just thought that that wasn’t enough,” he said.

So Facebook has taken two steps. First, it will require the individual owners of the largest Facebook pages to verify their names by providing the company with a copy of their government IDs. It will also require them to respond to a letter sent through the mail.

“We will physically mail a code to where you say you are, and you have to be able to access that code that we mail to you in order to be able to run an ad,” said Zuckerberg. “I think that will be quite effective at preventing someone in Russia, for example, from lying and saying that they’re in the United States and able to advertise in the election here.”

Second, it will show Facebook users in the United States more information about each political ad. “Basically, you can see some information about who’s running it, and how much they paid for it, and who [they’re] trying to target, and how many people have seen it,” he said. Facebook users will also see the amount spent in dollars to run it.

Zuckerberg asserted this met an “even higher standard of transparency than what has traditionally existed on TV or print media for ads.”

“There’s never been a way historically to go to some group and say, ‘Hey, you know, you’re trying to, you know, campaign for this candidate. Are you saying messages that are different to different people? Are they contradictory in different places?’” he said. (While this may make Facebook’s transparency effort more extensive than that of TV, traditional media is compelled by law to disclose certain information about political ads. Outlets can be sanctioned if they do not comply.)

But this transparency effort relies on Facebook users to take the initiative themselves. So the company is also opening a new academic research program, as it seeks outside scrutiny from experts.

Under that program, the company will give a steering committee of academics unprecedented access to Facebook—sitting side by side with the company’s data scientists, a spokesman told me—to identify major research questions about the social network. The committee, organized by the Social Science Research Council, will then announce a call for papers and issue grants to fund the research.

Both the committee of academics and their research grants will be funded by a group of independent foundations, including the Hewlett Foundation, the Omidyar Network, and the Charles Koch Foundation. And unlike with current studies about Facebook, which are conducted in conjunction with the company’s product team, Facebook will not be able to review these academic papers before they are published.

Zuckerberg sounded relieved about that effort, in part because it will create a source of information about the company that does not itself come from the company. “In terms of just understanding the amount of problematic content on the platform, I think we need to be more transparent [so] that we can ground some of those debates in fact,” he said.

“I think a lot of the discourse that I see around fake news, for example, is grounded in anecdotes, right? Someone saw something and then they write about it. Which is fair because that’s all the information that’s out there today ... but I think we want to try to move beyond that,” he said.


Academic researchers have begged for access to Facebook’s data for years. After a massive study from MIT found that fake news traveled faster than accurate news on Twitter, the Dartmouth political scientist Brendan Nyhan told me: “We can study Twitter all day, but only about 12 percent of Americans are on it. It’s important for journalists and academics, but it’s not how most people get their news.”

Facebook is the 6 o’clock news of the internet: Americans encounter news there not because they’re political junkies who seek it out, but because it’s presented to them amid statuses, photos, and everything else. More than two-thirds of Americans get their news from social media, and the vast majority of that group uses Facebook.

So transparency might not be enough. Facebook keeps encountering problems that are seemingly impossible to predict: It’s just hard to protect a castle that’s home to 2.1 billion people. And for every Facebook user in the United States, there are dozens in countries like the Philippines, Turkey, and Chile, where “the internet” is synonymous with “Facebook.”

Maybe these problems are inherent to any product or social network of Facebook’s size. Is there a point at which Facebook is too big to be successfully managed at all?

“Well, I think what we need to do is be more transparent about what we’re seeing and find ways to get independent and outside experts to be able to come in, and contribute ideas on how to address these issues, like things that might be problems,” Zuckerberg said. “And then hold us accountable for making sure we do it.”

“That’s exactly why we’re trying to create this independent election commission for research, because we think that getting this right for the elections starting in 2018, and after that, is just such an important thing,” he told me.

But he is largely unready to consider using democratic methods to govern Facebook itself—having users vote on major changes to the platform, for example. “I’ve spoken a bit about how I’d like to create more community-based processes [for] setting our content policies,” he told me. “I think we’re still working through exactly how that would work, but that’s certainly one example.”

Community is a term that Zuckerberg uses almost unfailingly when discussing the $450 billion company. But some of Facebook’s other leaders—and some of its critics—have adopted other terms. Sheryl Sandberg, the company’s chief operating officer, did not push back on Thursday when an NPR reporter called Facebook a “publisher.” In New York last year, the journalist Max Read compared Facebook to the EU, the Catholic Church, and a “faceless Elder God.”

I asked Zuckerberg outright: What is Facebook?

“I mean, I think it’s a lot of things,” he said, then paused. “But you know, overall I think about it as a community.” While that broad description may be an accurate characterization of the platform, it’s also one that helps Facebook avoid seeming to work in businesses that are subject to government regulation.

“I know that a lot of the biggest issues that we face are fundamentally trade-offs between different people’s interests in the community: What speech is allowed? How do you manage the balance between people wanting to share but people also wanting to see what other people are doing? All of these really complex interactions are what makes, I think, the community work well,” he said.

Last week, on a press call with dozens of reporters, he told my colleague Alexis Madrigal that managing conflicts between Facebook-as-a-community and Facebook-as-a-company was “quite easy.”

Whether that assessment is glib, naïve, or something else entirely, it hardly matters, because no one else can do Zuckerberg’s job. He has near-total control of the levers of that community. Even though Facebook went public in 2012, he retained a majority of its voting shares. This means that he can decide at will to sacrifice its short-term profits: In a statement in November, Zuckerberg warned that “we’re investing so much in security that it will impact our profitability.”

But his sole control over Facebook means he also rules unequivocally. If there is one trusted confidant he turns to for advice, he didn’t say. Instead, he said, he looks for guidance everywhere. “I read a lot. I talk to a lot of people. We have a great team here. I think the real challenge is: How do we get so many of the smart people who are out there engaged in some way?”

“What we’re building is unprecedented, and we’re going to need new models to be able to get input and for the public to be able to trust that what we’re doing is good,” he said.

And there is a gap here that clearly nags him, between what he knows and what the public can know. Experts “who people count on to assess how companies are doing can’t actually have access to the data to do it,” he said. “You can’t just have outside academics come in and look at people’s data, right? I mean there are real issues to that—that prevent, I think, both us being able to get [their] opinion and benefit from [them], and from us being able to have a good discourse as a society.”

First, though, Zuckerberg must win back the support of that society. The company is currently embroiled in the worst crisis of trust in its 14-year history. Facebook stock has fallen 14 percent since the Cambridge Analytica scandal broke. Last week, the financial columnist Felix Salmon called for Zuckerberg’s resignation in Wired.

Zuckerberg’s fellow technology executives have been unusually unsparing in their criticism. In March, Elon Musk had Tesla and SpaceX delete their Facebook pages. And when Tim Cook, Apple’s CEO, was asked what he would do if he were in Zuckerberg’s situation, he said: “I wouldn’t be in this situation.”

I asked Zuckerberg: Have you considered resigning at all?

“No, I mean—I am—I do work on philanthropy too, separately. But these issues are very important,” he said. “We’ve also worked on a lot of hard problems over the last 14 years building Facebook. I mean, it started in a dorm room and now it’s this unprecedented community in scale, and I’m very confident that we’re gonna be able to work through these issues.”

Robinson Meyer is a former staff writer at The Atlantic and the former author of the newsletter The Weekly Planet.