Facebook’s Zuckerberg to Reporters: ‘We Didn’t Do Enough’



In an effort to be more transparent and rebuild trust with the media, Facebook CEO Mark Zuckerberg gave more information on data-sharing, privacy, advertising and the Cambridge Analytica scandal on a call with reporters on Wednesday.

He opened with a mea culpa, taking personal responsibility while admitting Facebook’s corporate culture had been too “optimistic” to acknowledge or address the abuses of user privacy and data:

We’re an idealistic and optimistic company… but it’s clear now that we didn’t do enough. We didn’t focus enough on preventing abuse and thinking through how people could use these tools to do harm as well… We didn’t take a broad enough view of what our responsibility is and that was a huge mistake. That was my mistake.

It’s not enough to just connect people. We have to make sure those connections are positive and that they’re bringing people together. It’s not enough just to give people a voice, we have to make sure that people are not using that voice to hurt people or spread misinformation. And it’s not enough to give people tools to sign into apps—we have to make sure that all those developers protect people’s information too.

It’s not enough to have rules requiring that they protect the information. It’s not enough to believe them when they’re telling us they’re protecting information. We actually have to ensure that everyone in our ecosystem protects people’s information.

Zuckerberg also admitted that the actions Facebook is taking to shore up security and regain users’ trust will take years. “This is going to be a never-ending battle. You never fully solve security. It’s an arms race… I think this is a multi-year effort. My hope is that by the end of this year we’ll have turned the corner on a lot of these issues and that people will see that things are getting a lot better.”

Facebook newsfeed changes April 2018

Other highlights from his remarks (read the transcript and listen to the audio for more):

• Zuckerberg admitted he dismissed the concept of fake news too readily and should have responded immediately (hence this week’s newsfeed changes): “I clearly made a mistake by just dismissing fake news as crazy—as having an impact… it was too flippant. I never should have referred to it as crazy.”

• ‘Malicious actors’ used its tools to discover identities and collect data on a massive global scale, so Facebook has deleted 135 Facebook and Instagram accounts belonging to the Internet Research Agency (IRA), the Russian government-connected election interference unit, and removed “a Russian news organization that we determined was controlled and operated by the IRA.”

Facebook graphic — estimates by country of Cambridge Analytica impact (April 2018)

• Up to 87M users (mostly in the U.S.) may have had their data improperly shared with Cambridge Analytica.

• Most Facebook users should assume their data has been exposed: “We believe most people on Facebook could have had their public profile scraped” via its feature for searching by phone number or email address and via its account recovery system.

• Effective immediately, certain app APIs will restrict data access.

• The EU’s looming General Data Protection Regulation (GDPR) privacy rules and standards will be applied everywhere, not just in Europe.

• After making its privacy tools more prominent, Facebook also updated its terms of service and data policy. The updated terms confirm how it collects and shares data across its family of products, including Instagram, WhatsApp, Oculus and Messenger, where private messages are routinely scanned.

• Targeted ads are a continuing part of the user experience: “People tell us if they’re going to see ads they want the ads to be good… that the ads are actually relevant to what they care about… On the one hand people want relevant experiences, and on the other hand I do think there’s some discomfort with how data is used in systems like ads. But I think the feedback is overwhelmingly on the side of wanting a better experience. Maybe it’s 95-5.”

• Zuckerberg believes he’s still fit to serve as CEO: “Life is about learning from the mistakes and figuring out what you need to do to move forward… I think what people should evaluate us on is learning from our mistakes… and if we’re building things people like and that make their lives better… there are billions of people who love the products we’re building.”