Transcript: Wired editor-in-chief Nicholas Thompson on "Face the Nation," April 8, 2018

Facebook CEO Mark Zuckerberg is headed to Capitol Hill for two days of testimony before House and Senate committees about the company's data policies and protection of user data. Facebook has been under fire following the Cambridge Analytica data scandal, and there have been growing calls for regulating the social media giant.

Nicholas Thompson, the editor-in-chief of Wired, joined us to break down the Cambridge Analytica scandal, Zuckerberg's testimony and the issues lawmakers are likely to raise.

The following is a transcript of the interview with Thompson that aired Sunday, April 8, 2018, on "Face the Nation."  


MARGARET BRENNAN: We want to take a closer look now at the Facebook data story, how it's affected users and how the company is responding. To help us do that we're joined now by WIRED Magazine Editor in Chief Nicholas Thompson, who's also a CBS News contributor. He's in New York this morning. Good morning to you, Nick. This is a very confusing story for a lot of people at home. What is the main question that Mark Zuckerberg has been called before Congress to answer?

NICHOLAS THOMPSON: Well the most important thing he's going to answer is what happened to people's data. What happened with Cambridge Analytica? What are you doing to make sure that doesn't happen again? He'll go up there, he'll apologize, he'll explain it, and then what's going to be interesting is what comes next. Is it just retribution, or do we actually try to figure out good tech governing policy? Because we haven't really had a debate about how to regulate these companies in 20 years.

MARGARET BRENNAN: Well that's what Senator Kennedy was just telling us. He's going to be one of the questioners. He said Facebook may be too big to fix, meaning the government may have to regulate it. What does that look like?

NICHOLAS THOMPSON: Yeah, and I do think there is going to be regulation coming. Right, and there's certainly some regulation that would be very sensible, right? You should certainly regulate advertising on Facebook, political advertising on Facebook, so it meets the standards of political advertising on other media platforms. That's a good idea. You should also probably have some kind of structure for privacy regulation, right? You could model it after what has been done in Europe to make sure that people have control over their data and that the companies have requirements to make their privacy settings very clear. That would be a good idea too. Once you get beyond that and you get into specific regulations about speech, when you get into specific discussions about antitrust, then it gets very complicated and there's a lot of risk.

MARGARET BRENNAN: But we've already seen Facebook kind of try to get ahead of these hearings. Tomorrow they're going to disclose to users if their data was shared, breached whatever word you want to use. They're also talking about forcing some disclosures on political ads in terms of where that person or entity was located and who paid for the ad. Is that enough to sort of soften the blow?

NICHOLAS THOMPSON: I think it is enough to soften the blow. It's probably not enough overall.

There does need to be some government regulation that goes beyond what Facebook is doing. Facebook has announced about 20 policy changes in the last few weeks. They are very good changes. They do protect you. They do open things up. They will make political campaigns clearer and fairer. But there also is a role for Congress, both in setting specific regulations and also setting some guidelines for Facebook to follow in the future.

MARGARET BRENNAN: Is there truth to what Senator Kennedy was telling us that Facebook doesn't even know who's running ads on Facebook. Is there truth to that? Do they really not know? They know a lot about users.

NICHOLAS THOMPSON: They don't know everything about who is running ads on Facebook, right? They didn't know that the IRA, the Russian propaganda group, was running ads on Facebook because the IRA had hidden its purchases. But I think he was overstating that a little bit. Facebook does have a good sense, right? People enter their financial information. They buy the ads. Facebook now has lots of people monitoring them, looking for suspicious behavior. They've set up their AI systems to monitor for suspicious behavior. So it's an overstatement to say that they don't know who's advertising, but there are certainly certain things they don't know.

MARGARET BRENNAN: Well they seem to be playing catch up though because none of these things stopped those buyers from that Russian propaganda unit in the 2016 election.

NICHOLAS THOMPSON: Yeah, so Facebook was caught totally unprepared during the 2016 election and they are still paying for that, right? They stuck their heads in the sand. They were not paying attention to the fake news, to the propaganda operations. Since then though they have adjusted their algorithms, they have hired tons of people. They are working very hard. So the bad guys are going to work harder at hiding what they do. But Facebook is also going to work much harder at uncovering what's going on. So the odds that we have as much manipulation, as much chaos in 2018 or 2020 as we had in 2016? I think they're small. I think Facebook is getting a handle on this, but yeah, you're totally right. They were absolutely unprepared in '16.

MARGARET BRENNAN: Do you think-- you've interviewed Mark Zuckerberg. Do you think that he gets it now, that he understands the weight of the outrage?

NICHOLAS THOMPSON: I think that there's been a real education process for Mark Zuckerberg that began the day that Trump was elected, because remember, Trump's philosophy, which is somewhat tribalistic, is entirely different from Zuckerberg's philosophy, which is bring everyone in the world together. And so the day after the election I think Zuckerberg started to realize, wait, did my system do this? Am I responsible for this? And he's gone through a lot in the last year and a half. And I think you've seen a real education, a real evolution. I mean, he's still making all kinds of unforced errors and mistakes, but I think that Zuckerberg is really grappling, and I think he's understanding that this platform that he genuinely thought could only do good for the world actually can be manipulated, and that's the story of the last two years. It's Mark Zuckerberg realizing that the tools he built can be used for ill as well as for good. And that's something he had not realized.

MARGARET BRENNAN: One of his top executives, Sheryl Sandberg, tried in many ways to lay the groundwork for this testimony. This week she was all over news networks apologizing on Mark Zuckerberg's behalf and her own. How effective was that?

NICHOLAS THOMPSON: It was fine. I mean she and Zuckerberg have--.

MARGARET BRENNAN: That doesn't sound very convincing, Nick.

NICHOLAS THOMPSON: Well, look at the reaction to it. I don't think anybody said, "Oh, you know, now we're totally sympathetic to them." Public opinion is still completely against Facebook, I mean, and that is why Congress is going to be out for blood on Tuesday and Wednesday. If you look at the public perception of both Zuckerberg and Sandberg, and also of the company, it's terrible. The stock market is mad, the employees are upset. People are really upset at Facebook right now. So I think she said the right things. I think Zuckerberg has been saying the right things. I thought his conference call with the media the other day went very well.

On the other hand, they didn't respond to this crisis nearly as quickly as they should have. And they're still paying for that, and they're still paying for years and years of sins. One of the ironies here is that I think this Cambridge Analytica scandal has been a little blown out of proportion, right? What happened in this specific instance isn't quite as terrible as people make it out to be. And Facebook's not as much at fault as people make them out to be. On the other hand --

MARGARET BRENNAN: You're talking about data scraping and the use by an outside application of this information without the users' knowledge?

NICHOLAS THOMPSON: Yes, that particular scandal, it's bad, but it's been a little blown out of proportion. On the other hand, Facebook has been violating our privacy and not paying any price for it for 12 years. So in some ways this is the comeuppance for 12 years of sort of small privacy violations and breaches of trust that Facebook hasn't really been punished for. So they're being punished too much for this specific crime, but maybe the right amount for the accumulation of things over the last decade.

MARGARET BRENNAN: That's going to be fascinating to watch. Nick, thank you very much.