A number of Facebook’s recent decisions have fueled criticism that continues to follow the company, including the decision not to fact-check political advertising and the inclusion of Breitbart News in the company’s new “trusted sources” News tab. These controversies were stoked even further by Mark Zuckerberg’s speech at Georgetown University last week, where he tried, mostly unsuccessfully, to portray Facebook as a defender of free speech. CJR thought all of these topics were worth discussing with free-speech experts and researchers who focus on the power of platforms like Facebook, so we convened an interview series this week on our Galley discussion platform, featuring guests such as Alex Stamos, former chief security officer of Facebook; veteran tech journalist Kara Swisher; Jillian York of the Electronic Frontier Foundation; Harvard Law professor Jonathan Zittrain; and law professor Kate Klonick.

Stamos, one of the first to raise the issue of potential Russian government involvement on Facebook’s platform while he was the head of security there, said he had a number of issues with Zuckerberg’s speech, including the fact that he “compressed all of the different products into this one blob he called Facebook. That’s not a useful frame for pretty much any discussion of how to handle speech issues.” Stamos said the News tab is arguably a completely new category of product, a curated and in some cases paid-for selection of media, and that this means the company has much more responsibility for what appears there. Stamos also said that there are “dozens of Cambridge Analyticas operating today collecting sensitive data on individuals and using it to target ads for political campaigns. They just aren’t dumb enough to get their data through breaking an API agreement with Facebook.”

Ellen Goodman, co-founder of the Rutgers Institute for Information Policy & Law, said Mark Zuckerberg isn’t the first to struggle with tensions between free speech and democratic discourse: “it’s just that he’s confronting these questions without any connection to press traditions, with only recent acknowledgment that he runs a media company, in the absence of any regulation, and with his hands on personal data and technical affordances that enable microtargeting.” Klonick said Zuckerberg spoke glowingly about early First Amendment cases but got one of the most famous, NYT v. Sullivan, wrong. “The case really stands for the idea of tolerating even untrue speech in order to empower citizens to criticize political figures,” Klonick said. “It is not about privileging political figures’ speech, which of course is exactly what the new Facebook policies do.”

Evelyn Douek, a doctoral student at Harvard Law and an affiliate at the Berkman Klein Center for Internet & Society, said most of Zuckerberg’s statements about his commitment to free speech were based on the old idea of a marketplace of ideas being the best path to truth. This metaphor has always been questionable, Douek said, “but it makes no sense at all in a world where Facebook constructs, tilts, distorts the marketplace with its algorithms that favor a certain kind of content.” She said Facebook’s amplification of certain kinds of information via the News Feed algorithm “is a cause of a lot of the unease with our current situation, especially because of the lack of transparency.” York said the political ad issue is a tricky one. “I do think that fact-checking political ads is important, but is this company capable of that? These days, I lean toward thinking that maybe Facebook just isn’t the right place for political advertising at all.”

Swisher said: “The problem is that this is both a media company, a telephone company and a tech company. As it is architected, it is impossible to govern. Out of convenience we have handed over the keys to them and we are cheap dates for doing so. You get a free map and quick delivery? They get billions and control the world.” Zittrain said the political ad fact-checking controversy is about more than just a difficult product feature. “Evaluating ads for truth is not a mere customer service issue that’s solvable by hiring more generic content staffers,” he said. “The real issue is that a single company controls far too much speech of a particular kind, and thus has too much power.” Dipayan Ghosh, who runs the Platform Accountability Project at Harvard, warned that Facebook’s policy to allow misinformation in political ads means a politician “will have the opportunity to engage in coordinated disinformation operations in precisely the same manner that the Russian disinformation agents did in 2016.”

Today and tomorrow we will be speaking with Jameel Jaffer of the Knight First Amendment Institute, Claire Wardle of First Draft, and Sam Lessin, a former VP of product at Facebook, so please tune in.
