Facebook staffers walk out saying Trump’s posts should be reined in | ABS-CBN News

Facebook employees walked away from their work-from-home desks on Monday and took to Twitter to accuse Chief Executive Mark Zuckerberg of failing to police US President Donald Trump’s posts as strictly as rival platform Twitter has done.

Reuters saw dozens of online posts from employees criticizing Zuckerberg’s decision to leave Trump’s most inflammatory rhetoric unchallenged where Twitter had labeled it. Some top managers participated in the protest, reminiscent of a 2018 walkout at Alphabet Inc’s Google over sexual harassment.

It was a rare case of staff publicly taking their CEO to task, with one employee tweeting that thousands participated. Among them were all seven engineers on the team maintaining the React code library which supports Facebook’s apps.

“Facebook’s recent decision to not act on posts that incite violence ignores other options to keep our community safe. We implore the Facebook leadership to #TakeAction,” they said in a joint statement published on Twitter.

“Mark is wrong, and I will endeavor in the loudest possible way to change his mind,” wrote Ryan Freitas, identified on Twitter as director of product design for Facebook’s News Feed. He added he had mobilized “50+ like-minded folks” to lobby for internal change.

A Facebook employee said Zuckerberg’s weekly Friday question-and-answer session would be moved up this week to Tuesday.

Katie Zhu, a product manager at Instagram, tweeted a screenshot showing she had entered “#BLACKLIVESMATTER” to describe her request for time off as part of the walkout.

Facebook Inc will allow employees participating in the protest to take the time off without drawing down their vacation days, spokesman Andy Stone said.

Separately, online therapy company Talkspace said it ended partnership discussions with Facebook. Talkspace CEO Oren Frank tweeted he would “not support a platform that incites violence, racism, and lies.”

SOCIAL JUSTICE

Tech workers at companies including Facebook, Google, and Amazon.com Inc have pursued social justice issues in recent years, urging the companies to change policies.

Employees “recognize the pain many of our people are feeling right now, especially our Black community,” Stone wrote in a text.

“We encourage employees to speak openly when they disagree with leadership. As we face additional difficult decisions around content ahead, we’ll continue seeking their honest feedback.”

Nationwide unrest erupted last week after George Floyd, a black man, died in police custody in Minneapolis. Video footage showed a white officer kneeling on Floyd’s neck for nearly nine minutes before he died.

On Friday, Twitter Inc affixed a warning label to a Trump tweet that included the phrase “when the looting starts, the shooting starts.” Twitter said the tweet violated its rules against glorifying violence but left it up under a public-interest exception.

Facebook declined to act on the same message, and Zuckerberg sought to distance his company from the fight between the president and Twitter.

On Friday, Zuckerberg said in a Facebook post that while he found Trump’s remarks “deeply offensive,” they did not violate company policy against incitement to violence, and that people should be able to see whether the government was planning to deploy force.

Zuckerberg’s post also said Facebook had been in touch with the White House to explain its policies.

Jason Toff, a director of product management and former head of short-form video app Vine, was one of several Facebook employees organizing fundraisers for racial justice groups in Minnesota. Zuckerberg wrote on Facebook on Monday the company would contribute an additional $10 million to social justice causes.

Toff tweeted: “I work at Facebook and I am not proud of how we’re showing up. The majority of coworkers I’ve spoken to feel the same way. We are making our voice heard.” 

Facebook, free speech, and political ads – Columbia Journalism Review

A number of Facebook’s recent decisions have fueled criticism that continues to follow the company, including the decision not to fact-check political advertising and the inclusion of Breitbart News in the company’s new “trusted sources” News tab. These controversies were stoked even further by Mark Zuckerberg’s speech at Georgetown University last week, where he tried—mostly unsuccessfully—to portray Facebook as a defender of free speech. CJR thought all of these topics were worth discussing with free-speech experts and researchers who focus on the power of platforms like Facebook, so we convened an interview series this week on our Galley discussion platform, featuring guests like Alex Stamos, former chief security officer of Facebook, veteran tech journalist Kara Swisher, Jillian York of the Electronic Frontier Foundation, Harvard Law professor Jonathan Zittrain, and Stanford researcher Kate Klonick.

Stamos, one of the first to raise the issue of potential Russian government involvement on Facebook’s platform while he was the head of security there, said he had a number of issues with Zuckerberg’s speech, including the fact that he “compressed all of the different products into this one blob he called Facebook. That’s not a useful frame for pretty much any discussion of how to handle speech issues.” Stamos said the News tab is arguably a completely new category of product, a curated and in some cases paid-for selection of media, and that this means the company has much more responsibility for what appears there. Stamos also said that there are “dozens of Cambridge Analyticas operating today collecting sensitive data on individuals and using it to target ads for political campaigns. They just aren’t dumb enough to get their data through breaking an API agreement with Facebook.”

Ellen Goodman, co-founder of the Rutgers Institute for Information Policy & Law, said that Mark Zuckerberg isn’t the first to have to struggle with tensions between free speech and democratic discourse, “it’s just that he’s confronting these questions without any connection to press traditions, with only recent acknowledgment that he runs a media company, in the absence of any regulation, and with his hands on personal data and technical affordances that enable microtargeting.” Kate Klonick of Stanford said Zuckerberg spoke glowingly about early First Amendment cases, but got one of the most famous—NYT v Sullivan—wrong. “The case really stands for the idea of tolerating even untrue speech in order to empower citizens to criticize political figures,” Klonick said. “It is not about privileging political figures’ speech, which of course is exactly what the new Facebook policies do.”

Evelyn Douek, a doctoral student at Harvard Law and an affiliate at the Berkman Klein Center For Internet & Society, said most of Zuckerberg’s statements about his commitment to free speech were based on the old idea of a marketplace of ideas being the best path to truth. This metaphor has always been questionable, Douek says, “but it makes no sense at all in a world where Facebook constructs, tilts, distorts the marketplace with its algorithms that favor a certain kind of content.” She said Facebook’s amplification of certain kinds of information via the News Feed algorithm “is a cause of a lot of the unease with our current situation, especially because of the lack of transparency.” EFF director Jillian York said the political ad issue is a tricky one. “I do think that fact-checking political ads is important, but is this company capable of that? These days, I lean toward thinking that maybe Facebook just isn’t the right place for political advertising at all.”

Swisher said: “The problem is that this is both a media company, a telephone company and a tech company. As it is architected, it is impossible to govern. Out of convenience we have handed over the keys to them and we are cheap dates for doing so. You get a free map and quick delivery? They get billions and control the world.” Zittrain said the political ad fact-checking controversy is about more than just a difficult product feature. “Evaluating ads for truth is not a mere customer service issue that’s solvable by hiring more generic content staffers,” he said. “The real issue is that a single company controls far too much speech of a particular kind, and thus has too much power.” Dipayan Ghosh, who runs the Platform Accountability Project at Harvard, warned that Facebook’s policy to allow misinformation in political ads means a politician “will have the opportunity to engage in coordinated disinformation operations in precisely the same manner that the Russian disinformation agents did in 2016.”

Today and tomorrow we will be speaking with Jameel Jaffer of the Knight First Amendment Institute, Claire Wardle of First Draft and Sam Lessin, a former VP of product at Facebook, so please tune in.
