Facebook wants to know how it’s shaping the 2020 elections — researchers say it’s looking too late and in the wrong places (FB)


Facebook was first warned in late 2015 that Cambridge Analytica was misusing data illicitly harvested from millions of Americans in an attempt to sway the 2016 US elections.

It didn’t pull the plug on the firm’s access to user data until March 2018 after reporting from The Guardian turned the breach into a global scandal.

More than two years later — and barely two months before the deadline for voters to cast their ballots in the 2020 elections — Facebook has decided it wants to know more about how it impacts democracy, announcing last week that it would partner with 17 researchers to study the impact of Facebook and Instagram on voters’ attitudes and actions.

But researchers outside of the project are conflicted. While they praised Facebook for promising more transparency and independence than it has offered in the past, they also questioned why the company waited so long and just how much this study will really bring to light.

“Isn’t this a little bit too late?” Fadi Quran, a campaign director with nonprofit research group Avaaz, told Business Insider.

“Facebook has known now for a long time that there’s election interference, that malicious actors are using the platform to influence voters,” he said. “Why is this only happening now at such a late stage?” 

Facebook said it doesn’t “expect to publish any findings until mid-2021 at the earliest.” The company did not reply to a request for comment on this story.

Since the company is leaving it to the research team to decide which questions to ask and draw their own conclusions — a good thing — we don’t yet know much about what they hope to learn. In its initial announcement, Facebook said it’s curious about: “whether social media makes us more polarized as a society, or if it largely reflects the divisions that already exist; if it helps people to become better informed about politics, or less; or if it affects people’s attitudes towards government and democracy, including whether and how they vote.”

Facebook executives have reportedly known the answer to that first question — that the company’s algorithms do help polarize and radicalize people — and have knowingly shut down efforts to fix the issue or even research it further.

But even setting that aside, researchers say they’ve already identified some potential shortcomings in the study.

“A lot of the focus of this work is very much about how honest players are using these systems,” Laura Edelson, a researcher who studies political ads and misinformation at New York University, told Business Insider.

“Where I’m concerned is that they’re almost exclusively not looking at the ways that things are going wrong, and that’s where I wish this was going further,” she added.

Quran echoed that assessment, saying: “One big thing that they’re going to miss by not looking more deeply at these malicious actors, and just by the design, is the scale of content that’s been created by these actors and that’s influencing public opinion.”

A long list of research and media reports has documented Facebook’s struggles to effectively keep political misinformation off its platform — let alone misleading health claims, which, despite Facebook’s more aggressive approach, still racked up four times as many views as posts from sites pushing accurate information, according to Avaaz.

But political information is much more nuanced and constantly evolving, and even in what seem to be clear-cut cases, Facebook has, according to reports, at times incorrectly enforced its own policies or bent over backward to avoid possible political backlash.

Quran and Edelson both worried that Facebook’s election study may not capture the full impact of aspects of the platform like its algorithms, billions of fake accounts, or private groups.

“You find what you go and you look for,” Edelson said. “The great problem of elections on Facebook is not how the honest actors are working within the system.”

Quran also said, though it’s too early to say this will happen for sure, that because it’s Facebook asking users directly within their apps to join the study, sometimes in exchange for payment, it risks inadvertently screening out people who are distrustful of the company to begin with.

“We’re already seeing posts on different groups that share disinformation telling people: ‘Don’t participate in the study, this is a Facebook conspiracy’” to spy on users or keep Republicans off the platform ahead of the election, he said. “What this could lead to, potentially, is that the people most impacted by disinformation are not even part of the study.”

In a best-case scenario, Edelson said the researchers could learn valuable information about how our existing understanding of elections maps onto the digital world. Quran said the study could even serve as an “information ecosystem impact assessment,” similar to environmental impact studies, that would help Facebook understand how changes it could make might impact the democratic process.

But both were skeptical that Facebook would make major changes based on this study or the 2020 elections more broadly. And Quran warned that, despite Facebook’s efforts to make the study independent, people shouldn’t take the study as definitive or allow it to become a “stamp of approval.”

It took Facebook nearly four years from when it learned about Cambridge Analytica to identify the tens of thousands of apps that were also misusing data. And though it just published the results of its first independent civil rights audit, the company has made few commitments to implement any of the auditors’ recommendations.


Related posts

Facebook flags Bruce Springsteen pro-Biden ‘The Rising’ video for ‘false information’


Chris Jordan
Asbury Park Press
Published 12:54 AM EDT Aug 19, 2020

Oops. 

Facebook flagged Bruce Springsteen for spreading “false information” on Tuesday, Aug. 18, but the company says it was all a mistake.

The Democratic National Convention video of the Bruce Springsteen song “The Rising,” in which Springsteen and wife Patti Scialfa make an appearance, was removed from Springsteen’s verified Facebook page at approximately 9:30 p.m. EDT on Tuesday, Aug. 18.

“Facebook found this post repeats information about COVID-19 that multiple independent fact-checkers say is false,” read an explanation superimposed over a faded image of the video.


About two and a half hours later, the label was removed and the video was viewable.

“The label was applied by mistake and was quickly removed once we became aware of the issue,” Facebook spokesperson Katie Derkits told the USA Today Network New Jersey via email.

The video features Springsteen’s 2002 song “The Rising” framed as a message of resiliency against the Donald Trump presidency. Scenes of a COVID-19-ravaged America, including an empty subway and football stadium, are shown as “The Rising” begins. That’s contrasted with the march of neo-Nazis with torches in Charlottesville, Virginia, and Trump throwing paper towels to hurricane victims in Puerto Rico.

After that, first responders, George Floyd protesters, Black Lives Matter sign makers, mask wearers and more who are dedicated to the Rising are shown.


A Facebook “Science Feedback” explanation of the video removal, accessible by clicking on the label, said that “SARS-CoV-2 is a novel coronavirus that arose naturally; no patent exists for SARS-CoV-2; no COVID-19 vaccine exists yet.”

Reps from the Democratic National Convention and Springsteen did not reply to a request for comment by press time. 

Alberto Engeli of Asbury Park attempted to share “The Rising” video on Facebook on Tuesday night and was blocked.

“I don’t understand why, I could only imagine the fact checkers are from multiple organizations and they’re Republican,” said Engeli via email before the video was restored. “I don’t see any relation with Sars-CoV 2 or that (the video) is spreading bogus coronavirus conspiracy theories.”

While it was down on Springsteen’s Facebook page, the video was viewable on Instagram, including Springsteen’s verified page, on YouTube and on Twitter, including Springsteen’s verified page there, where he shared Democratic presidential candidate Joe Biden’s tweet featuring the “Rising” video.

Chris Jordan, a Jersey Shore native, covers entertainment and features for the USA Today Network New Jersey. Contact him at @chrisfhjordan; cjordan@app.com.  

Related posts

My favourite film aged 12: Gold | The Guardian

The quick answer is: “No.” The longer answer is that it depends on your expectations. If you feel certain you are about to watch an execrable film, you will be pleasantly surprised: Gold is a perfectly serviceable thriller, with some tense moments and a genuinely exciting climax in the flooded mine at Pinewood.

One of the reasons the film isn’t as shit as it should be is that it was made, in no small part, by members of the James Bond team. Peter Hunt directs – he was editor of the early Bonds and directed On Her Majesty’s Secret Service. John Glen, later to direct five Bonds, edits and directs the second unit. The production designer is Syd Cain, who did From Russia With Love, OHMSS and Live and Let Die. Those guys are responsible for making two men and a dinghy floating around at Pinewood seem an exciting climax.

Then there’s Roger’s character, Rod Slater, a maverick, woman-chasing commitment-phobe ultimately prepared to die to save the mine (and its miners), which he almost does. There’s a suggestion a genuine relationship may be on the cards with Terry Steyner (York), whose evil husband has conveniently died a few minutes earlier and who is on hand to look after Rod in an ambulance.

It’s definitely one of Roger’s best non-007 performances. This was 1974, so he hadn’t established his Bond persona. Hunt pushes him to be as serious as possible. I certainly believe he’s a miner. And that he’s younger than his 46 years – he’s in good shape, the hair’s more tousled than usual and there’s a bit more sweat than Bond. And he gets quite badly injured at the end. Or at least his arms do.

My sense is that York fought hard to be more than just another Bond girl, making her character as strong as possible. There’s a great scene where she flies Moore back to the stricken mine in her plane (she’s rich) and he accuses her of being involved in the conspiracy. She’s outraged: she won’t take any shit from Roger. As it happens, she is involved in the conspiracy, but she doesn’t know that yet.

And the villain? We’re very much in the “speak quickly with a slight smile” stage of Sir John Gielgud’s film career but there’s a wonderful moment where one of the sub-villains tells him they’ve commissioned a survey that shows just how close the mine is to water but have cleverly replaced every mention of the word “water” with the word “gold”.

“Ingenious”, says Gielgud, without the smile.

There’s also the brilliantly intrusive score by Elmer Bernstein, a crucial reason the action sequences are so tense and exciting. The song Jimmy Helms belts at the start and the end is magnificently absurd too.

I truly think this is Roger’s best non-Bond. Others tout The Man Who Haunted Himself (1970), in which Roger does actually have to do a fair bit of acting, playing a good guy and his evil doppelganger. That probably is his best performance. At the end of his life, knighted, Sir Roger certainly thought so. But I think Gold is the better film. The bar isn’t high, but Gold is a perfectly enjoyable romp one could happily sit through, some warm self-isolated evening.
