Why the fight against disinformation, sham accounts and trolls won’t be any easier in 2020

The big tech companies have announced aggressive steps to keep trolls, bots and online fakery from marring another presidential election — from Facebook’s removal of billions of fake accounts to Twitter’s spurning of all political ads.

But it’s a never-ending game of whack-a-mole that’s only getting harder as we barrel toward the 2020 election. Disinformation peddlers are deploying new, more subversive techniques, and American operatives have adopted some of the deceptive tactics Russians tapped in 2016. Now, tech companies face thorny and sometimes subjective choices about how to combat them — at times drawing flak from both Democrats and Republicans as a result.

This is our roundup of some of the evolving challenges Silicon Valley faces as it tries to counter online lies and bad actors heading into the 2020 election cycle:

1) American trolls may be a greater threat than Russians

Russia-backed trolls notoriously flooded social media with disinformation around the presidential election in 2016, in what Robert Mueller’s investigators described as a multimillion-dollar plot involving years of planning, hundreds of people and a wave of fake accounts posting news and ads on platforms like Facebook, Twitter and Google-owned YouTube.

This time around — as experts have warned — a growing share of the threat is likely to originate in America.

“It’s likely that there will be a high volume of misinformation and disinformation pegged to the 2020 election, with the majority of it being generated right here in the United States, as opposed to coming from overseas,” said Paul Barrett, deputy director of New York University’s Stern Center for Business and Human Rights.

Barrett, the author of a recent report on 2020 disinformation, noted that lies and misleading claims about 2020 candidates originating in the U.S. have already spread across social media. Those include manufactured sex scandals involving South Bend, Ind., Mayor Pete Buttigieg and Sen. Elizabeth Warren (D-Mass.) and a smear campaign calling Sen. Kamala Harris (D-Calif.) “not an American black” because of her multiracial heritage. (The latter claim got a boost on Twitter from Donald Trump Jr.)

Before last year’s midterm elections, Americans similarly amplified fake messages such as a “#nomenmidterms” hashtag that urged liberal men to stay home from the polls to make “a Woman’s Vote Worth more.” Twitter suspended at least one person — actor James Woods — for retweeting that message.

“A lot of the disinformation that we can identify tends to be domestic,” said Nahema Marchal, a researcher at the Oxford Internet Institute’s Computational Propaganda Project. “Just regular private citizens leveraging the Russian playbook, if you will, to create … a divisive narrative, or just mixing factual reality with made-up facts.”

Tech companies say they’ve broadened their fight against disinformation as a result. Facebook, for instance, announced in October that it had expanded its policies against “coordinated inauthentic behavior” to reflect a rise in disinformation campaigns run by non-state actors, domestic groups and companies. But people tracking the spread of fakery say it remains a problem, especially inside closed groups like those popular on Facebook.

2) And policing domestic content is tricky

U.S. law forbids foreigners from taking part in American political campaigns — a fact that made it easy for members of Congress to criticize Facebook for accepting rubles as payment for political ads in 2016.

But Americans are allowed, even encouraged, to partake in their own democracy — which makes things a lot more complicated when they use social media tools to try to skew the electoral process. For one thing, the companies face a technical challenge: Domestic meddling doesn’t leave obvious markers such as ads written in broken English and traced back to Russian internet addresses.

More fundamentally, there’s often no clear line between bad-faith meddling and dirty politics. It’s not illegal to run a mud-slinging campaign or engage in unscrupulous electioneering. And the tech companies are wary of being seen as infringing on Americans’ right to engage in political speech — all the more so as conservatives such as President Donald Trump accuse them of silencing their voices.

Plus, the line between foreign and domestic can be blurry. Even in 2016, the Kremlin-backed troll farm known as the Internet Research Agency relied on Americans to boost its disinformation. Now, claims with hazy origins are being picked up without the need for a coordinated 2016-style foreign campaign. Simon Rosenberg, a longtime Democratic strategist who has spent recent years focused on online disinformation, points to Trump’s promotion of the theory that Ukraine significantly meddled in the 2016 U.S. election, a charge that some experts trace back to Russian security forces.

“It’s hard to know if something is foreign or domestic,” said Rosenberg, once it “gets swept up in this vast ‘Wizard of Oz’-like noise machine.”

3) Bad actors are learning

Experts agree on one thing: The election interference tactics that social media platforms encounter in 2020 will look different from those they’ve been trying to fend off since 2016.

“What we’re going to see is the continued evolution and development of new approaches, new experimentation trying to see what will work and what won’t,” said Lee Foster, who leads the information operations intelligence analysis team at the cybersecurity firm FireEye.

Foster said the “underlying motivations” of undermining democratic institutions and casting doubt on election results will remain constant, but the trolls have already evolved their tactics.

For instance, they’ve gotten better at obscuring their online activity to avoid automatic detection, even as social media platforms ramp up their use of artificial intelligence software to dismantle bot networks and eradicate inauthentic accounts.

“One of the challenges for the platforms is that, on the one hand, the public understandably demands more transparency from them about how they take down or identify state-sponsored attacks or how they take down these big networks of inauthentic accounts, but at the same time they can’t reveal too much at the risk of playing into bad actors’ hands,” said Oxford’s Marchal.

Researchers have already observed extensive efforts to distribute disinformation through user-generated posts — known as “organic” content — rather than the ads or paid messages that were prominent in the 2016 disinformation campaigns.

Foster, for example, cited trolls impersonating journalists or other more reliable figures to give disinformation greater legitimacy. And Marchal noted a rise in the use of memes and doctored videos, whose origins can be difficult to track down. Jesse Littlewood, vice president at advocacy group Common Cause, said social media posts aimed at voter suppression frequently appear no different from ordinary people sharing election updates in good faith — messages such as “you can text your vote” or “the election’s a different day” that can be “quite harmful.”

Tech companies insist they are learning, too. Since the 2016 election, Google, Facebook and Twitter have devoted security experts and engineers to tackling disinformation in national elections across the globe, including the 2018 midterms in the United States. The companies say they have gotten better at detecting and removing fake accounts, particularly those engaged in coordinated campaigns.

But other tactics may have escaped detection so far. NYU’s Barrett noted that disinformation-for-hire operations sometimes employed by corporations may be ripe for use in U.S. politics, if they’re not already.

He pointed to a recent experiment conducted by the cyber threat intelligence firm Recorded Future, which said it paid two shadowy Russian “threat actors” a total of just $6,050 to generate media campaigns promoting and trashing a fictitious company. Barrett said the project was intended “to lure out of the shadows firms that are willing to do this kind of work,” and demonstrated how easy it is to generate and sow disinformation.

Real-life examples include a hyper-partisan news operation started by a former Fox News executive and Facebook’s accusations that an Israeli social media company profited from creating hundreds of fake accounts. That “shows that there are firms out there that are willing and eager to engage in this kind of underhanded activity,” Barrett said.

4) Not all lies are created equal

Facebook, Twitter and YouTube are largely united in trying to take down certain kinds of false information, such as targeted attempts to drive down voter turnout. But their enforcement has been more varied when it comes to material that is arguably misleading.

In some cases, the companies label the material factually dubious or use their algorithms to limit its spread. But in the lead-up to 2020, the companies’ rules are being tested by political candidates and government leaders who sometimes play fast and loose with the truth.

“A lot of the mainstream campaigns and politicians themselves tend to rely on a mix of fact and fiction,” Marchal said. “It’s often a lot of … things that contain a kernel of truth but have been distorted.”

One example is the flap over a Trump campaign ad — which appeared on Facebook, YouTube and some television networks — suggesting that former Vice President Joe Biden had pressured Ukraine into firing a prosecutor to squelch an investigation into an energy company whose board included Biden’s son Hunter. In fact, the Obama administration and multiple U.S. allies had pushed for removing the prosecutor for slow-walking corruption investigations. The ad “relies on speculation and unsupported accusations to mislead viewers,” the nonpartisan site FactCheck.org concluded.

The debate has put tech companies at the center of a tug of war in Washington. Republicans have argued for more permissive rules to safeguard constitutionally protected political speech, while Democrats have called for greater limits on politicians’ lies.

Democrats have especially lambasted Facebook for refusing to fact-check political ads, and have criticized Twitter for letting politicians lie in their tweets and Google for limiting candidates’ ability to finely tune the reach of their advertising — all examples, the Democrats say, of Silicon Valley ducking the fight against deception.

Jesse Blumenthal, who leads the tech policy arm of the Koch-backed Stand Together coalition, said expecting Silicon Valley to play truth cop places an undue burden on tech companies to litigate messy disputes over what’s factual.

“Most of the time the calls are going to be subjective, so what they end up doing is putting the platforms at the center of this rather than politicians being at the center of this,” he said.

Further complicating matters, social media sites have generally granted politicians considerably more leeway to spread lies and half-truths through their individual accounts and in certain instances through political ads. “We don’t do this to help politicians, but because we think people should be able to see for themselves what politicians are saying,” Facebook CEO Mark Zuckerberg said in an October speech at Georgetown University in which he defended his company’s policy.

But Democrats say tech companies shouldn’t profit off false political messaging.

“I am supportive of these social media companies taking a much harder line on what content they allow in terms of political ads and calling out lies that are in political ads, recognizing that that’s not always the easiest thing to draw those distinctions,” Democratic Rep. Pramila Jayapal of Washington state told POLITICO.

Article originally published on POLITICO Magazine

Russia and 2020 Elections

One week after Robert Mueller’s testimony shined a spotlight, once again, on election interference, Senate Majority Leader Mitch McConnell is feeling the heat. The leader turned heads on the Senate floor Monday as he rose to decry critics who have dubbed him “a Russian asset” and “Moscow Mitch” for stonewalling congressional measures to improve election security. And with momentum building in the House to formally start impeachment proceedings against President Trump, the pressure is unlikely to let up anytime soon.

Focusing on election interference from 2016 is backwards thinking, though, at least according to Virginia Senator Mark Warner. With 2020 just around the corner, he tells WIRED—in an exclusive interview—that the upcoming election is where both parties need to direct their attention right now.

As the top-ranking Democrat on the Senate Intelligence Committee, Warner has long been a vocal proponent of new legislation to strengthen election protections, such as the Honest Ads Act, which would compel Silicon Valley firms to disclose when political ads are paid for by a foreign nation. He’s also behind a bill that would require campaigns to alert federal officials if they’re approached by a foreign operative offering information or other assistance. Both bills have bipartisan support—Senator Susan Collins became the first Republican to cosponsor the Foreign Influence Reporting in Elections Act earlier this week.

Even as GOP leaders try to position election security as a partisan issue, Warner—a former governor of Virginia and a cofounder of the firm that eventually became Nextel—has maintained the respect of his colleagues across the aisle. But his frustration seems to be growing, especially now that Trump has tapped Representative John Ratcliffe (R-Texas) to be his next director of national intelligence. Unlike Senate Minority Leader Chuck Schumer, who has already come out opposed to Ratcliffe, Warner tells WIRED he’s still got some patience left. Even if it’s wearing thin.

This transcript is slightly edited for length and clarity.

WIRED: After Mueller testified, the president and Republicans say case closed. What do you make of that?

Mark Warner: I’m not here to relitigate 2016, or the Mueller testimony, specifically. I would point out, out of the Mueller investigation: 37 indictments. The president’s national security adviser pled guilty. The president’s campaign manager pled guilty. The president’s deputy campaign manager pled guilty. The president’s chief political adviser, Roger Stone, is coming to trial in the fall. The attorney general had to resign. There were literally hundreds of contacts between the Trump campaign and Russian agents.

That’s not normal. And I think the biggest takeaway from the Mueller testimony was that the Russians who attacked us in 2016 are still attacking us and, in Bob Mueller’s words, on a daily basis. You combine that with the warnings from Trump’s own FBI director [Christopher Wray] and Trump’s own director of national intelligence [Dan Coats]. And one of the things that concerns me most is that we’ve not done more to protect the integrity of our election system in 2020.

I was just talking to your [Intelligence Committee] cochair, Senator [Richard] Burr, and he was saying the states in 2018 weathered these attacks, the national infrastructure is good on election security. Basically, case closed, again, not much more is needed.

I think everyone picked up their game in 2018, including the Department of Homeland Security, and our intelligence community was more active as well. But the intelligence community’s own reporting was that Russia didn’t throw its full force of efforts in 2018. Chances are they’ll reserve those for the presidential election. So I think there is some low-hanging fruit that would get 75 votes on the floor of the Senate—if we could get these bills to the floor of the Senate.

I think there ought to be an affirmative obligation that if a foreign government, the Kremlin, offers you campaign help, your obligation ought to be not to say thank you, but to report to the FBI. I think we ought to make sure that every polling station in America has a paper ballot backup, so that if a machine was hacked, you’ve still got ability to protect the integrity of the voting system. And I haven’t met anyone that doesn’t think we need some basic guard rails around the manipulation of Facebook, Twitter, and Google by foreign entities and others. So at least there ought to be the requirement that if somebody advertises on a political basis on Facebook, but in truth it’s a foreign government, they ought to have the same disclosure requirements as somebody who advertises on radio or television.

Isn’t it a little bit ironic that in this highly digital era, we’re going back to paper ballots?

I think we need to make sure that we use the best technology, but if technology, as we see from banks this week, can continue to be hacked into, if voting machines are not as protected as needed, if the private companies who control the voter files could have their information moved around … You don’t need to change votes to cause chaos. I think people’s overall confidence in the system goes up if there is that back check of having a paper ballot backup. Again, this is not saying we wouldn’t still use voting machines, but across the election community everyone believes it’s safer if you have that paper ballot backup that goes along with the voting counting machines.

And now we know we’re getting attacked, cybersecurity is on the top of many minds. And then the president this week announced he’s nominating Representative John Ratcliffe to be DNI, who seems like more of a politician and a Trump supporter than someone from the intel community. Does that worry you?

It worries me greatly. The irony is that Donald Trump’s appointees in the intel world—his director of national intelligence, Dan Coats; his director of the FBI, Chris Wray; his director of the CIA, Gina Haspel—have been pretty good about speaking truth to power, even when Trump did not want to hear the truth. They’ve been very good at not allowing America’s intelligence to get politicized. While I’m going to give Mr. Ratcliffe the courtesy of a meeting, I fear that he is being appointed in the mold of a Bill Barr, the attorney general, who basically is simply a loyalist first to Donald Trump and doesn’t maintain that kind of independence.

If there’s ever been a time when everyone says that Russians and others will be back, when we’ve got as many potential conflict spots around the world, we need to make sure that the head of our national intelligence is not going to politicize the intelligence. That intelligence product goes to our military, it goes to the executive, it goes to us in the Congress. It cannot be a political product. And we’ve got to make sure that the intelligence community is going to be willing to speak truth to power, and that means telling Donald Trump the truth, even if he doesn’t want to hear it. And so far it appears to me that Mr. Ratcliffe doesn’t have much experience and—based upon press reports—that his audition consisted of questioning Mueller and questioning the legitimacy of Russian intervention in our electoral system. That’s pretty chilling.

What do you see as the biggest threats—or are there any new threats—facing America in 2020?

So I think there are a couple of new threats. One, Russia in 2016 was surprised at how vulnerable our systems were, our electoral systems. And how easy Facebook and Twitter and YouTube were to be manipulated. So I think that playbook is now out there, they’ve used the same tactics in the Brexit vote [and] the French presidential elections. So my fear is we may not only see Russia, we can see Iran, we could potentially see China, who has a great deal of control over a number of their Chinese tech companies, start to use these tools because they’re cheap and effective. I like to point out that if you add up all Russia spent in the Brexit vote, the French presidential elections, and the 2016 American elections, it’s less than the cost of one new F-35 airplane. So Russia and our adversaries, I think, have decided the way to engage with us in conflict is not through straight up old-school military but through cyber activities, misinformation and disinformation, increasingly trying to weaken and interfere, for example with our space communications, and I think Russia will up their game … and others … [It] means there will be more adversaries in 2020.

Second is, I think in 2016 we saw Russia try to misrepresent—the Russian agents misrepresent themselves as Americans on Facebook and Twitter by simply posting fake messages. The next iteration, the next generation of that will be the so-called “deepfake” technology, where an American may not be able to trust what his eyes are telling him, because you’ll see an image of you or me or a political figure that may sound like that person but isn’t that person at all.

Now, if McConnell doesn’t allow some of these bills, like the Honest Ads Act or just broader election security bills, to come up, what do you think the Silicon Valley tech firms can do on their own?

Look, we’ve seen progress made by Facebook, Twitter, some progress made by Google. But I don’t think self-regulation will be enough, particularly when real regulation may mean they can’t collect as much information as they like, or may mean they have to go against or limit some of the fake content. It goes against their very business model. So I think Facebook has made progress in particular, but some of the tools they have—for example, the easy access to campaign ads that they promised—that tool is not effective at all.

So at the end of the day, when we’re talking about something as critical as protecting the integrity of our democracy, when Americans lack faith in so many of our institutions to start with, if we don’t go the extra mile and put in place a set of rules and regulations—and god forbid should Russia or Iran or another foreign enterprise massively interfere again—and we didn’t do our duty, then shame on all of us.

This week, two fairly senior Senate Democrats called for impeachment proceedings to begin. Where are you on that? We started this conversation with you saying you don’t want to relitigate 2016, but it seems like there’s this growing chorus amongst Democrats to impeach.

I actually think Speaker [Nancy] Pelosi has navigated that challenge very well. I understand the frustrations with President Trump—his activities and tweets and antics. I think, though, the best way we can show that that’s not who we are as Americans is to defeat him at the ballot box in a free and fair election. And what I worry about is if we don’t guarantee that free and fair election, then we haven’t done our job.
