Trolls will use fake videos to try and sway the 2020 election, warns Alphabet researcher

FAN Editor

Yasmin Green, director of research and development for Google's Jigsaw. (Photo: Brian Snyder | Reuters)

There’s a race to “inoculate” consumers against false and inflammatory content ahead of the 2020 elections, according to the head of R&D for an Alphabet subsidiary that monitors online disinformation. But so-called “deepfake” videos, in which footage is altered to show speakers saying inflammatory things, along with real videos that propagandists use out of context, are going to be particularly hard to stop.

Yasmin Green is the director of research and development at Jigsaw, an Alphabet subsidiary created to monitor abuse, harassment and disinformation online. She was speaking on a panel of experts in disinformation at the Aspen Institute Cyber Summit in New York on Wednesday.

Election influence is likely to be pushed through different channels, on different websites and using different techniques than in 2016, Green said. Social media companies and researchers like those at Jigsaw are working both to pinpoint these new or expanded techniques and to find “interventions” that protect free speech while alerting consumers to the authenticity of what they’re consuming.

“I’m not as worried about faked accounts at this time,” Green said, referring to the fake but popular social media accounts on Twitter and Facebook, started sometimes years in advance of the 2016 election, that were used to sow discord among voters. Social media companies are doing a better job of removing those accounts, and would-be trolls now have to “start from scratch.”

“I do commend Facebook, and I see them doing a lot,” she said.

Instead, consumers should expect trolls to use a far wider variety of platforms in the upcoming elections, especially companies that don’t have a strong advertising business like the social media giants do.

‘Inoculating’ users

Jigsaw and other researchers have been trying out different methods of warning consumers about altered, fake or false content before they view it, she said.

Results of these interventions have been mixed.

In one study, researchers showed a group of participants a “deepfake” video featuring a comedy routine by actor Walter Matthau, which had been altered to feature the face of former President Richard Nixon. The researchers told all viewers that the video was fake.

Even after being told, only around one-third of the participants correctly identified it as fake.

Further, 17% of participants answered “yes” to the question “Were you familiar with Richard Nixon’s background in comedy?” The former president does not have a background in comedy.

Green described another recent research project conducted by Jigsaw, in which a group of people was told how disinformation campaigns and propaganda work, then shown a propaganda video. For a second group, researchers showed participants the propaganda video first, and then later described how disinformation and propaganda work.

The group that learned about disinformation first was far less likely to believe the video, she said, suggesting that it’s possible to inoculate users against fakes.

Real videos, false pretenses

Green also cited the story of Brooklyn civil rights activist Omowale Adewale as a case to consider in advance of the elections.

In that incident, Adewale, a martial arts instructor, was approached by a group that said it was involved in charitable initiatives to support African-American civil rights issues and provide free self-defense courses to minorities in the community. The organization paid Adewale for providing the free trainings and supplied “swag,” including logo T-shirts that he could wear in videos, Green said.

“For him, it felt very much in line with his social justice and activism,” she said.

But promised meetings to organize the group’s goals never materialized. Eventually, Adewale learned the group was a Russian front organization that had been using the real videos out of context to create propaganda.

Adewale’s case has Green particularly concerned about “Americans either knowingly or unknowingly [creating] real videos that are out of context and used to manipulate people at the other end.”

Another panelist, reporter Nina Jankowicz, said she worried that the now-widespread knowledge of how foreign influence campaigns work will crop up domestically in the next election.

Jankowicz pointed to an “astro-turfing” technique using fake online profiles deployed by a Massachusetts Senate candidate, on which she reported for BuzzFeed. This reflects a concerning trend, she said, of even American candidates and groups adopting tactics similar to those used by foreign influence campaigners in 2016.

“The Russian playbook has been split wide open not only for other foreign actors but also domestic actors,” Jankowicz said.

