Experts: Russian influence efforts constitute “information warfare”

FAN Editor

Experts agreed on Wednesday, during the Senate Intelligence Committee’s latest hearing on foreign actors’ attempts to interfere in the U.S. election process, that the influence efforts carried out by Russia and other adversaries constitute “information warfare,” and that Americans must be aware those efforts reach far beyond the U.S. election system.

Panelists from organizations including RAND, New Knowledge, Graphika, the Oxford Internet Institute and the Alliance for Securing Democracy warned lawmakers that Russia’s attacks on the democratic process extend far beyond a single election, pointing to disinformation campaigns that seek to weaken Western institutions and target industries around the world.

The hearing on Wednesday followed an announcement from social media giant Facebook that it had detected a coordinated influence operation similar to the ones carried out during the 2016 presidential election. The activity was identified some two weeks ago during the company’s ongoing investigation into election meddling; Facebook has since removed eight pages and 17 profiles from its platform, as well as seven accounts from Instagram.

The targeting of deeply divisive political issues, a tactic Russian entities deployed in 2016, is one of the most concrete indications that foreign actors are primed to target this year’s midterm elections in November.

See highlights from Wednesday’s hearing below:


Warner: Congress is “just scratching the surface” on Russian meddling

In his opening remarks, committee vice chairman Mark Warner, D-Virginia, acknowledged Facebook’s revelation that foreign actors are still actively trying to influence the U.S. election process. He warned that the U.S. is “not well positioned to detect, track, or counter these types of influence operations on social media.”

He added of the Russian-backed Internet Research Agency’s (IRA) methods: “They are effective. And they are cheap. For just pennies on the dollar, they can wreak havoc in our society and in our elections. And I’m concerned that even after 18 months of study, we are still only scratching the surface when it comes to Russia’s information warfare.”

International effort to counter Russian interference

RAND’s Dr. Todd Helmus said that, based on studies focusing on Russian-speaking nations, it is incumbent upon the U.S., NATO and the European Union to do a “better job of telling their story” in order to dissuade Russian speakers from participating in efforts to sow discord in democratic processes. He said they must offer a “compelling argument” for Russian-speaking populations to align with the West.

Helmus also recommended improving access to local and original content, rather than state-sponsored and inaccurate information, in Russian-speaking regions of the world, in an effort to foster greater understanding and education.

Future attacks 

Renee DiResta of New Knowledge warned lawmakers that future attacks on the U.S. election process will include campaigns carried out by “witting and unwitting actors,” as well as disinformation delivered through video and audio produced by artificial intelligence (AI).

John Kelly of Graphika said Russia’s efforts “did not stop in 2016”; just one day after the election, the Russian government “stepped on the gas.”

“The assault on the democratic process is much bigger than an attack on a single election,” Kelly warned. He said Russians are targeting both sides of the political spectrum simultaneously, exploiting an “already divided political landscape.” Kelly said Graphika’s findings show that automated accounts created by foreign actors produce as many as 25 to 30 times the number of political messages that a genuine political account does.

As for long-term goals, Kelly said Russia aims to “weaken western institutions and traditional sources of information and authorities.” In the short term, he said, Russia will continue to inject hacked information to sway particular events or elections.

And as the president uses negative rhetoric against the news media, Kelly warned that the American media is also being targeted, and that journalists, like social media users, have a “responsibility to harden themselves against manipulations.”

While social media sites have taken first steps to stop disinformation, DiResta noted that the IRA has the potential to bring dormant accounts back online, warning “they’re not going away.”

Asked what other countries pose the greatest threat in disinformation warfare, Professor Philip Howard of the Oxford Internet Institute said that while Russia has been the most innovative in developing techniques, “it’s safe to say dictators learn from each other as they see successful campaigns run in particular countries they emulate.”

He said certain authoritarian regimes are even actively directing military units to focus on social media campaigning. Howard said that China, a likely contender in sowing discord and disinformation, has the “next best capacity” after Russia.

Experts also explained that the overall cyber misinformation campaign extends far beyond elections and increasingly targets industries worldwide. DiResta cited evidence of campaigns targeting the agriculture and energy sectors with anti-fracking messages in areas with strong oil interests, as well as stoking fear about GMOs in farming regions of the country.

Next steps

DiResta suggested that addressing the “scale and sophistication” of Russia’s disinformation campaign is an aspect the government hasn’t yet looked at and would be a “good place to start.”

Howard suggested that the news media of democratic allies around the world should be of the greatest concern, as Russia has moved from targeting the U.S. media to targeting its global partners. He said news outlets in America are “already on the defense” and “have ways to ensure quality of news product isn’t shaped by disinformation campaigns.”

When asked to provide real-world implications of social media influence operations, including just how many Americans might have been impacted by disinformation messaging, Laura Rosenberger of the Alliance for Securing Democracy at The German Marshall Fund of the United States said that the issue lies in the “entire information ecosystem.”

“This is why it’s really difficult to quantify in any meaningful way the reach of these activities,” she explained. Rosenberger added that there were “clear steps we can take on defensive side and deterrent side we need to be taking urgently,” citing Senate proposals to defend election systems.

While lawmakers continue to grapple with how best to approach the issue of interference from a legislative standpoint, Rosenberger also called on Congress to follow through on communicating clear consequences for Russia’s attempts at infiltrating the U.S. election process. She said lawmakers must articulate that such behaviors “won’t be tolerated” and that a consequence for bad actors must be automatic.

“[Our] credibility has to be very clear, Vladimir Putin can not see from one place there is a potential for consequences but then getting a very mixed message,” she said. 

Helmus pointed to people in Eastern Europe as a model for Americans seeking to protect themselves from misinformation, saying they are well aware of Russia’s intentions and “know what’s going on.”

“The way you apply that to the U.S. is you need to know sources of information, be able to adjudicate and assess the truthfulness and potential biases of that information and try to make your own decisions as a careful consumer,” said Helmus. 

Generational gaps

When Sen. Angus King of Maine, who suggested more programs in America’s schools, pressed her on the need for online literacy as a means of preventing future attacks, Rosenberger argued that such education can’t be limited to the nation’s schools, saying that older populations who did not grow up with technologies like social media “may be more vulnerable” to the disinformation campaigns playing out.

CBS News’ Emily Tillett contributed to this report. 
