Defending against adversarial influence operations

Founded in 2021, the Center on Narrative, Disinformation and Strategic Influence works to identify and mitigate the impacts of strategic influence campaigns from foreign adversaries intent on undermining the interests and security of the U.S. and democratic allies.

The center takes a uniquely interdisciplinary approach to this globe-spanning challenge, pairing mathematics and engineering with communications, psychology and journalism.

Influence campaigns are a major component of global competition today, used by competitor nations like China or Russia to sow confusion or mold public opinion in a strategically important region, like the Indo-Pacific or Eastern Europe. The goals can vary — ranging from diminishing confidence in the U.S. as an ally to preemptive justification for military actions. Geopolitical influence operations are often inexpensive and low-risk, but can cause real damage to a nation’s ability to respond to strategic threats.

With the broad deployment of generative AI tools that could supercharge influence campaigns, the center has evolved to focus more heavily on understanding our rapidly evolving media and information landscape, developing tools to identify AI-generated content, and combining mathematics with the study of narrative structure.

“One question that I am very interested in is, how does a nation or a society heal from a successful malicious influence campaign?” said Joshua Garland, interim director of the center. “Adversarial influence campaigns are almost always premised on an ‘us versus them’ narrative meant to increase polarization. How do you start to reduce that polarization? I think that’s a really difficult challenge that I want to be working on.”

Studying Russian propaganda to detect a planned military invasion

The Office of Naval Research, a division of the U.S. Department of the Navy, awarded ASU a grant in 2018 to examine thousands of mass media and social media postings in the Baltic states — Lithuania, Latvia and Estonia — to help detect whether Russia is planning a military invasion there.

Researchers from the Center for Strategic Communication at the Hugh Downs School of Human Communication, the School of Computing and Augmented Intelligence, and GSI collaborated on the work, a portion of which was subcontracted to the aerospace and defense company Lockheed Martin.

“About four months before the 2014 annexation of Crimea, we saw huge changes in Russian anti-Ukrainian propaganda,” said Steve Corman, director of the Center for Strategic Communication, who leads the study. “The Russians were clearly trying to rile up the Russian-speaking minorities to sow support for their cause. Obviously, there hasn’t been an invasion in the Baltics yet, but we will be trying to figure out if there are correlations between propaganda framing and conflict events in the Baltics.”

Understanding covert influence through propaganda

As information proliferates across new media technologies faster than ever before, nation-state actors can manipulate it — including through propaganda — to undermine a nation’s sovereignty and internal political stability. Over the last decade, China has made increasing use of such information operations as part of its “Three Warfares” strategy, which combines psychological, public opinion and legal warfare.

Scott Ruston, founding director of the Center on Narrative, Disinformation and Strategic Influence, received a Department of Defense Minerva Research Initiative award to study online influence targeting Indonesia and the Philippines that uses narratives to manipulate public opinion and create political action favoring China.

The research helps to fill the critical capability and knowledge gap the United States faces with regard to China’s engagement in “informationized” warfare, and will more broadly establish a model for effective analysis of strategic influence in the Middle East, Europe and other regions.