A University of Maryland professor is undertaking a long-term project to study Russian digital disinformation campaigns in the United States.
Sarah Oates, a professor and senior scholar in the journalism college, began the project this month at the Woodrow Wilson International Center for Scholars in Washington, D.C. The project will run through May 2019.
Oates said her work will consist mainly of studying the motivations behind Russian disinformation and the ways Russian President Vladimir Putin’s narrative “disseminates into the U.S. media.”
“We know there’s a lot of propaganda around, but we really don’t know exactly where it’s coming from and how it’s being spread,” Oates said.
In the wake of speculation on possible Russian interference in the 2016 presidential election, the U.S. needs to be wary of disinformation in the midterm election season, she said.
Oates hopes her work will allow her and other scholars to help policymakers deter disinformation campaigns, create precise measurements on how propaganda spreads in the digital age and understand why certain stories resonate with the public.
“My research is based on the fact that you can’t practice deterrence if you don’t know what’s coming at you,” Oates said. “All I’m saying is, know your enemy.”
Prior to this project, Oates had been working with Joe Barrow, a graduate research assistant, to develop software for detecting language in the media in an effort to understand how propaganda and disinformation spread.
The software allows users to search for a keyword in the news cycle and see how often it shows up during certain months.
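The article does not describe how the software works internally, but the keyword-frequency feature it mentions can be sketched in a few lines. The sample articles, the function name and the corpus format below are all illustrative assumptions, not details of Oates and Barrow's actual tool.

```python
from collections import Counter
from datetime import date

# Hypothetical mini-corpus of (publication date, article text) pairs.
articles = [
    (date(2018, 9, 3), "Reports of election interference resurfaced today."),
    (date(2018, 9, 18), "Analysts debated interference and propaganda online."),
    (date(2018, 10, 5), "Coverage shifted away from the interference story."),
]

def monthly_keyword_counts(articles, keyword):
    """Count how many articles mention `keyword` in each (year, month)."""
    counts = Counter()
    for pub_date, text in articles:
        if keyword.lower() in text.lower():
            counts[(pub_date.year, pub_date.month)] += 1
    return dict(counts)

print(monthly_keyword_counts(articles, "interference"))
# → {(2018, 9): 2, (2018, 10): 1}
```

A real system would pull from a large archive of dated news stories, but the core idea — filter by keyword, bucket by month — is the same.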
“The goal is something you can put in front of journalists and policy analysts that would allow them to be able to track the disseminated disinformation from narratives in the media and ideally come up with some way to counter them,” Barrow said.
Dana Priest, a longtime reporter for The Washington Post who teaches a censorship and disinformation course in the journalism school, said Russia has become incredibly sophisticated at “information warfare.”
“[Putin] sees the U.S. as a major adversary,” she said. “He cannot send armies to fight other countries, so he sends cyber-warriors instead.”
Karlis Dagilis, a doctoral candidate from Latvia and a research assistant to Priest, said digital platforms — particularly Facebook, Twitter and Google — play a major role in the spread of false news stories and “have some responsibility” for the problem.
Some global leaders are now using digital platforms as a manipulation tool in their international relations strategies, Oates said. That isn’t necessarily an issue as long as people are media literate, though, Dagilis said.
“There are many different sources who spread misinformation,” Dagilis said. “You can’t prevent them from doing anything. I think the digital world is without borders.”
Although deterring disinformation is complex, there are ways individuals can be more aware of what they are taking in, Priest said. People can use fact-checkers online, look for red flags when reading information on new accounts or websites and demand that social media companies mark misinformation, she said.
After decades of studying Russian media, Oates said she is excited to take on these questions and have the opportunity to collaborate with policymakers and other scholars.
“I’m like a kid in a candy shop,” she said. “I mean, it’s like nerd dream come true. I am equally delighted and terrified.”