Ahead of election, researchers use virus model to map the spread of fake news

Nov. 11, 2019, 11:41 p.m.

As the 2020 U.S. presidential election approaches, two Stanford cyber-risk researchers are mapping the growth of fake news and targeted disinformation using models typically employed to track infectious diseases, essentially treating the spread of fake news like a deadly strain of Ebola.

For the past three years, management science and engineering professor Elisabeth Paté-Cornell and Ph.D. candidate Travis Trammell have been working on the project, spurred by Russian interference in the 2016 presidential race, in which fake social media accounts, often on Facebook and Twitter, spread false information. Three years on, such tactics are recognized as a serious threat to national security and are expected to resurface in the next election.

“I’m very concerned about a deliberate disinformation attack in a situation where there is a conflict between the two nations,” Trammell said. “But I guess what I’m more concerned about is death by a thousand cuts –– that this contributes to the erosion and the undercutting of the institutions of democracy, but it does so slowly and over time, so that we don’t recognize the gravity of what’s going on until perhaps it’s too late.” 

Researchers are not the only ones who recognize the threat. According to a Pew Research Center survey, 68% of Americans say fake news greatly impacts confidence in government institutions, and 79% believe steps should be taken to combat the barrage of misinformation.

Top U.S. officials have echoed this concern. The Senate Intelligence Committee released a report asserting the importance of the issue, with Sen. Richard Burr (R-NC), the committee chairman, stating that “the Russian intelligence service is determined, clever, and I recommend every campaign and every elected official take this seriously.”

Paté-Cornell and Trammell’s research maps who has been exposed to fake news and how many of those people are likely to believe the story and spread it further: in other words, how many have been “infected.”
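The mechanics resemble a classic SIR (susceptible-infected-recovered) compartmental model from epidemiology. The researchers’ actual model and parameters are not described in this article, so the following Python sketch is purely illustrative: susceptible users can be taken in by a story, infected users believe and share it, and recovered users stop spreading it. All population sizes and rates below are made-up values for demonstration.

```python
# Minimal sketch of an SIR-style model applied to fake news.
# Assumptions (not from the study): population size, rates, and the
# mapping "infected = believes and shares the story" are illustrative.

def simulate(population=100_000, initial_infected=10,
             transmission_rate=0.3, recovery_rate=0.1, steps=60):
    """Discrete-time SIR dynamics: S -> I on contact, I -> R over time."""
    s = population - initial_infected   # susceptible: could be fooled
    i = initial_infected                # infected: believing and sharing
    r = 0                               # recovered: no longer spreading
    history = []
    for _ in range(steps):
        new_infections = transmission_rate * s * i / population
        new_recoveries = recovery_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

if __name__ == "__main__":
    for day, (s, i, r) in enumerate(simulate()):
        if day % 10 == 0:
            print(f"day {day:2d}: susceptible={s:9.0f} "
                  f"infected={i:9.0f} recovered={r:9.0f}")
```

Even a toy model like this illustrates the core dynamic: when each believer persuades more than one new person on average, a story shared by a handful of accounts can saturate a network within days.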

They found that people with high levels of confirmation bias, such as those who hold niche or extreme political views on individual issues, are more susceptible to fake news. The modeling also indicated that older people are more likely to believe fake news, although there is not yet a definitive answer as to why.

Trammell hopes the research will shed light on methods for slowing the spread of disinformation. In keeping with the infectious disease framework, their findings suggest two broad countermeasures: “inoculating” people against fake news, or “quarantining” those who spread it.

“The [research] community talks about inoculation in the sense that if you warn people that fake news is in the area, the studies suggest that they are generally better at picking up fake news,” Trammell said. “But there is also a downside — it appears that they are also less trusting of legitimate news sources.”
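In compartmental-model terms, the two interventions map onto familiar epidemiological levers. As a hypothetical extension of the sketch above, with rates that are illustrative rather than drawn from the study, inoculation moves susceptible users out of circulation before they encounter a story, while quarantine removes active spreaders faster:

```python
# Hypothetical extension: "inoculation" (warning users in advance) and
# "quarantine" (flagging or removing spreaders). Rates are illustrative,
# not taken from the Stanford study.

def simulate_with_interventions(population=100_000, initial_infected=10,
                                transmission_rate=0.3, recovery_rate=0.1,
                                inoculation_rate=0.02, quarantine_rate=0.05,
                                steps=60):
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    for _ in range(steps):
        new_infections = transmission_rate * s * i / population
        inoculated = inoculation_rate * s    # warned users stop being susceptible
        quarantined = quarantine_rate * i    # flagged spreaders stop spreading
        recovered = recovery_rate * i
        s -= new_infections + inoculated
        i += new_infections - recovered - quarantined
        r += recovered + quarantined + inoculated
    return s, i, r
```

The downside Trammell describes, that warned users also grow warier of legitimate news, is a cost a toy model like this does not capture; a fuller treatment would have to weigh it against the reduced spread.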

Still, much like a virus, fake news, bots and targeted disinformation have evolved to circumvent societal precautions and barriers, including through AI-backed “smart trolls” and deepfakes. The Stanford Internet Observatory recently revealed that Russia had been testing new disinformation techniques in Africa via a surge of fake Facebook profiles.

Trammell believes that curbing disinformation will require an interdisciplinary approach, with researchers, psychologists, lawyers, government officials and private companies working together.

“I just don’t see this problem going away anytime soon,” Trammell said. 

Contact Meghan Sullivan at meghans8 ‘at’ stanford.edu.


