Government professor Brendan Nyhan and other scholars call for increased study of fake news
Government professor Brendan Nyhan has joined 15 other scholars from different disciplines in calling for increased interdisciplinary efforts to study and eventually counter the spread of “fake news.” In an article published on March 9 in the journal Science, the 16 researchers discussed potential interventions that may effectively stem “the flow and influence of fake news.”
“[The article] was basically a call for action,” said Brown University cognitive, linguistic and psychological sciences professor and article co-author Steven Sloman.
Harvard Kennedy School global communications and public policy professor Matthew Baum, one of the article’s authors, added that the article was a response to “concerns about the political environment, especially the problems with information.”
According to the article, social bots — automated accounts impersonating humans — are estimated to comprise nine to 15 percent of active Twitter accounts and as many as 60 million Facebook accounts, and can amplify the spread of fake news online by liking, sharing and searching for information. However, the effects of bots’ prevalence on the dissemination of fake news must be “interpreted cautiously” because of the lack of methods to “derive representative samples of bots and humans on a given platform.”
Interdisciplinary collaboration in researching fake news is essential, Nyhan said, as fake news is a “multifaceted” problem. He added that researchers need greater access to data than they currently have.
Sloman added that sharing information will lead to positive outcomes in countering fake news, noting that Facebook allowing independent scholars to access its user data would help academics learn more about the creation and spread of fake news.
Baum said that efforts to promote information-sharing in researching fake news will include organizing more academic conferences and funding graduate student participation in these conferences.
“We’re trying to generate an identity to this area of research,” he said.
However, the role of social media in the spread of fake news is uncertain, Sloman said, adding that cable TV actually has more influence on people’s perceptions.
Sloman added that he is also unsure of how fake news on social media may have impacted the 2016 election.
Nevertheless, fake news’ use as a vehicle to manipulate democracies is a very serious problem, Nyhan said.
“We need to consider how much power we want to give to the platform that determined what type of information should be shared,” he said.
Sloman said that while fake news may only have a small effect on democracies, he is still worried because “small differences matter.” Changes in the perspectives of a small percent of Americans can completely change the result of an election, he said.
“Democracies rely on people pursuing their true interests, which relies on knowing the truth,” Sloman said. “If we don’t have access to the truth, all that will collapse.”
Regarding the future of fake news, the researchers are both pessimistic and hopeful.
Baum said he is worried about fake news’ capacity to be used as a marketing tool or a political weapon.
“Until we develop very robust countermeasures, the problem will get worse,” he said.
However, fake news may become so prevalent that people will eventually just start ignoring it, Sloman said. As people become more aware of the dangers of fake news, social media will also become much more responsive and responsible in countering fake news, he said.
“We’re so aware of the problem that it’s got to be less in the next election,” Sloman said.
Nyhan noted that fake news’ “increasingly international” nature is worrisome. He added, however, that the 2018 midterm and 2020 presidential elections will provide a better understanding of how fake news will progress in the long term.