The Dartmouth

Democracy and Conspiracy: Q&A with government professor Brendan Nyhan

Government professor Brendan Nyhan's research focuses on misinformation.

The distinction between fact and fiction should be obvious — however, in this age of “fake news” and conspiracy theories, the line separating the two can become blurred. The Mirror sat down with government professor Brendan Nyhan, an expert on political misconceptions and conspiracy theories, to discuss his take on how misleading political information spreads.

Could you give an overview of your class, Government 30, “Political Misinformation and Conspiracy Theories”?

BN: It’s a mid-level course that I’ve been teaching since I came here. It’s one that’s become increasingly topical as the world has changed — there’s so much we can talk about in terms of how the course material relates to what’s going on in the world.

For how long, exactly, have you taught the course?

BN: I’ve been teaching it since I got here in the 2011-2012 academic year.

What inspired you to create a curriculum specifically focused on misinformation and conspiracy theories?

BN: It’s my research specialty. When I got here, my colleagues encouraged me to teach the course. I was worried that the topic might be too specialized, but they encouraged me to teach what I was most knowledgeable and most passionate about. I think it’s worked out very well. The topic is very specific, but that means we can go deep into the material and the students can really reach the cutting edge of what we know in terms of research, which isn’t always possible in a single quarter. This research literature has also grown dramatically even as I’m teaching the course. So the students are not just reading the most important work; they’re often reading the newest work that has just come out. They’re participating in the field as it’s evolving.

While reading the course syllabus, I noticed that you emphasized how many beliefs don’t qualify as misconceptions or conspiracy theories. What are the definitions used in the course?

BN: I encourage students to make up their own minds about where we should draw those lines, but the conceptual definition that we tend to use is one that I’ve proposed in my research. We’re defining misperceptions as beliefs that are either false or unsupported by the best available evidence. Conspiracy theories are a little trickier because the very nature of a conspiracy theory is that it’s not directly falsifiable. It’s a claim about an unobserved or secret action taken by some powerful elite. Similarly, we try to focus on those conspiracy theories for which there is no credible evidence. We try to be careful to distinguish misperceptions from cases where there really is substantial disagreement about the underlying facts among experts. There are many cases where experts disagree or we simply don’t know the truth with a high degree of confidence, and those are really different from the kinds of cases we talk about, which are ones in which claims can be directly falsified or are contradicted by a strong expert consensus, as with climate change.

Of course, it’s impossible to condense an entire term’s worth of material into a single interview. But if there were just a few key points from your course that you’d hope all students know, what would those be?

BN: What I try to do throughout the term is challenge students to think about why giving people factual information isn’t always the best response to misperceptions and to help them see why people could come to believe in misperceptions. We are all vulnerable, as human beings, to these mistaken beliefs. It’s not something that other people do or that dumb people do; it’s something that human beings do. I think we try to understand the psychology of why people might hold these beliefs, including ourselves, and also talk about what the best response to them might be, which the research suggests is often different from the approach that people tend to take and practice.

Could you elaborate a bit more on what solutions the research does suggest?

BN: There are no easy answers, as the students learn. The typical response to misinformation is to give people more facts or studies. So, people don’t believe in climate change — well, look at all these studies showing that climate change is real. That’s the response you’ll often see in practice. But we’ve seen again and again that just giving people more facts and evidence is often not enough, and in some cases can even be counterproductive. That’s not always the case, of course; we also talk about the circumstances under which people are more open-minded. Information can contradict beliefs people hold, values they care about or groups they are a member of. But it certainly seems clear that, for the most politicized controversial issues, giving people facts is often an ineffective strategy. We’ve seen myths persist for years and decades when the evidence against them is overwhelming. Barack Obama was born in this country, but a non-trivial minority of Americans believe he wasn’t. Climate change is real, but a substantial percentage of people think it’s not true. Giving people more and more evidence on these points is not necessarily going to be the most effective strategy. It misunderstands how people come to form these beliefs or why they are unwilling to change their minds. It’s often not a question of people lacking access to accurate information, so we often talk about what approaches might be more effective.

Also while reading the syllabus, I saw that many of the assigned readings are very recent, and several were published this year, while others discuss controversies from years ago. Could you comment on how you believe these issues have changed, or stayed the same, over the years?

BN: Misperceptions aren’t new, but they’ve become more prominent in our national political debate. I think a couple of factors are important in understanding why that’s changed. The first one is that we’ve become more polarized, so partisan misconceptions have become more common. People have very negative views of the other party, and misconceptions often concern accusations against the opposition party that people would like to believe are true. Also, the elites who often promote misconceptions are themselves more polarized, so they may be more inclined to promote misinformation. For both reasons, political conditions are especially favorable to partisan misperceptions and conspiracy theories right now. That’s made these issues very prominent.

The other factor that has changed is that social science has started to take misperceptions more seriously. When we studied what people know about politics in the past, we often emphasized civics knowledge, the kind of stuff you would have learned in high school — naming the branches of government, and other facts and trivia that aren’t especially relevant to how people experience politics. We’ve shown for decades that people don’t know very much about politics in the sense of being able to answer these pop quiz-style questions. I’m not sure how important that kind of knowledge is, though. I think we failed to consider that people could not only be uninformed, but also misinformed. That distinction was lost. When you were simply asking someone who the chief justice of the Supreme Court was, it was rarely consequential whether they thought they knew the right answer but were actually wrong. Most people were uninformed: They didn’t know the right answer and they knew they didn’t know the right answer. For many other issues, by contrast, people are misinformed. They think they know the correct answer and they don’t. But until maybe the last 15 years or so, we didn’t study misconceptions especially carefully. We kept documenting what people didn’t know about politics in terms of facts and failed to focus on the politically consequential misperceptions they might have held. That’s the other thing that’s changed, and that’s why the syllabus is full of new research. There simply wasn’t that much scholarship out there when I started studying this area.

Of course, in today’s very politically polarized climate, we hear phrases such as “fake news” being used all the time. Given how timely this is to your research and the course, what do you personally think are some of the worst popular misconceptions today?

BN: There’s a long list, and it changes by the day. I’d say the myths that have been prominent recently that I think are potentially most damaging are Donald Trump’s false claim of millions of illegally cast votes in the election and the various conspiracy theories that are circulating which claim a vast Trump-Russia conspiracy beyond what the evidence can support. Both of those are very politically consequential. The voter fraud myth, which is baseless, directly challenges the integrity of our electoral system. Imagine if Trump had lost, and he told his supporters that the election had been stolen. In contemporary American politics, we’ve never had a presidential candidate challenge the integrity of the vote. There’s no evidence to suggest that millions of illegal votes were cast. That’s a really big deal.

The Trump-Russia conspiracy theories are important because we’re seeing Democrats move into a style of conspiracy thinking that we previously observed more commonly among Republicans during the Obama years. At that time, Democrats were quite critical of conspiracy theories about Benghazi and the birther myth and other issues like that. Now, some of them are starting to fall victim to the same kinds of conspiracy theories. It doesn’t mean that there aren’t legitimate questions to ask about Trump and Russia that could be investigated, but we’re seeing a classic form of conspiracy theorizing arising. If you go on Twitter and look around, you will see all sorts of people who think that they’re connecting the dots in ways that no credible investigator has been able to support.

Something you’ve brought up is that many of these misperceptions are baseless, but I know that the people who believe in them are trying to provide evidence that supposedly supports them. Could you comment on how you think that process is taking place?

BN: One thing we study in my course is what’s called directionally motivated reasoning. The idea is, when we’re processing information, we have an accuracy motive: a desire to hold a correct view about the world. But we also have a directional preference about the right answer, especially when it comes to controversial political issues. We don’t always have a directional preference. There are plenty of issues where we just want to get the right answer — we don’t care who’s right. But when it comes to politics, we do care who’s right. It may feel like we’re dispassionately considering the evidence, but our directional motives are influencing the information we find to be credible and convincing. To use a line from a journalist I know, “We think we’re being scientists, but we’re actually being lawyers.” We’re arguing a case, not dispassionately considering the evidence, at least when it comes to politics. You’ll see huge swings in people’s perceptions of the world depending on who’s in power. When Trump took office, for instance, Republican views of the economy dramatically improved. Similarly, Democrats were far more likely to say that George Bush could have reduced gas prices if he really wanted to than they were to say the same about Barack Obama. The powers of the presidency didn’t change between 2008 and 2009, but we saw partisans’ beliefs about the presidency’s power to affect gas prices change. Those shifts reflect the motivations people have as partisans.

Is there anything else you’d like to add?

BN: One of the great things about Dartmouth is that we get to work closely with smart undergraduates like you. I also teach a seminar called [Government 83.21] “Experiments in Politics” in which the students get to work with me on research about misinformation and misperceptions. In addition to the mid-level course that we’ve discussed, the students in the seminar and I design and execute experimental studies of information together with funding from the Office of Undergraduate Research. Right now, we’re running two pilot studies online that test different approaches to countering fake news and misinformation in social media feeds, which I think could help inform the debate over how companies such as Google and Facebook should respond to the false information that’s circulating on their platforms. That’s a great example of what’s possible to do at Dartmouth and a great way for students in my course to continue studying these issues. It allows them to go from being consumers of research to producers. Students from the 2014 seminar and I published an article based on our research, and the students from last year and I are about to submit an article based on that research as well. We’re hoping to keep doing that and to contribute further to knowledge. Hopefully in the future, I’ll be teaching the work that my students and I did as well as producing new work.

This interview has been edited and condensed for clarity and length.

