
Americans’ Knowledge Deficit, and Confidence Surplus, about Politics

November 7, 2022

With the midterm elections here, many Americans have had to choose which candidates they will support. This decision-making process is fraught with difficulties, especially for inexperienced voters.

Voters must navigate angry, emotion-laden conversations about politics when trying to sort out whom to vote for. Americans are more likely than ever to view politics in moral terms, meaning their political conversations sometimes feel like epic battles between good and evil.

This article by Ian Anson originally appeared on The Conversation, a Social Science Space partner site, under the title “Americans think they know a lot about politics – and it’s bad for democracy that they’re so often wrong in their confidence.”

But political conversations are also shaped by, obviously, what Americans know – and, less obviously, what they think they know – about politics.

In recent research, I studied how Americans’ perceptions of their own political knowledge shape their political attitudes. My results show that many Americans think they know much more about politics than they really do.

Knowledge deficit, confidence surplus

Over the past five years, I have studied the phenomenon of what I call “political overconfidence.” My work, in tandem with other researchers’ studies, reveals the ways it thwarts democratic politics.

Political overconfidence can make people more defensive of factually wrong beliefs about politics. It also causes Americans to underestimate the political skill of their peers. And those who believe themselves to be political experts often dismiss the guidance of real experts.

Political overconfidence also interacts with political partisanship, making partisans less willing to listen to peers across the aisle.

The result is a breakdown in the ability to learn from one another about political issues and events.

A ‘reality check’ experiment

In my most recent study on the subject, I tried to find out what would happen when politically overconfident people found out they were mistaken about political facts.

To do this, I recruited a sample of Americans to participate in a survey experiment via the Lucid recruitment platform. In the experiment, some respondents were shown a series of statements that taught them to avoid common political falsehoods. For instance, one statement explained that while many people believe that Social Security will soon run out of money, the reality is less dire than it seems.

My hypothesis was that most people would learn from the statements, and become more wary of repeating common political falsehoods. However, as I have found in my previous studies, a problem quickly emerged.

The problem

First, I asked respondents a series of basic questions about American politics. This quiz included topics like which party controls the House of Representatives – the Democrats – and who the current Secretary of Energy is – Jennifer Granholm. Then, I asked them how well they thought they did on the quiz.

Many respondents who believed they were top performers were actually among those who scored the worst. Echoing the results of a famous study by David Dunning and Justin Kruger, the poorest performers generally did not realize that they lagged behind their peers.

Of the 1,209 people who participated, around 70 percent were overconfident about their knowledge of politics. But this basic pattern was not the most worrying part of the results.

The overconfident respondents failed to change their attitudes in response to my warnings about political falsehoods. My investigation showed that they did read the statements, and could report details about what they said. But their attitudes toward falsehoods remained inflexible, likely because they – wrongly – considered themselves political experts.

But if I could make overconfident respondents more humble, would they actually take my warnings about political falsehoods to heart?

Poor self-assessment

My experiment sought to examine what happens when overconfident people are told their political knowledge is lacking. To do this, I randomly assigned respondents to receive one of three experimental treatments after taking the political knowledge quiz. These were as follows:

  1. Respondents received statements teaching them to avoid political falsehoods.
  2. Respondents did not receive the statements.
  3. Respondents received both the statements and a “reality check” treatment. The reality check showed how respondents fared on the political quiz they took at the beginning of the survey. Along with their raw score, the report showed how respondents ranked among 1,000 of their peers.

For example, respondents who thought they had aced the quiz might have learned that they got one out of five questions right, and that they scored worse than 82 percent of their peers. For many overconfident respondents, this “reality check” treatment brought them down to earth. They reported much less overconfidence on average when I followed up with them.
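The feedback described above amounts to simple arithmetic: count a respondent's correct answers, then compare that raw score against a reference distribution of peer scores. A minimal sketch of that calculation (this is illustrative only, not the study's actual code; the quiz items, peer distribution, and function name are all hypothetical):

```python
# Illustrative sketch of a "reality check" report: a raw quiz score
# plus the share of peers the respondent outscored. Not the study's
# actual code; all names and data below are hypothetical.

def reality_check(respondent_answers, answer_key, peer_scores):
    """Return (raw_score, percent_of_peers_outscored)."""
    # Raw score: number of quiz questions answered correctly.
    raw = sum(1 for given, correct in zip(respondent_answers, answer_key)
              if given == correct)
    # Percentile-style rank: share of peers scored strictly below.
    outscored = sum(1 for s in peer_scores if s < raw)
    pct = round(100 * outscored / len(peer_scores))
    return raw, pct

# Hypothetical five-question quiz and a made-up sample of 1,000 peers.
key = ["Democrats", "Granholm", "B", "C", "A"]
answers = ["Democrats", "Perry", "A", "C", "D"]   # 2 of 5 correct
peers = [0]*50 + [1]*200 + [2]*350 + [3]*250 + [4]*120 + [5]*30
raw, pct = reality_check(answers, key, peers)
print(f"You scored {raw}/5, better than {pct}% of your peers.")
```

Framing the score as a rank among peers, rather than a bare number, is what gives the treatment its "reality check" character: it directly contradicts the respondent's belief that they outperformed others.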

Finally, I asked all the respondents in the study to report their levels of skepticism toward five statements. These statements are all common political falsehoods. One statement, for example, asserted that violent crime had risen over the prior decade – it hadn’t. Another claimed the U.S. spent 18 percent of the federal budget on foreign aid – the real number was less than 1 percent.

I expected most respondents who had received my cautionary statements to become more skeptical of these misinformed statements. On average, they did. But did overconfident respondents learn this lesson too?

Reality check: Mission accomplished

The results of the study showed that overconfident respondents began to take political falsehoods seriously only if they had experienced my “reality check” treatment first.

While overconfident respondents in the other conditions showed no reaction, the humbling experience of the “reality check” led overconfident participants in that condition to revise their beliefs: having seen how wrong they had been, they increased their skepticism of political falsehoods by a statistically significant margin.

Overall, this “reality check” experiment was a success. But it reveals that outside of the experiment, political overconfidence stands in the way of many Americans’ ability to accurately perceive political reality.


The problem of political overconfidence

What, if anything, can be done about the widespread phenomenon of political overconfidence?

While my research cannot determine whether political overconfidence is increasing over time, it makes intuitive sense that this problem would be growing in importance in an era of online political discourse. In the online realm, it is often difficult to appraise the credibility of anonymous users. This means that false claims are easily spread by uninformed people who merely sound confident.

To combat this problem, social media companies and opinion leaders could seek ways to promote discourse that emphasizes humility and self-correction. Because confident, mistaken self-expression can easily drown out more credible voices in the online realm, social media apps could consider promoting humility by reminding posters to reconsider the “stance,” or assertiveness, of their posts.

While this may seem far-fetched, recent developments show that small nudges can lead to powerful shifts in social media users’ online behavior.

For example, Twitter’s recent inclusion of a pop-up message that asks would-be posters of news articles to “read before tweeting” caused users to rethink their willingness to share potentially misleading content.

A gentle reminder to avoid posting bold claims without evidence is just one possible way that social media companies could encourage good online behavior. With another election season soon upon us, such a corrective is urgently needed.

Ian Anson is an associate professor in the Department of Political Science at the University of Maryland, Baltimore County.
