Election polls are more accurate if they ask participants how others will vote
- Centre for Decision Research
<p>Most public opinion polls correctly predicted the winning candidate in the 2020 U.S. presidential election – but on average, they overestimated the margin by which Democrat Joe Biden would beat Republican incumbent Donald Trump. </p>
<p>Our research into polling methods has found that pollsters’ predictions can be more accurate if they look beyond traditional questions. Traditional polls ask people whom they would vote for if the election were today, or for the <a href="https://academic.oup.com/poq/article-abstract/78/S1/233/1836783">percent chance</a> that they might vote for particular candidates.</p>
<p>But our research into <a href="https://priceschool.usc.edu/people/wandi-bruine-de-bruin">people’s expectations</a> and <a href="https://www.santafe.edu/people/profile/mirta-galesic">social judgments</a> led us and our collaborators, <a href="https://www.santafe.edu/people/profile/henrik-olsson">Henrik Olsson</a> at the Santa Fe Institute and <a href="https://economics.mit.edu/faculty/dprelec">Drazen Prelec</a> at MIT, to wonder whether different questions could yield more accurate results.</p>
<p>Specifically, we wanted to know whether asking people about the political preferences of others in their social circles and in their states could help paint a fuller picture of the American electorate. Most people know <a href="http://dx.doi.org/10.1037/rev0000096">quite a bit about the life experiences of their friends and family</a>, including how happy and healthy they are and roughly how much money they make. So we designed poll questions to see whether this knowledge of others extended to politics – and we have found that it does.</p>
<p>Pollsters, we determined, could learn more if they took advantage of this type of knowledge. Asking people how others around them are going to vote and aggregating their responses across a large national sample enables pollsters to tap into what is often called “<a href="https://www.penguinrandomhouse.com/books/175380/the-wisdom-of-crowds-by-james-surowiecki/">the wisdom of crowds</a>.”</p>
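<p>To make the aggregation idea concrete, here is a minimal sketch in Python. The responses and the simple unweighted averaging are purely illustrative; the actual polls apply survey weights and more careful processing.</p>
<pre><code class="language-python">
# Illustrative sketch: turning "social circle" answers into a national estimate.
# Each respondent reports the percentage of their social contacts backing each
# candidate; the crowd estimate is the average of those reports. (Unweighted
# averaging is a simplification; real polls apply survey weights.)

from statistics import mean

# Hypothetical responses (percentages of each respondent's social contacts).
responses = [
    {"Biden": 60, "Trump": 35, "Other": 5},
    {"Biden": 40, "Trump": 55, "Other": 5},
    {"Biden": 52, "Trump": 45, "Other": 3},
]

def crowd_estimate(responses):
    """Average each candidate's reported social-circle share across respondents."""
    candidates = responses[0].keys()
    return {c: mean(r[c] for r in responses) for c in candidates}

estimate = crowd_estimate(responses)
print(estimate)                                # roughly {'Biden': 50.7, 'Trump': 45, 'Other': 4.3}
print(estimate["Biden"] - estimate["Trump"])   # implied popular-vote margin
</code></pre>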
<p><iframe id="Lo86h" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/Lo86h/6/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>What are the new ‘wisdom-of-crowds’ questions?</h2>
<p>Since the 2016 U.S. presidential election season, we have been asking participants in a variety of election polls: “<a href="https://www.nature.com/articles/s41562-018-0302-y">What percentage of your social contacts will vote for each candidate?</a>” </p>
<p>In the 2016 U.S. election, this question predicted that Trump would win, and did so more accurately than questions asking about poll respondents’ own voting intentions. </p>
<p>The question about participants’ social contacts was <a href="https://www.youtube.com/watch?v=v9WSmM8VeQ0">similarly more accurate</a> than the traditional question at predicting the results of the 2017 French presidential election, the 2017 Dutch parliamentary election, the 2018 Swedish parliamentary election and the 2018 U.S. election for House of Representatives.</p>
<p>In some of these polls, we also asked, “What percentage of people in your state will vote for each candidate?” This question also taps into participants’ knowledge of those around them, but in a wider circle. Variations of this question have worked well <a href="https://academic.oup.com/poq/article/78/S1/204/1836551">in previous elections</a>.</p>
<p><iframe id="ZO25h" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/ZO25h/6/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>How well did the new polling questions do?</h2>
<p>In the 2020 U.S. presidential election, our “wisdom-of-crowds” questions were once again better at predicting the outcome of the national popular vote than the traditional questions. In the <a href="https://election.usc.edu">USC Dornsife Daybreak Poll</a> we asked more than 4,000 participants how they expected their social contacts to vote and which candidate they thought would win in their state. They were also asked how they themselves were planning to vote. </p>
<p>The current election results show a <a href="https://cookpolitical.com/2020-national-popular-vote-tracker">Biden lead of 3.7 percentage points</a> in the popular vote. <a href="https://projects.fivethirtyeight.com/polls/president-general/national/">An average of national polls</a> predicted a lead of 8.4 percentage points. In comparison, the question about social contacts <a href="https://osf.io/j54rz">predicted a 3.4-point Biden lead</a>. The state-winner question predicted Biden leading by 1.5 points. By contrast, the traditional question that asked about voters’ own intentions in the same poll predicted a 9.3-point lead. </p>
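<p>A quick back-of-the-envelope comparison of absolute errors, using the figures above, makes the ranking explicit:</p>
<pre><code class="language-python">
# Absolute error of each question's predicted Biden lead versus the
# roughly 3.7-point lead in the current popular-vote count.
actual_lead = 3.7
predicted_leads = {
    "average of national polls": 8.4,
    "social-contacts question": 3.4,
    "state-winner question": 1.5,
    "own-intentions question (same poll)": 9.3,
}
for question, lead in predicted_leads.items():
    print(f"{question}: off by {abs(lead - actual_lead):.1f} points")
</code></pre>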
<h2>Why do the new polling questions work?</h2>
<p>We think there are three reasons that asking poll participants about others in their social circles and their state ends up being more accurate than asking about the participants themselves.</p>
<p>First, asking people about others effectively increases the sample size of the poll. It gives pollsters at least some information about the voting intentions of people whose data might otherwise have been left out entirely, such as people who were never contacted by the pollsters or who declined to participate. Even though poll respondents don’t have perfect information about everyone around them, it turns out they know enough to give useful answers. </p>
<p>Second, we suspect people may find it easier to report how they think others might vote than <a href="https://healthpolicy.usc.edu/evidence-base/could-shy-trump-voters-discomfort-with-disclosing-candidate-choice-skew-telephone-polls-evidence-from-the-usc-election-poll/">to admit how they themselves will vote</a>. Some people may feel embarrassed to admit who their favorite candidate is. Others may fear harassment. And some might lie because they want to obstruct pollsters. Our own findings suggest that Trump voters might have been more likely than Biden voters to hide their voting intentions, for all of those reasons. </p>
<p><iframe id="CpNcN" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/CpNcN/3/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Third, most people are influenced by others around them. People often get information about political issues from friends and family – and those conversations <a href="https://psycnet.apa.org/record/1995-98606-000">may influence their voting choices</a>. Poll questions that ask participants how they will vote do not capture that social influence. But by asking participants how they think others around them will vote, pollsters may get some idea of which participants might still change their minds. </p>
<h2>Other methods we are investigating</h2>
<p>Building on these findings, we are looking at ways to <a href="https://osf.io/zv726">integrate information from these and other questions</a> into algorithms that might make even better predictions of election outcomes. </p>
<p>One algorithm, called the “<a href="https://science.sciencemag.org/content/306/5695/462">Bayesian Truth Serum</a>,” gives more weight to participants who report that their own voting intentions, and those of their social circles, are more prevalent than people in their state expect. Another algorithm, called a “<a href="https://osf.io/gp96y/">full information forecast</a>,” combines participants’ answers across several poll questions to incorporate information from each of them. Both methods generally outperformed the traditional polling question and the predictions from an <a href="https://projects.fivethirtyeight.com/polls/president-general/national/">average of polls</a>.</p>
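<p>For intuition, here is a rough, self-contained sketch of the “surprisingly popular” weighting idea that underlies the Bayesian Truth Serum: answers that turn out to be more common in the sample than respondents predicted get extra weight. The data are made up, and this simplified sketch is not the published algorithm or the full information forecast, both of which are more involved.</p>
<pre><code class="language-python">
# Simplified illustration of "surprisingly popular" weighting: upweight answers
# that are more common in the sample than respondents predicted. This is a toy
# sketch, not the published Bayesian Truth Serum algorithm.

import math
from statistics import mean

# Hypothetical data: each respondent's own intention and their estimate of the
# percentage of people in their state voting for each candidate.
respondents = [
    {"own": "Trump", "state_pred": {"Biden": 55, "Trump": 45}},
    {"own": "Biden", "state_pred": {"Biden": 60, "Trump": 40}},
    {"own": "Trump", "state_pred": {"Biden": 52, "Trump": 48}},
    {"own": "Biden", "state_pred": {"Biden": 58, "Trump": 42}},
    {"own": "Trump", "state_pred": {"Biden": 50, "Trump": 50}},
]
candidates = ["Biden", "Trump"]

# Actual frequency of each answer in the sample (percent).
actual = {c: 100 * sum(r["own"] == c for r in respondents) / len(respondents)
          for c in candidates}

# Average predicted prevalence of each answer.
predicted = {c: mean(r["state_pred"][c] for r in respondents) for c in candidates}

# "Surprise" score: positive when an answer is more common than predicted.
surprise = {c: math.log(actual[c] / predicted[c]) for c in candidates}

# Weight each respondent's answer by how surprisingly popular it is, then
# recompute vote shares from the weighted answers.
weights = [math.exp(surprise[r["own"]]) for r in respondents]
weighted_share = {
    c: 100 * sum(w for r, w in zip(respondents, weights) if r["own"] == c) / sum(weights)
    for c in candidates
}

print(surprise)        # "Trump" is the surprisingly popular answer in this toy sample
print(weighted_share)  # vote shares shifted toward the surprisingly popular answer
</code></pre>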
<p>Our poll did not have enough participants in each state to make good state-level forecasts that could help predict votes in the Electoral College. As it was, our questions about social circles and expected state winners predicted that Trump might narrowly win the Electoral College. That was wrong, but so far it appears that these questions had on average lower error than the traditional questions in predicting the difference between Biden and Trump votes across states.</p>
<p>Even though we still don’t know the final vote counts for the 2020 election, we know enough to see that pollsters could improve their predictions by asking participants how they think others will vote.</p>
<p><span><a href="https://theconversation.com/profiles/mirta-galesic-1176668">Mirta Galesic</a>, Professor of Human Social Dynamics, Santa Fe Institute; External Faculty, Complexity Science Hub Vienna; Associate Researcher, Harding Center for Risk Literacy, <em><a href="https://theconversation.com/institutions/university-of-potsdam-738">University of Potsdam</a></em> and <a href="https://theconversation.com/profiles/wandi-bruine-de-bruin-275600">Wändi Bruine de Bruin</a>, Provost Professor of Public Policy, Psychology and Behavioral Science, USC Price School of Public Policy, <em><a href="https://theconversation.com/institutions/usc-dornsife-college-of-letters-arts-and-sciences-2669">USC Dornsife College of Letters, Arts and Sciences</a></em></span></p>
<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/election-polls-are-more-accurate-if-they-ask-participants-how-others-will-vote-150121">original article</a>.</p>
Contact us
If you would like to get in touch regarding any of these blog entries, or are interested in contributing to the blog, please contact:
Email: research.lubs@leeds.ac.uk
Phone: +44 (0)113 343 8754
The views expressed in this article are those of the author and may not reflect the views of Leeds University Business School or the University of Leeds.