November 6, 2022
Posted by Jay Livingston
In the last few elections, the pre-election polls have gotten it really wrong. Partly that’s because cell phones and low response rates have made sampling difficult. But it also might be that pollsters are not asking the right question. Maybe the usual question — “Who are you going to vote for?” — is not the best way to predict election results.
The most recent episode of NPR’s Planet Money explored this question and in the end tried a less direct approach that some polls are now using. They went to the Marist College poll and got its directors to insert two questions into their polling on local House of Representatives races. The questions were:
- Who do you think will win?
- Think of all the people in your life, your friends, your family, your coworkers. Who are they going to vote for?
At the time, responses to the direct question, “Who will you vote for?,” split roughly evenly between Republicans and Democrats. But these two new questions showed Republicans way ahead. On “Who will win?” the Republicans were up 10 points among registered voters and 14 points among the “definitely will vote” respondents. On the friends-and-family question, the corresponding numbers were Republicans +12 and +16.
Planet Money sees these results as an example of “the wisdom of crowds” — the idea that the best estimate comes not from the experts but from the collective judgment of everyone with an opinion on the matter at hand. The idea goes back to Galton at the Fair – statistician Francis Galton at the Plymouth (UK) farmers’ fair in 1906.
At the fair, Galton noticed people submitting their guesses on the weight of an ox. Galton the statistician kept track of all the guesses — some 800 in all — and computed the group mean. Galton the eugenicist assumed that the guesses of the ignorant would detract from the overall accuracy, while the guesses of farmers and butchers would be closer. The mean of the group was 1197 pounds; the ox’s weight, 1198 pounds. The group did better than the most expert individual.
That’s from one of the many blogposts I have done on the topic (here). I’ve looked at predictions in the Super Bowl, the Oscars, and securities trading. In some cases, notably the speculation that led to the financial crisis of 2008, the crowd has not been wise.
Planet Money thinks that the crowd — the people being polled — is wiser than the experts doing the polling and analysis and that Republicans are going to win big.
But there are two other ideas from social science that can also explain the discrepancy between the responses to the questions.
- Pluralistic ignorance. This is the cognitive error where people mistakenly think they are in the minority. For example, college students may think that just about everyone else on campus is having great sex and having it frequently, when in fact most of their fellow students are in the same unfulfilled boat that they are.
- Social desirability. When asked questions in a survey, people avoid answers they think will make them look bad. Ask “How many books have you read this year?” and you’ll probably get an overcount.
It’s complicated. The people you call, even the few who don’t hang up, might give answers that are inaccurate — about what others think and even about what they themselves think. That may always have been true, but in what Planet Money calls “the Golden Age of polling,” roughly from the seventies to 2014, pollsters could make the necessary adjustments. Since then, poll problems have been sort of like Covid — you manage to solve one, and then a new variant comes along.