May 10, 2010
Posted by Jay Livingston
Ask a silly question, you get a silly answer. Ask a politically loaded question, you get a political answer – even if the literal meaning of your question seems to ask about matters of fact rather than opinion.
Here are eight questions from a Zogby poll. Respondents were given a Likert scale from Strongly Agree to Strongly Disagree, but the authors treat answers as either correct or incorrect according to basic economic principles.
1. Restrictions on housing development make housing less affordable.
2. Mandatory licensing of professional services increases the prices of those services.
3. Overall, the standard of living is higher today than it was 30 years ago.
4. Rent control leads to housing shortages.
5. A company with the largest market share is a monopoly.
6. Third-world workers working for American companies overseas are being exploited.
7. Free trade leads to unemployment.
8. Minimum wage laws raise unemployment.
Respondents were also asked to classify themselves on a political spectrum – Progressive, Liberal, Moderate, Conservative, Very Conservative, Libertarian.
This survey wasn’t designed to discover what people think. It was designed to prove a political point: “The Further Left You Are the Less You Know About Economics.” That’s the title of a
post about it at Volokh Conspiracy. A paper by Zeljka Buturovic and Dan Klein, who designed the survey, gives the results.
The results were similar for the other questions.
To be sure, liberals’ view of economic cause-and-effect relationships reflects the way they would like the world to be rather than the way it actually is. But the bias of the poll is obvious. As monkeyesq says in his comment at Volokh,
1. Pick 8 liberal positions that have a questionable economic basis;
2. Ask people whether they “agree” or “disagree” with the statements;
3. Find that liberals are more likely to support liberal positions;
4. Claim that liberals don’t understand economics.
There’s an even larger problem here – one that affects not just polls with an obvious ax to grind,* but all survey research: the question the survey asks may not be the question the respondent hears or answers.
These eight questions have a literal meaning. As Todd Zywicki, who wrote the Volokh post, says, “Note that the questions here are not whether the benefits of these policies might outweigh the costs, but the basic economic effects of these policies.”
True, the questions do not ask about costs and benefits, although I don’t think that the survey included an explicit caveat like the one Zywicki adds after the fact. Still, we have to wonder about how people really heard these questions.
“Mandatory licensing of professional services increases the prices of those services” – Agree or Disagree? Maybe some people hear a different question, a question about policy implications: “Would you like cheaper, but unlicensed, doctors?”
“A company with the largest market share is a monopoly.” Maybe what the person hears is: “Can companies with a large market share – though less than the share required to be a monopoly (100%?) – still exercise monopolistic powers?”
As for the “exploitation” of third-world workers, the word may have a precise economic definition (e.g., it’s exploitation only if the worker has no choice) – I don’t know. But even if such an economic definition exists, to most people the word evokes moral judgment, not economics.
The other items also have flaws, as some of the comments at Volokh (now 200 and counting) point out. (I confess that I’m still puzzled by the responses to Standard of Living. Nearly a third of all respondents think that the standard of living today is no better than it was 30 years ago – 55% on the left, 12% on the right, 21% of libertarians.)
The survey may tell us that “epistemic closure” is a disease that can infect the left as well as the right. But it also tells us to be cautious about interpreting survey questions literally. Even innocuous questions may mean different things to different respondents. Until a question has been tested several times, we can’t be sure what respondents hear when they are asked it.
*A Kos poll that set out to show that quite a few Republicans were extremist nuts suffers from a similar problem. I blogged it here.