The following article, by Keith Humphreys, is cross-posted from Washington Monthly’s Ten Miles Square. It is a simple and clear introduction to a complex problem, and it provides a very useful note of caution for Democrats in interpreting and using opinion data.
There are many ways, either through error or chicanery, that a poll can misrepresent public opinion on some issue. For example, the chosen sample can be unrepresentative, the questions can be poorly worded, or, as in this classic demonstration from Yes, Minister, respondents can be led by the nose to give a certain answer.
Yet none of those problems is as serious as the one that afflicts almost every poll: The presumption that those polled care a whit about the issue in question. Whoever commissioned the poll of course considers it important, but that is no guarantee that respondents have ever thought about it before they were polled, or will act on their opinions in any way afterwards.
Advocacy organizations exploit this aspect of polls relentlessly. If the Antarctic Alliance polls 1000 people and asks “Would you like it if there were a law that protected penguins?”, probably 80% of people will say yes because it’s hard to hate on penguins: They are always well-dressed, they waddle in a cute way and many people are still feeling bad for them because of that egg they lost in that movie where they marched all that way in the cold — what was it called? — anyway, man that was sad, so yeah, happy to tell a pollster that we should protect those furry little guys.
The Antarctic Alliance will then argue that Congress should pass the Protect the Penguins Act immediately because its new poll shows that 80% of Americans “want penguins to be protected”. But if you asked those same poll respondents whether they would be willing to donate even $10 to help the law pass, most of them would say no. And if you asked them whether they would choose their Congressional Representative on the basis of how s/he responded to the Protect the Penguins Act, most of them would say no. And if you asked them the open-ended question “What are the 10 biggest challenges Congress should be addressing now?”, probably none of them would put penguin protection on their list.
Consider a darker variant of this problem: gun control laws generally poll well yet don’t pass. How can we not pass something that we “support”? Easily, if the people who say they support it are not willing to do much to see it pass and the people who are against it are willing to do a lot. Polls usually miss this sort of nuance because they don’t assess how much people care about what they are being polled about.
The few polls that somewhat surmount this problem are those that assess voting intentions only among people who intend to vote, and those that try to assess how intensely people feel about the opinions they express (e.g., with a follow-up question such as “Would you be willing to have your taxes rise to make this happen?”).
The only way I can see to consistently avoid the problem of assuming that respondents care about the issue of interest as much as poll commissioners do is to expand the usual response format of “yes, no, or don’t know” to include the option “don’t care”. But I doubt pollsters would ever do this, because it would put them out of business to tell their clients that most people simply don’t give a fig.
[Cross-posted at The Reality-Based Community]