Why Are 21st Century Polls So Hard?

We were recently asked if we do polling to collect data for our analytics. The answer is “yes,” but people are often too polite to ask the other questions they wonder about. This blog post is for those nice folks who don’t want to risk asking offensive questions.

In addition to “why are polls so hard,” another question people often wonder about is “why do polls at all?” Why bother to do your own polls when there seem to be polls on anything and everything these days?

Rarely Polled Topics

There are a couple of important answers to this “why do polls” question. First, there are many topics that are rarely polled, or never polled. For example, as part of a federal acquisition reform study, we wondered whether the general public cared about certain topics. Inside the beltway, Washingtonians had deeply held beliefs, but did anyone else care? It turned out the answer was “no.” So, there was really no political cost from voters in making some reforms.

Another example of niche polling is the emergence of AI. We wondered how juries would respond to a self-driving car accident. So, we have from time to time conducted polls on the use of AI for things like driving cars and making choices that affect our lives. Our mock jurors were 20 to 40 percent more likely to vote against complex, unexplainable AI when it ended up in court.

At a talk near Stanford several months ago, an earnest PhD objected that AI should be better than humans at most things, and as such, juries should not feel that way. He might be right, but you do polling to find out how people think, not to confirm how they ought to think.

Poorly Polled Topics

Another reason we sometimes do our own polls is the low quality of the polling we find on some issues. A current example is polling about Covid-19. Our research has focused on public health: whether citizens will comply with guidelines, and whether mandates yield their intended consequences.

Our polling sometimes agrees with the research of others, but often it does not. When we find significant differences, we nearly always find the other pollsters were only looking at likely voters. Depending on how the polling data will be used, a likely-voter sample might be exactly what’s needed. But for our purposes, we wanted to know things like “will people wear masks?”

Some of our findings were very different from other polls because about 40-45% of voting-age citizens in the U.S. don’t vote. So, in most elections you have roughly 27% of voting-age citizens for one party, 28% for the other, and 45% who essentially vote for “none of the above” by not voting at all.

Whether to include the silent 45% is a critical decision. For our purposes, leaving them out was unacceptable. So, we did our own polling of thousands of people.
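
To make that concrete, here is a minimal sketch in Python, using purely hypothetical numbers we made up for illustration, of how much an estimate can move when the silent 45% hold different views than voters do.

```python
# Illustrative only: hypothetical support rates for a policy question,
# e.g. "will people wear masks?" All numbers are made up for this example.
voter_share = 0.55       # voting-age citizens who vote
nonvoter_share = 0.45    # the "silent 45%" who do not

support_among_voters = 0.70     # hypothetical support among voters
support_among_nonvoters = 0.50  # hypothetical support among non-voters

# A likely-voter poll sees only the first group.
likely_voter_estimate = support_among_voters

# A general-population poll weights both groups by their population share.
population_estimate = (voter_share * support_among_voters
                       + nonvoter_share * support_among_nonvoters)

print(f"Likely-voter estimate:       {likely_voter_estimate:.1%}")  # 70.0%
print(f"General-population estimate: {population_estimate:.1%}")    # 61.0%
```

With these made-up numbers, the likely-voter poll and the general-population poll differ by nine points on the same question. Which group you sample is not a technical footnote; it changes the answer.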

Why Good Polling Is Hard

The 2020 election is wrapping up as this is being written, so right now people think of politics when they think about polling.

It’s become common to claim polls are bad. And there is some evidence that’s true. So, why is polling hard now? There are several reasons.

First, we start with finding people to poll. In the 1970s, a typical family had one phone in the home. A pollster could call that number and know where the person on the other end lived. Moreover, people actually answered their phones to see who was calling; there was no caller ID.

So, a well-designed poll could collect a sample from U.S. households and have some hope it was representative.

Today, we might find more phones in the home than people. And, with number portability, your area code might be 10 years and three moves old. Caller ID means you will screen your calls, and thanks to spam, you are unlikely to answer the pollster’s call (or email).

It’s now much harder to collect a well-controlled sample.

The second task, once we finally get people to talk to us, is to find out some things about them. How old are they? What are their other demographics? Are they likely to vote?

In days gone by, we think most people were honest in responding to these questions. But after years of data theft and cybercrime, people are reluctant to disclose the information we need. Moreover, they are distressingly prepared to fib. They will fib about their age, their political background, and, most importantly, about how likely they are to vote.

Of course, the third and most important step is asking about the real topic of the survey. Here too, we find candor has declined, though this may be the least important of the three changes. Even so, combined with the first two problems, we expect to see significant bias, even if 98% of our respondents are honest about their voting intentions.
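
Here is a similar sketch, again in Python with hypothetical response rates rather than our data, of why the first two problems dominate: a sample in which every single respondent answers honestly can still be badly skewed simply because of who picks up the phone.

```python
# Illustrative only: hypothetical numbers, not from any real poll.
# Two groups of equal size in the population hold opposite views.
group_a_population_share = 0.50
group_b_population_share = 0.50

# Suppose group A is more willing to answer the pollster's call.
group_a_response_rate = 0.10   # hypothetical
group_b_response_rate = 0.06   # hypothetical

# Shares of the responding sample, before any weighting.
a_respondents = group_a_population_share * group_a_response_rate
b_respondents = group_b_population_share * group_b_response_rate
a_sample_share = a_respondents / (a_respondents + b_respondents)

print(f"Group A in population: {group_a_population_share:.0%}")  # 50%
print(f"Group A in raw sample: {a_sample_share:.1%}")            # 62.5%
```

Even with perfect candor, the raw sample overstates one group by more than twelve points. Weighting can correct for that only if the demographic answers used for weighting are themselves honest, which is exactly where the second problem bites.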

In our polling, we often think we see political biases of 5-10%. That’s not always a problem if our topics are not political. But it’s a huge problem for others whose goals are political.

The irony is that the networked age, with everyone online, has turned out to be an age of polling confusion and difficulty. Web-savvy citizens are not easy to understand.