Monday, May 11, 2015

British Election Polls Show Dangers of Relying on Surveys

Britain’s election shocked the country (and much of the world) not because Prime Minister David Cameron and his Conservative Party won but because of the size of their victory.

Last week’s election results once again demonstrate to associations the dangers of over-reliance on polls and surveys.

So what happened in Britain’s elections? And, what does it mean for associations?



A Wall Street Journal article headlined “A Dark Election Day for U.K.’s Pollsters” notes that pre-election polls showed a “knife’s-edge” race that was too close to call. Thus, the media and public were shocked when Prime Minister David Cameron and his Conservative Party earned a major victory.

And, the media proclaim the results as “news” without sharing much background about the methodology. In fact, most of the headlines I’ve seen announcing that “so-and-so” is leading fail to note that the “lead” is smaller than the margin of error.

But far more significantly, pollsters and statisticians failed badly when turning survey results into answers to the all-important question of who’d win how many seats.

The chief challenge is making the sample representative—every voter should have the same probability of being selected for the poll.

Here in the U.S., the media (and, apparently, the public) love pre-election surveys. There must be about 12 to 15 outlets conducting surveys for the 2016 Presidential election (yes, already!).

As noted in the WSJ story, one of the challenges in public opinion polling is getting a survey sample that “represents” the general population. This sometimes means “adjusting” results by projecting subsets of the sample to reflect the overall population.
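That “adjusting” step is usually done by weighting: each group of respondents is re-weighted so the sample’s mix matches the known population mix. A minimal sketch, using hypothetical age-group numbers (the groups, counts, and shares below are illustrative, not from any real poll):

```python
# Post-stratification weighting sketch (hypothetical numbers).
sample = {"18-34": 100, "35-54": 200, "55+": 300}        # respondents per group
population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # known population shares

total = sum(sample.values())  # 600 respondents overall

# Each group's weight = (its true population share) / (its share of the sample).
weights = {g: population[g] / (sample[g] / total) for g in sample}

# Under-represented groups get weights above 1, over-represented below 1.
print(weights)  # {'18-34': 1.8, '35-54': 1.05..., '55+': 0.7}
```

Each respondent’s answer is multiplied by their group’s weight, so the 18-34 group (one-sixth of the sample but 30% of the population) counts for proportionally more.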

Associations love surveys:


  • member surveys
  • reader surveys
  • conference evaluation surveys
  • staff and/or board surveys

Years ago, a marketing professional told me: “You do a lot of surveys but not much research.”
I’ve often pondered this comment ... especially when I see advocates use survey results to promote their cause or campaign.

I believe in market research and have used it with associations for more than 30 years. I’ve seen polling results that are spot on and others that missed the mark.

Last year, I wrote a post titled Poll/Survey Results: Advocacy or Knowledge for Associations?
It provided two take-aways for association professionals:
  • Consider how you can use survey research and public opinion to help advocate issues for your profession or industry. Note: you may want to review my post on this topic: Should associations use faux research to advocate a cause? http://www.scdgroup.net/2013/08/should-associations-use-faux-research.html
  • Carefully review methodology before accepting the various polls and surveys you see or read or hear.

Methodology is crucial.

Was it a survey (sent to everyone, counting only those who chose to respond) or market research of a randomly selected sample?

How many were surveyed? Was it large enough to reflect the entire population? Was it enough to accurately project subsets of the population?

Were respondents randomly selected? Did those who responded reflect the total population? If not, were results adjusted to fit the population and then results projected?

If the results were compared with some prior survey to project a trend, did the two (or more) surveys use the same methodology? Did they use the same sample? If not, what steps did the researchers use to get both surveys to reflect the population?

What questions were asked? How were they worded? (I once had a research firm say, “Steve, you can’t ask it that way, it will bias the responses.”)

Analysis

What was the margin of error? Were the results being shared outside the margin of error?
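Checking a reported lead against the margin of error takes only a few lines. A sketch using the standard formula for a proportion at 95% confidence, with a hypothetical 52%–48% poll of 1,000 respondents (the numbers are invented for illustration):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 52% vs 48% among 1,000 respondents.
moe = margin_of_error(0.52, 1000)   # roughly +/- 3.1 points
lead = 0.52 - 0.48                  # a 4-point lead

# The gap between two candidates must exceed roughly twice the
# single-proportion margin of error before the lead is meaningful.
print(f"MOE: {moe:.3f}, lead within the noise: {lead < 2 * moe}")
```

Here the 4-point “lead” is well inside the roughly 6-point band of statistical noise, so a headline proclaiming a leader would be overstating the result.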

Did the analysis consider the “energy” of the responses? For example, on a 5-point scale, did you look only at those who strongly agreed and strongly disagreed? (This provides more accurate results since “those in the middle” don’t have strongly held beliefs.)

I learned this the hard way. Our polls showed we had a slight lead in a referendum but when the votes were tallied, we lost two to one. A few years later, I mentioned this to a research consultant. She told me “throw out the middle” then look at the ratio of strongly agree to strongly disagree. If you have twice as many who strongly agree, you are likely to win.
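The consultant’s rule of thumb reduces to a few lines of arithmetic. A sketch with hypothetical response counts (the numbers and the 2:1 threshold are illustrative of the rule as described above, not a validated model):

```python
# Hypothetical 5-point-scale response counts.
responses = {
    "strongly agree": 240,
    "agree": 180,
    "neutral": 220,
    "disagree": 160,
    "strongly disagree": 120,
}

# "Throw out the middle": compare only the two strongly held ends.
ratio = responses["strongly agree"] / responses["strongly disagree"]

# Rule of thumb from the consultant: a 2:1 ratio or better suggests a likely win.
likely_win = ratio >= 2.0

print(f"strong-end ratio: {ratio:.1f}, likely to win: {likely_win}")
```

Note how the overall totals (420 agree vs. 280 disagree) look like a comfortable lead, yet the decision rests only on the 240-to-120 split at the ends.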

If you use sound methodology and accurate analysis, the survey results won’t shock you.

If you challenge the methodology and analysis of the person or group using poll results to influence your decision, you will have a better picture for your decisions.

If you are using surveys to make association decisions, you will get more accurate results if you use the correct methodology and analysis.
