Why did the Opinion polls get it so wrong for the UK 2015 General Election?

The Market Research Industry Is In The Dock Over UK General Election Polls

The shock result of the 2015 UK general election, and later the EU referendum, called into question the market research industry’s ability to accurately predict future behaviour. Is this an example of an outdated and flawed measurement method being shown to be incapable of delivering the insights it is meant to provide? Is it time for a fundamental review of pre-election polling methods?

The reasons why pre-election opinion polls can be inaccurate are nothing new. They are well understood in the industry, but until recently little had been done to address them.

People are poor predictors of their own behaviour:

People are highly social creatures who are heavily influenced by those around us (our ‘in-group’). As a result, we tend to behave in ways that are consistent with our in-group rather than in ways that might be best for others. However, in order to maintain self-esteem, we like to think about our behaviour in a way that projects our actions in a favourable light.

We can feel uncomfortable examining our true motivations, as this can harm our ego and self-image. Most people like to think they are acting for the good of others, and so we rationalise our behaviour in ways that reinforce our self-image.

This can lead us to act dishonestly, perhaps by not being truthful about how we intend to vote. Even if we are found out, we will create a story to justify our actions and maintain a positive self-image. This means that relying on direct questions about future behaviour is fundamentally flawed, as it ignores how we actually make decisions.

The undecided voters are of course a major problem for the pollsters as they can swing one way or the other during the campaign. A sudden change in their intentions can dramatically alter the final outcome. The mass media in particular can have a strong impact on how salient we perceive issues and how we might ultimately vote.

Exit polls have the advantage of asking people how they have just voted. Pre-election polls merely take a snapshot of voting intentions, which may not be a true reflection of actual future behaviour.

Wisdom of the crowd: 

When we use people’s estimates to predict an outcome, scientific research has shown that we are far better at forecasting what “other” people will do than at reporting our own intentions. Provided a crowd is drawn from a diverse group of people, its members act independently, and decision making is decentralised, crowd forecasts have been shown to be more accurate than traditional opinion polls.

Indeed, an ICM ‘wisdom’ poll did put the Conservatives ahead of Labour, though not by as much as the final outcome. However, this could be related to the size of the crowd and other factors that determine crowd accuracy.
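The mechanics of a ‘wisdom’ poll are simple to sketch: rather than asking respondents how they will vote, each is asked to estimate every party’s national vote share, and the estimates are averaged. A minimal illustration in Python, using entirely made-up figures rather than real poll data:

```python
# "Wisdom of the crowd" aggregation sketch: average each respondent's
# estimate of the parties' national vote shares. All figures are
# illustrative, not real poll data.
from statistics import mean

# Each dict is one respondent's estimate of vote shares (%).
estimates = [
    {"Con": 34, "Lab": 33, "Other": 33},
    {"Con": 38, "Lab": 30, "Other": 32},
    {"Con": 36, "Lab": 32, "Other": 32},
    {"Con": 35, "Lab": 34, "Other": 31},
]

# The crowd forecast is simply the mean estimate per party.
crowd_forecast = {
    party: mean(r[party] for r in estimates)
    for party in estimates[0]
}
print(crowd_forecast)  # {'Con': 35.75, 'Lab': 32.25, 'Other': 32}
```

Real wisdom polls are more sophisticated (ICM’s exact method is not described here), but the principle is that individual over- and under-estimates tend to cancel out when the crowd is diverse and independent.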

The political landscape has changed.

Until recently we had just two main parties across the whole of the UK. This has changed: we now have a more fragmented political system and regional dynamics (e.g. the rise of the SNP) that undermine the traditional nationally representative poll.

Sample size influences the reliability of surveys.

The exit poll had a sample size of 20,000, compared with many pre-election polls that survey between 1,000 and 2,000 potential voters. This directly affects how accurate the research can be, given that there are 650 constituencies in the UK. Together with the UK’s first-past-the-post electoral system, this makes it difficult for national polls to accurately predict the final outcome. Cost, rather than reliability, also appears to be the main driver of sample size.

The final YouGov survey did have a more robust sample of 10,000. However, with local tactical voting to contend with, and the other limitations of polls still present, boosting sample size wouldn’t necessarily resolve the challenge of accurately predicting voting intentions.
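The effect of sample size can be made concrete with the standard margin-of-error formula for a proportion, z·√(p(1−p)/n). A rough sketch, assuming a simple random sample and the worst case p = 0.5 (real polls use quota sampling and weighting, so their effective error is typically larger):

```python
# Back-of-envelope 95% margin of error for a proportion, illustrating
# why a 20,000-respondent exit poll is inherently tighter than a
# 1,000-respondent pre-election poll. Assumes a simple random sample
# and the worst-case proportion p = 0.5.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample."""
    return z * sqrt(p * (1 - p) / n)

for n in (1000, 2000, 10000, 20000):
    print(f"n={n:>6}: ±{margin_of_error(n) * 100:.1f} points")
# n=  1000: ±3.1 points
# n=  2000: ±2.2 points
# n= 10000: ±1.0 points
# n= 20000: ±0.7 points
```

With the main parties separated by only a few points in 2015, a ±3 point margin on a 1,000-person poll can easily swallow the true gap, whereas the exit poll’s ±0.7 points cannot.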

Who is going to vote?

Market research companies have to define a very changeable group. As voting in the UK is not mandatory, it is impossible to know in advance who will vote and who will not. There is plenty of evidence that certain parties’ supporters, and some demographic groups, are more or less likely to vote than the average.

Further, how pollsters define a “likely voter” varies and may be determined by whether the respondent voted in the last election. But this is often based upon self-reporting, which is not always accurate. In other instances, people are simply asked whether they plan to vote.

Damned lies and statistics.

To compensate for all these factors, market research companies employ complex weighting and sampling techniques to adjust for demographic and behavioural differences. However, these are just models and don’t necessarily reflect real behaviour.
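The simplest form of such weighting is cell weighting: each respondent is weighted by their group’s population share divided by its sample share, so under-represented groups count for more. A minimal sketch with invented age-group figures (not real census or poll data, and far simpler than the rim weighting pollsters actually use):

```python
# Cell-weighting sketch: weight = population share / sample share, so
# under-represented groups count for more. Age-group figures are
# illustrative, not real census or poll data.
from collections import Counter

respondents = (["18-34"] * 15) + (["35-54"] * 45) + (["55+"] * 40)  # n = 100
population_share = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}      # assumed

sample_share = {g: c / len(respondents) for g, c in Counter(respondents).items()}
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# 18-34s are scarce in the sample, so each counts for ~1.87 respondents;
# over-sampled 35-54s are down-weighted to ~0.76.
print(weights)
```

The weighted sample still contains only 15 actual 18–34 respondents, however: weighting corrects the profile of the sample, not the amount of information in it, which is one reason the adjusted figures don’t necessarily reflect real behaviour.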

Consistency and disclosure of methodology:

Searching the internet for details of the pre-election polls, I have noticed how difficult it is to find technical and methodological information on some of them. Even the sample size of some polls is not always clearly shown in articles.

However, different companies by definition have their own methodologies, and this lack of consistency may again lead to differing results. For example, for telephone polls, how are phone numbers sourced, and how many attempts are made to interview a respondent?

How accurate is the Census?

With the UK experiencing significant net migration over a number of years, there have been changes in the profile of the population. However, the demographic information used for weighting is primarily based upon the Census, which is only conducted once every 10 years.

What next?

Given the inherent limitations of pre-election polls, and the lack of consistency in how they are conducted and analysed, there is certainly a need for a review of their use during election campaigns. Greater disclosure of how polls are conducted and how results are processed would also help with this process.

It remains to be seen whether the traditional opinion poll, with all its limitations, will continue in its current form. Given its failure to accurately predict the outcome of the 2015 UK general election, pressure seems likely to grow to improve its performance. This could be a watershed moment for opinion polls in the UK.

You can view my full Digital Marketing and Optimization Toolbox here.

To browse links to all my posts on one page please click here.

  • About the author: Neal provides digital optimisation consultancy services and has worked for brands such as Deezer.com, Foxybingo.com, Very.co.uk and partypoker.com. He identifies areas for improvement using a combination of approaches including web analytics, heuristic analysis, customer journey mapping, usability testing, and Voice of Customer feedback. By aligning each stage of the customer journey with the organisation’s business goals, he helps to improve conversion rates and revenues significantly, as almost all websites benefit from a review of customer touch points and user journeys.
  • Neal has had articles on website optimisation published on Usabilla.com and, as an ex-research and insight manager, on the GreenBook Blog research website. If you wish to contact Neal, please send an email to neal.cole@outlook.com. You can follow Neal on Twitter @northresearch and view his LinkedIn profile.