Why did the Opinion polls get it so wrong for the UK 2015 General Election?



The Market Research Industry Is In The Dock Over UK General Election Polls

The shock result of the UK general election has again called into question the market research industry's ability to accurately predict real behaviour. Is this an example of an outdated and flawed measurement method being shown to be incapable of delivering the insights it is meant to provide? Is it time for a fundamental review of pre-election polling methods?

The reasons for pre-election opinion polls potentially being inaccurate are not new. They are well understood in the industry, but some have become more challenging with recent political events in the UK.


People are poor predictors of their own behaviour: Opinions change over time, and we are deeply social creatures who are heavily influenced by those around us. Relying on direct questions about future behaviour is fundamentally flawed because it ignores how the human brain works. People like to show the world a consistent exterior, but underneath we are complex and interconnected. We like to feel in control, dislike uncertainty and change, and focus more on avoiding losses than on potential gains.

Undecided voters are, of course, a major problem for the pollsters, as they can swing one way or the other during the campaign. A sudden change in their intentions can dramatically alter the final outcome. The mass media in particular can have a strong influence on how important we perceive issues to be and how we ultimately vote.

Exit polls, of course, have the advantage that they ask people how they have just voted. Pre-election polls merely take a snapshot of voting intentions, which may not be a true reflection of actual behaviour.


Wisdom of the crowd: When we use people to estimate a likely outcome, scientific research has shown that we are far better at forecasting what "other" people will do than at predicting our own intentions. Provided the crowd is drawn from a diverse group of people who act independently, crowd research has consistently been shown to be more accurate than traditional opinion polls. Indeed, an ICM 'wisdom' poll did put the Conservatives ahead of Labour, though not by as much as the final outcome. However, this could be related to the size of the crowd and other factors that determine crowd accuracy.
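To illustrate why aggregating many independent estimates tends to beat any one individual's guess, here is a small Python sketch. The true vote share, the noise level, and the crowd size are all made-up numbers, purely for illustration:

```python
import random

random.seed(42)

true_share = 0.37  # hypothetical "true" vote share for one party

# Simulate a crowd: each person's estimate is the true value plus
# independent random noise (their personal bias and uncertainty).
estimates = [true_share + random.gauss(0, 0.08) for _ in range(500)]

# The crowd's aggregate estimate is simply the mean of the guesses.
crowd_estimate = sum(estimates) / len(estimates)

# Compare the crowd's error with the average individual's error.
avg_individual_error = sum(abs(e - true_share) for e in estimates) / len(estimates)
crowd_error = abs(crowd_estimate - true_share)

print(f"crowd error: {crowd_error:.4f}")
print(f"average individual error: {avg_individual_error:.4f}")
```

The key assumption here is independence: if everyone's errors are uncorrelated, they largely cancel out in the average. When the crowd shares the same bias (as respondents influenced by the same media coverage might), averaging no longer removes the error.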


The political landscape has changed. Until recently we had just two main parties across the whole of the UK. We now have a more fragmented political system and regional dynamics (e.g. the rise of the SNP) that undermine the traditional nationally representative poll.


Sample size influences the reliability of surveys. The exit poll, for instance, had a sample size of 20,000, compared with many pre-election polls that survey between 1,000 and 2,000 potential voters. This directly affects how accurate the research can be, given that there are 650 constituencies in the UK. With so few respondents potentially in each constituency, it is not surprising that polls are inaccurate. Cost appears to be the main driver rather than reliability.
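To put those sample sizes in perspective, the standard margin-of-error formula for a simple random sample shows how much precision the larger exit-poll sample buys. This is a simplified sketch: real polls use quota sampling and weighting rather than pure random sampling, so their effective error is typically larger than this formula suggests.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1000, 2000, 20000):
    print(f"n={n:>6}: +/- {margin_of_error(n) * 100:.1f} percentage points")
```

A 1,000-person poll carries a margin of roughly +/- 3 percentage points, which is larger than the gap between the main parties in many 2015 polls; the 20,000-person exit poll narrows that to under +/- 1 point.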

The latest YouGov survey that I came across does have a more robust sample of 10,000. However, with more regionally dynamic voting patterns, it is much more difficult for the average nationally representative survey to accurately predict voting intentions.


Who is going to vote? The market research companies have to define a very changeable group. As voting in the UK is not mandatory, it is impossible to predict exactly who will vote and who will not. There is plenty of evidence that certain parties' supporters are more or less likely to vote depending upon such factors as the weather.

Further, how pollsters define a "likely voter" varies, and may be determined by whether someone voted in the last election. But this is often based upon self-reporting, which is not always accurate. In other instances people are simply asked if they plan to vote.


Damned lies and statistics. To compensate for all these factors, market research companies employ complex weighting and sampling techniques to adjust the demographics of their sample so that it represents the voting population. However, as wisdom-of-crowds research has shown, a less representative sample can often produce more accurate predictions.
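As a rough illustration of what demographic weighting involves, the sketch below computes post-stratification weights for a made-up sample that over-represents older respondents. All the age groups and proportions here are hypothetical, not drawn from any actual poll:

```python
from collections import Counter

# Hypothetical population age-group proportions (e.g. from the Census).
population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical raw sample of 100 respondents, skewed towards older voters.
sample = ["18-34"] * 20 + ["35-54"] * 30 + ["55+"] * 50

counts = Counter(sample)
n = len(sample)

# Weight = population share / sample share, per respondent's age group.
# Groups under-represented in the sample get a weight above 1.
weights = {g: population[g] / (counts[g] / n) for g in population}

for g in population:
    print(f"{g}: weight {weights[g]:.2f}")
```

Each under-sampled young respondent now counts for 1.5 people and each over-sampled older respondent for 0.7, so the weighted sample matches the population mix. The catch is that these corrections rely on the population figures being current and on weighted respondents behaving like the people they stand in for, neither of which is guaranteed.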


Consistency and disclosure of methodology: As I have searched the internet for details of the pre-election polls, I've noticed how difficult it is to find technical and methodological information on them. Even the sample sizes of some polls are not always clearly shown in articles.

However, different companies do, by definition, have their own methodologies, and this lack of consistency may again lead to differing results. For example, for telephone polls, how are the phone numbers sourced and how are they selected?

How accurate is the Census? With the UK being part of the European Union, we have seen significant flows of people into and out of the country. However, the demographic information used for weighting is primarily based upon the Census, which is conducted only once every ten years.


What next?

Given the inherent limitations of pre-election polls and the lack of consistency in how they are conducted and analysed, there is certainly a need for a review of their usage during election campaigns. Greater disclosure of how polls are conducted and how results are processed may also help with this process.

It remains to be seen whether the traditional opinion poll with all its limitations will continue in its current form. Given their failure to accurately predict the outcome of the 2015 UK General Election it seems likely that pressure will grow to improve their performance going forward. This could be a watershed moment for opinion polls in the UK.

You can view my full Digital Marketing and Optimization Toolbox here.

To browse links to all my posts on one page please click here.

  • About the author: Neal provides digital optimisation consultancy services and has worked for brands such as Deezer.com, Foxybingo.com, Very.co.uk and partypoker.com. He identifies areas for improvement using a combination of approaches including web analytics, heuristic analysis, customer journey mapping, usability testing, and Voice of Customer feedback. By aligning each stage of the customer journey with the organisation's business goals, this helps to improve conversion rates and revenues significantly, as almost all websites benefit from a review of customer touch points and user journeys.
  • Neal has had articles published on website optimisation on Usabilla.com and, as an ex-research and insight manager, on the GreenBook Blog research website. If you wish to contact Neal please send an email to neal.cole@outlook.com. You can follow Neal on Twitter @northresearch and view his LinkedIn profile.