11 May 2015
Schoolzone isn't a polling organisation, but we are in the business of knowing what people think, and the fuss about how badly the pollsters got it wrong tells us something.
The polls were showing a roughly equal split between Labour and the Conservatives in the weeks and even days leading up to the election. They got it so badly wrong, apparently, that there are calls to ban them.
But were they so wrong? A remarkable Wikipedia page charts exactly what happened in the polls over the lifetime of the last government. In fact, only two polls correctly showed the 6.5% Tory lead in the days immediately before the election.
The inaccuracy is down to poor segmentation: something we advise our clients about often.
It's not the 6% difference that has caused the fuss; it's the fact that the Tories ended up with 42% more seats than Labour. But this post isn't about PR or constituency boundaries: it's about total numbers versus segments. Here's another clue: Labour's share of the total vote increased by 1.5 percentage points, whereas the Conservatives' increased by only 0.8.
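The total-versus-segments point can be made with a toy calculation. This is a sketch with invented numbers, not real 2015 constituency data: under first-past-the-post, seats are decided constituency by constituency, so a party can lead the national vote yet win fewer seats if its support is piled up in a handful of places.

```python
# Toy illustration (invented numbers): the same national totals can yield
# very different seat splits depending on how the vote is distributed.

constituencies = [
    {"A": 40, "B": 35},   # A wins narrowly
    {"A": 38, "B": 36},   # A wins narrowly
    {"A": 20, "B": 60},   # B piles up a huge majority in one seat
]

# Aggregate nationally (the number the headline polls report)...
national = {"A": sum(c["A"] for c in constituencies),
            "B": sum(c["B"] for c in constituencies)}

# ...versus segmenting by constituency (the number that decides the election).
seats = {"A": sum(1 for c in constituencies if c["A"] > c["B"]),
         "B": sum(1 for c in constituencies if c["B"] > c["A"])}

print(national)  # {'A': 98, 'B': 131} -- B leads the national vote...
print(seats)     # {'A': 2, 'B': 1}    -- ...but A wins more seats
```

The national total and the segmented total answer different questions, which is exactly why a poll can be "right" on vote share and still look wildly wrong on the outcome.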
The two issues then are:
1. The polls don't take into account the way the whole vote is divided: the data needs to be segmented to reflect the way decisions are actually made. At Schoolzone we would have applied the relevant criteria, based on our Educational Intelligence data, to segment the responses according to the real question. The segments required in the election were by party and constituency, not by party alone.
2. No-one accounted for the floating voters and don't knows (DKs). In election polls the DKs are around 10%. DKs tend heavily towards maintaining the status quo, so, given that the winning party received only 37% of the vote, a 10% DK pool breaking towards the Tories is enough to produce the observed 6-point gap. We saw a similar effect in the Scottish referendum, too.
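The DK arithmetic can be sketched in a few lines. The figures below are illustrative, not the actual 2015 polling data, and the 80/20 break is an assumed split chosen to show how a tied headline poll can hide a sizeable gap.

```python
# Back-of-envelope sketch of how "don't know" (DK) responses can shift a
# poll's headline gap. All figures are illustrative, not real 2015 data.

def allocate_dks(con, lab, dk, dk_to_con):
    """Reallocate the DK share between the two parties.

    dk_to_con: assumed fraction of the DK pool breaking Conservative;
    the remainder is assumed to break Labour.
    Returns the resulting Conservative lead in points.
    """
    con_final = con + dk * dk_to_con
    lab_final = lab + dk * (1 - dk_to_con)
    return con_final - lab_final

# A tied poll (34% each) with a 10% DK pool:
print(allocate_dks(34, 34, 10, 0.5))  # even DK split  -> 0-point gap
print(allocate_dks(34, 34, 10, 0.8))  # DKs favour the status quo -> 6-point gap
```

In other words, how the pollster chooses to handle the DK segment can be worth the entire reported margin.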
So, what we're saying is: don't shoot the messenger, just listen more carefully to the message.