Projections projected: The results of last year’s exit poll, the only poll to have come close to the election’s actual outcome (©Jack Taylor/AFP/Getty Images)

In the final days before Scotland’s 2014 independence referendum, when polls showed that the contest had narrowed, The Times reported the sale of £17 billion worth of UK stocks, bonds and other financial assets — the largest such sell-off since the worst days of the financial crisis. The cause was concern that the end of the Union was nigh. The same polls prompted unionist politicians to make a hurried promise of greater devolution if Scots voted to remain part of the Union. In the build-up to last year’s general election, the illusion of a neck-and-neck race, created by the polls, focused the minds of voters sympathetic to the Conservatives but contemplating voting for a smaller party, pushing many of them back into the arms of the party that would go on to win a surprise majority.

Both of these are examples of the power of opinion polls. The difference between the two, however, is that while most agree that public opinion did in fact narrow just before the Scottish referendum, polling ahead of the general election famously got it wrong. To state the obvious, polls matter because, in a democracy, what the electorate thinks matters. Knowing what voters think should, in theory, make politicians more responsive to their concerns. But polls will only continue to be taken seriously if the pollsters can be trusted to get it right, something they appeared to be getting better at — until last May. And that failure has only raised the stakes for polling companies in this year’s referendum on EU membership.

It is still early in the campaign but, disconcertingly for pollsters, there is an undeniable gap between two groups of polls. What divides them is how the data is collected. Telephone polling consistently gives Remain a healthy lead — 54 to 36, with 10 per cent undecided, according to one average of telephone polls — while online polling puts the numbers closer together — a YouGov poll in early March put Remain on 40 and Leave on 37, with 18 per cent undecided. (See the graph, right, for a fuller comparison.) Professor John Curtice of Strathclyde University, the doyen of British psephology who led the team that ran the remarkably accurate exit poll last May, is perplexed by the discrepancy. “There is a ten-point gap, roughly speaking, between the internet and the phone polls,” he told me. “And if you break it down by age, party support and social grade, there is still a ten-point gap across all these groups.”

A possible explanation for the gap is “social desirability bias”, according to which people are more likely to declare their support for a less socially acceptable position in an online poll than over the telephone. In this case, that would mean telephone polls underestimate support for Brexit and the number of undecided voters. Brexiteers will hope this theory is right. After all, the privacy of the polling booth resembles taking an online survey more closely than it does talking to a stranger on the telephone. Curtice is not entirely satisfied that this explains it. “The only clue we’ve got,” he said, “and it’s not a very strong clue, is that it has long been the case that internet polls have found more UKIP supporters than phone polls in the general election. In 2015 they got quite close to each other but the internet polls slightly overestimated UKIP and the phone polls slightly underestimated them, so maybe the truth is somewhere in between. That is the $64,000 question. But we don’t know.”

That such a consistent gap in results appears to be a direct consequence of something so fundamental — how a poll is conducted — is troubling to pollsters because it means they can’t all be right.
