Will The Pollsters Get It Right On The Referendum?

After failing to predict the outcome of the general election, polling companies face a high-stakes challenge to redeem themselves in June

Projections projected: The results of last year’s exit poll, the only poll to have come close to the election’s actual outcome (©Jack Taylor/AFP/Getty Images)

In the final days before Scotland’s 2014 independence referendum, when polls showed that the contest had narrowed, The Times reported the sale of £17 billion worth of UK stocks, bonds and other financial assets — the largest such sell-off since the worst days of the financial crisis. The cause was concern that the end of the Union was nigh. The same polls prompted unionist politicians to make a hurried promise of greater devolution if Scots voted to remain part of the Union. In the build-up to last year’s general election, the illusion of a neck-and-neck race, created by the polls, focused the minds of voters sympathetic to the Conservatives but contemplating voting for a smaller party, pushing many of them back into the arms of the party that would go on to win a surprise majority.

Both of these are examples of the power of opinion polls. The difference between the two, however, is that while most agree that public opinion did in fact narrow just before the Scottish referendum, polling ahead of the general election famously got it wrong. To state the obvious, polls matter because, in a democracy, what the electorate thinks matters. In theory, knowing what voters think makes politicians more attentive to their concerns. But polls will only continue to be taken seriously if the pollsters can be trusted to get it right, something they appeared to be getting better at — until last May. And that failure has only raised the stakes for polling companies in this year’s referendum on EU membership.

It is still early in the campaign but, disconcertingly for pollsters, there is an undeniable gap between two groups of polls. What divides them is how the data is collected. Telephone polling consistently gives Remain a healthy lead — 54 to 36 with 10 per cent undecided, according to one average of telephone polls — while online polling puts the numbers closer together: a YouGov poll in early March put Remain on 40 and Leave on 37 with 18 per cent undecided. (See the graph, right, for a fuller comparison.) Professor John Curtice of Strathclyde University, the doyen of British psephology who led the team that ran the remarkably accurate exit poll last May, is perplexed by the discrepancy. “There is a ten-point gap, roughly speaking, between the internet and the phone polls,” he told me. “And if you break it down by age, party support and social grade, there is still a ten-point gap across all these groups.”

A possible explanation for the gap is “social desirability bias”, according to which people are more likely to declare their support for a less socially acceptable position in an online poll than over the telephone. In this case, that would mean telephone polls underestimate support for Brexit and the number of undecided voters. Brexiteers will hope this theory is right. After all, the privacy of the polling booth resembles taking an online survey more closely than it does talking to a stranger on the telephone. Curtice is not entirely satisfied that this explains it. “The only clue we’ve got,” he said, “and it’s not a very strong clue, is that it has long been the case that internet polls have found more UKIP supporters than phone polls in the general election. In 2015 they got quite close to each other but the internet polls slightly overestimated UKIP and the phone polls slightly underestimated them, so maybe the truth is somewhere in between. That is the $64,000 question. But we don’t know.”

That such a consistent gap in results appears to be a direct consequence of something so fundamental — how a poll is conducted — is troubling to pollsters because it means they can’t all be right.

According to Curtice, the referendum result could change British polling forever. “If the internet polls prove to be right, that will be the end of phone polling in the UK,” he said. “But, you know, I have to be honest with you — because of the fact it’s within social groups as reported by polling companies, it is very difficult to say which one is right and which one is wrong.”

More generally, plebiscites are harder for pollsters than general elections. As Anthony Wells, Research Director at YouGov, put it, “The be-all and end-all of a referendum from a polling perspective is that it is a one-off event.” With general elections, almost every polling company “has past votes in their sampling and weighting targets to some degree so a lot of what we do to construct an accurate sample is ask, ‘Does it correctly reflect how people voted last time around?’ If it does it will probably do quite well at predicting how people will vote next time around.” That is not possible with a referendum.
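To see what Wells means by weighting to past vote, here is a minimal sketch in Python. It is not any polling company’s actual method, and the respondents and the 2015 vote shares used as targets are purely illustrative: each past-vote group in the sample is reweighted so that its recalled 2015 vote matches the chosen target shares, and those weights are then applied to current referendum intention.

```python
# A minimal, illustrative sketch of past-vote weighting (not any pollster's
# actual methodology). Respondents and target vote shares are invented.
from collections import Counter, defaultdict

# Hypothetical respondents: (recalled 2015 vote, current referendum intention)
sample = [
    ("Con", "Leave"), ("Con", "Remain"), ("Con", "Leave"),
    ("Lab", "Remain"), ("Lab", "Remain"), ("Lab", "Leave"),
    ("UKIP", "Leave"), ("Other", "Remain"), ("Other", "Undecided"),
    ("Lab", "Remain"),
]

# Illustrative 2015 vote-share targets (as proportions of the electorate)
targets = {"Con": 0.38, "Lab": 0.31, "UKIP": 0.13, "Other": 0.18}

# Weight for each past-vote group = target share / share observed in the sample
observed = Counter(past for past, _ in sample)
weights = {party: targets[party] / (observed[party] / len(sample)) for party in targets}

# Apply the past-vote weights to current referendum intention
weighted = defaultdict(float)
for past, intention in sample:
    weighted[intention] += weights[past]

total = sum(weighted.values())
for intention, share in sorted(weighted.items()):
    print(f"{intention}: {100 * share / total:.1f}%")
```

In a referendum, of course, there is no equivalent past vote to weight against, which is precisely Wells’s point.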

According to Prof Curtice, that challenge is heightened by the fact that views on EU membership cut so strongly across party voting intentions. Tory voters are divided, with a slight majority in favour of Leave, while Labour voters favour Remain by between two and three to one, depending on which poll you look at.

Given these challenges, it would seem wise to take poll findings with a pinch of salt during the referendum campaign. Will the polls be treated differently this year? “To a degree,” says Curtice. “But the truth is journalists love them.”

How seriously the polls should be taken depends in part on whether or not the pollsters have learned the lessons of their failure last year. They got the vote shares of Labour and the Conservatives very wrong, overestimating the former and underestimating the latter. (Most got quite close to the final vote share of the smaller parties.) The average of the ten leading polling companies’ final polls before the election put the Conservatives on 33.6 per cent and Labour on 33.5; in the election itself the Conservatives won 36.8 per cent of the vote to Labour’s 30.4 per cent. No pollster — with the honourable exception of Curtice’s exit poll team — can claim to have got it right.

What went wrong? At the most basic level, a carefully selected group of people, when asked to predict their own behaviour on election day, said they’d do one thing while the electorate at large did something marginally different when that day came. That gap, minus any voters who genuinely changed their minds in the final hours of the campaign, is the measure of the pollsters’ failure. Explaining the gap is the hard bit, but a consensus appears to have emerged among polling companies. Broadly speaking, sample groups were more interested in politics than the electorate at large. This was a particular problem when it came to younger voters, who are more likely to vote Labour but less likely to turn out on polling day. In other words, it was Lazy Labour rather than Shy Tories that led to the pollsters’ error.
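As a back-of-the-envelope illustration of that decomposition, the short Python sketch below uses the final-poll averages and actual 2015 vote shares quoted above; the size of any genuine late swing is unknown, so the zero used here is an assumption for illustration only.

```python
# Back-of-the-envelope sketch of the error decomposition described above.
# Poll and result figures are those quoted earlier in the piece; the
# late-swing figure is an assumption, since its true size is unknown.
poll_con, poll_lab = 33.6, 33.5        # final-poll averages (per cent)
result_con, result_lab = 36.8, 30.4    # actual 2015 vote shares (per cent)

poll_lead = poll_con - poll_lab        # 0.1-point Conservative lead in the polls
actual_lead = result_con - result_lab  # 6.4-point lead in the election itself
total_miss = actual_lead - poll_lead   # 6.3 points of lead unaccounted for

assumed_late_swing = 0.0               # assumption: no genuine last-minute change
pollster_error = total_miss - assumed_late_swing
print(f"Miss on the Con-Lab lead attributable to polling method: {pollster_error:.1f} points")
```

On those assumptions, essentially all of the miss on the lead would have to be explained by who answered the polls rather than by any late change of heart.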

Wells of YouGov is wary about the referendum becoming a test of whether polling companies have recovered from their mistakes last year: “The intricacies of each vote are very different,” he said. “The sad truth is we could have corrected all the methodological things that caused us to get last year’s election wrong, we could get those bang-on right and still get one of the forthcoming elections wrong for completely different reasons. Or we could have failed to address all these problems from last year and get the vote bang-on correct and it still wouldn’t mean anything. But that won’t be the narrative in the media.”

The paradox of polling is that while political polls have become a central component of the democratic process, they are, for the companies that conduct them, something of a sideshow. They may make headlines but the companies make their money by conducting market research for private clients. Political polling is a party trick; the closest — although somewhat flattering — parallel is Formula One teams, which were traditionally set up to exhibit a commercial car manufacturer’s engineering prowess. “I used that metaphor once,” Anthony Wells told me. “It may not seem so apt any more.”

Ben Page, CEO of Ipsos MORI, is dismissive of the claim that last year was a disaster for pollsters. “The election year was our best financial year for nearly a decade,” he said. Of his 1,300 employees in London, only three work on political polling. “Although the media have portrayed the election as a great debacle for the industry, most informed purchasers of market research would never anticipate the level of accuracy the media expect — and to be honest the pollsters encourage.”

“One of the funny things about this profession,” said Wells, “is that while most people spend their time telling you how good their product is, pollsters spend their time saying, ‘We’re not that good’. Before we got things wrong people had more faith in us than was warranted.”

The mistake with polls is to obsess over the horse race, worrying whether Remain has moved one point closer to Leave or thinking that one atypical poll means the views of the British people have suddenly changed. But polls can do more than just tell you today what will happen tomorrow. John Curtice’s study, “How Deeply Does Britain’s Euroscepticism Run?”, which uses answers to questions asked as part of the annual British Social Attitudes survey, is a good example of how sophisticated polling can inform political decision-making. The survey was thorough: 3,000 people interviewed face-to-face between July and the beginning of November last year. Curtice looked at the relationship between Euroscepticism in Britain and a desire to leave the EU. His conclusion is that the referendum will be decided by the relationship between cultural concerns about the EU and economic concerns about life outside the Union. Explaining his findings, he said: “When it comes to things like immigration and identity it is perfectly clear that the public is deeply sceptical about the EU, but when it comes to the risks associated with leaving, the public are clearly inclined to stay inside the EU, and it is a question of how those two play out.”

As he puts it in his report, “Only two in five (40 per cent) of those who believe that the EU is undermining Britain’s identity but are not convinced that the economy would be better say they wish to withdraw from the EU. But that figure is at least double (82 per cent) among those whose cultural concern is married with a belief in the economic benefits of withdrawal.”

The study explains the Remain side’s focus on the economic risks of Brexit. The lesson for Leave is to listen to the electorate’s economic worries. The Eurosceptic but risk-averse chunk of the electorate will decide June’s referendum. If the Leave campaign dismisses their concerns as merely a product of Project Fear brainwashing, Leave will lose.