Still together, at least for now. Photo by Toby Melville – Pool/Getty Images.
The British pound took a … well … pounding on Monday, falling 1.5% against the dollar. Derivatives markets went a little nuts, with the cost of hedging the pound against “collapse” reaching its highest level since 2009—implying markets expect a 22% drop over the next month. The culprit, as ever, was “Brexit”—this time courtesy of three new polls putting Leave in the lead. They got an assist from some old-fashioned conspiracy theorizing, as unnamed cabinet ministers told the BBC they have a plan to use Parliamentary procedure to override any potential Brexit vote and keep the UK in the EU regardless of what voters want.[i] It is all quite manic and probably will be for the next 17 days, until the results are in. (And maybe for a while after, if Leave wins.) As regular readers know, we don’t think the vote is a terribly big deal for UK markets or the economy—see here, here, here, here and here. The status quo has been fine for decades, and both sides have wildly overstated the potential risks and rewards of leaving. Whatever voters decide, uncertainty over the outcome vanishes once the results hit, which should be bullish for stocks. Sentiment, however, can send stocks swinging in the short term, and such swings can drive investors to trade at the worst possible times. So here, irrespective of your (or anyone else’s) opinions on the merits of Brexit, is our advice: Don’t get hung up on these latest polls, or any that hit between now and Referendum Day. Their predictive power is basically nonexistent.
For political scientists, the Brexit referendum polling is a fount of fascination and academic research. Not because polls are precise—rather, because they aren’t. After polls failed to predict last year’s general election, the industry has been in a state of flux. Independent studies, pollsters’ own soul-searching and a formal inquiry all reached various conclusions about what went wrong, and pollsters have spent the last year trying to refine their methods and improve accuracy. No fewer than 10 agencies have conducted referendum polls this year, and all do things a little bit differently. Some do online polls, some are phone-only, some do both. Some weight the sample based on respondents’ voting history, some don’t. Some weight according to sociological factors, some don’t. Some weight according to likely turnout, some don’t. Several have changed methodology in recent months, making it useless to track even one single polling agency’s results over time. The result is a huge, inconclusive blob of data. So inconclusive, as it happens, that the only logical way to show it is with that most vile of visuals, the scatterplot.[ii]
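To see why these weighting choices matter so much, here is a minimal post-stratification sketch with entirely made-up numbers (the age groups, sample shares and support levels below are hypothetical, not drawn from any actual poll). It shows how reweighting the same raw responses to match assumed electorate shares can flip which side leads:

```python
# Minimal post-stratification sketch with made-up numbers:
# reweight an unrepresentative sample so each age group matches
# an assumed share of the electorate.

sample = {
    # age group: (share of respondents, Remain support within group)
    "18-44": (0.55, 0.60),
    "45+":   (0.45, 0.40),
}
population = {"18-44": 0.45, "45+": 0.55}  # assumed electorate shares

# Unweighted: each group counts according to its share of the sample.
raw = sum(share * remain for share, remain in sample.values())

# Weighted: each group counts according to its assumed electorate share.
weighted = sum(
    population[group] * remain
    for group, (share, remain) in sample.items()
)

print(f"Unweighted Remain share: {raw:.1%}")      # 0.55*0.60 + 0.45*0.40 = 51.0%
print(f"Weighted Remain share:   {weighted:.1%}")  # 0.45*0.60 + 0.55*0.40 = 49.0%
```

With identical raw responses, one set of population assumptions puts Remain ahead and another puts Leave ahead—which is why ten agencies making ten different weighting choices produce the inconclusive blob above.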
Exhibit 1: UK Brexit Referendum Polling in 2016
Sources: UK Polling Report, ICM, YouGov, IpsosMori, ORB and ComRes, as of 6/6/2016. Each point shows the difference between support for Remain and Leave. Positive numbers indicate Remain in the lead; negative numbers indicate Leave in the lead.
As far as we can tell, there is only one reason these polls stole headlines and whacked sentiment: There were three of them. As Exhibit 1 shows, plenty of polls have shown Leave in the lead, with margins comparable to Monday’s. But this is the first time three polls published on the same day have given Leave the edge. And they followed a Sunday poll that showed Leave sort of winning and sort of losing. For most media outlets, that was enough evidence to proclaim the momentum is swinging toward Brexit.
But it isn’t. For one thing, Monday’s polls showed between 9% and 16% of voters are undecided. In all three cases, the percentage of undecided voters far exceeded Leave’s lead—undecided voters could swing things either way. For another, all three were online surveys, which have tilted far more heavily toward Leave all year. Two of the three (from YouGov and ICM) are in line with their longer-term trends. The third, from TNS, isn’t—but it incorporates a new methodology change, making comparisons with its past polls irrelevant. Plus, TNS did the fieldwork in mid-May, compared to June 1-3 for YouGov and June 3-5 for ICM. So, in short, you have two fresh polls with time-tested methodology showing basically no change, and one older poll with untested methodology showing a bit of a swing. None of it is telling.
Nor is the Sunday poll, conducted by Opinium for The Observer. According to Opinium’s press release, it showed Remain with 43%, Leave with 41% and 14% Undecided (and 2% “Prefer Not to Say”). But according to The Observer, it was 40% for Remain versus 43% for Leave. Compared to Opinium’s most recent prior poll, which put Remain ahead 44% to 40%, The Observer’s figures showed a seven-point swing toward Leave. Yet here, too, there was a methodology change. Based on their analysis of the British Election Study, considered the gold standard of face-to-face polling (and conducted as part of the “what went wrong” effort after last year’s polling debacle), Opinium determined that their failure to account for voters’ attitudes toward national identity and other sociological issues made their results less accurate. As Opinium’s Adam Drummond wrote in a must-read blog post on Monday:
The reason these differences matter is that while they do not have a clear impact on which party people say they support (though naturally they will affect things below the surface and UKIP's level of support in particular), they do have a direct impact on whether a person thinks Britain should remain in or leave the European Union. We can end up with samples that have too many small-c conservatives even though we have the right number of Conservatives which means that we may have the vote shares broadly correct for parties but be wrong on issues that cut across party such as this referendum.
The latest poll tried to correct this, and as Drummond notes, produced “a small but significant movement towards Remain.” The Observer, however, used the results tallied under the original methodology, which favored Leave.
Opinium and TNS are merely the latest polling agencies to change methodology midstream. Last month, ComRes added turnout modeling to their samples, and the refinement gave a modest push toward Remain. In late April, TNS added Northern Irish voters to their sample. Earlier that month, ICM started incorporating turnout modeling and political registration, giving Leave a boost. In February, YouGov began weighting by education and by how much attention respondents pay to politics, and changed their age brackets (the impact, they say, was negligible).
The other big issue is the discrepancy between phone and online polls. As Exhibits 2 and 3 show, phone polls give Remain a much bigger advantage, with far fewer undecided voters.
Exhibit 2: Phone and Online Polling
Sources: UK Polling Report, ICM, YouGov, IpsosMori, ORB, TNS, Panelbase, Survation and ComRes, as of 6/6/2016. Each point shows the difference between support for Remain and Leave. Positive numbers indicate Remain in the lead; negative numbers indicate Leave in the lead.
Exhibit 3: Undecided Voters in Phone and Online Polling
Sources: UK Polling Report, ICM, YouGov, IpsosMori, ORB, TNS, Panelbase, Survation and ComRes, as of 6/6/2016.
Pollsters have puzzled over the phone/online gap for months (and not just in the UK), and most agree only time will tell which is right. Those who’ve studied it have noticed phone and online polls tend to reach different demographics (by age, social attitudes, education and other factors), making weighting crucial. YouGov and ICM have experimented with simultaneous phone and online polls, with varying results. YouGov’s Andy Morris concluded phone polls overweight college graduates, making his agency’s online polls more representative. ICM’s Martin Boon was less committal, and amusingly so. From his May 16 blog post:
If you want to ask me, which is unlikely, the answer you’d get is “I just don’t know”. I can see reasons why phone polls overstate Remain shares, and reasons why online polls overstate Leave shares. That inevitably leads to a conclusion that reality lies somewhere in the middle, but just hold that thought. More aggressive weighting schemes (privately) employed on these very data sets – schemes intended to correct for observed Westminster vote intention skews the like of which have previously consumed us – are not reducing the gap on the EU referendum but increasing it.
The narrative that phone polls are more likely to be right ignores some fundamental flaws in phone methods. Labour supporters are continually oversampled by phone, and that may matter more than those same phone polls missing out on supposedly pro-Remain types, who are disproportionately less likely to turn out to vote. Similarly, what’s lurking under online covers could be equally nasty, and we should not ignore the fact that UKIP voters are again, as they have long since been, higher in online polls than phone (or indeed at recent elections).
Bemused? You have every right to be.
Professors Patrick Sturgis and Will Jennings reached a similar conclusion in their own study, published on May 24:
While there are of course many caveats required here, this comparison suggests that the true picture may lie somewhere between the two modes, possibly somewhat closer to online. At the very least it suggests a good deal of caution is needed before concluding that one method is right and the other wrong. That will only be known for sure on June 24.
In short, all these polls are less an exercise in prediction and more an academic experiment. As YouGov’s Anthony Wells wrote Sunday evening:
It also leaves us with an ever more varied picture in terms of polling. In the longer term this will be to the benefit of the industry – hopefully some polls of both modes will end up getting things about right, and other companies can learn from and adapt whatever works. Different companies will pioneer different innovations, the ones that fail will be abandoned and the ones that work copied. That said, in the shorter term it doesn’t really help us work out what the true picture is. That is, alas, the almost inevitable result of getting it wrong last year. The alternative (all the polls showing the same thing) would only be giving us false clarity, the picture would appear to be “clearer”… but that wouldn’t mean it wasn’t wrong.
So if polls aren’t useful, what is? You might try more market-based indicators, like bookmakers’ odds or prediction markets (similar to the late, great Intrade). The bookies give Remain a 72% chance of winning, compared to 28% for Leave—but that’s a big increase from the 20% chance Leave had last month. A relatively new prediction market called PredictIt—an academic research project blessed by US regulators—gives Leave a 37% chance of winning, up about 10 points over the last month. However, we’d also take these with a grain of salt. PredictIt is in its infancy, with relatively few participants, and its accuracy is unproven. As for the bookmakers, we remember the time they paid out early on the Greek bailout referendum—and got it wrong.
If you do prefer polls to market-based indicators, we’d suggest skipping the media’s coverage—often hyperbolic and biased—and going straight to the polling agencies’ own write-ups. Many keep blogs, where they provide oodles of insight and even question their own findings. As a public service, here are links to the blogs of three major pollsters:
It still won’t be predictive, but their perspective may help you keep a much more level head.
And a level head is really what it’s all about. As we mentioned at the outset, sensational coverage heightens emotion, and emotion often heightens the urge to trade. When you read something scary about Brexit, it can create a need to just do something in your portfolio, and that “something” could amount to selling amid a brief blip of Brexit freak-outery—and potentially missing a relief rally as reality beats expectations, whether that reality is a win for Remain or a Brexit that goes largely fine. As always, thinking long-term—and not getting caught up in the daily noise—is the key to investing success.
[i] We’d recommend skepticism where any rumor based on leaks from unnamed political sources is concerned. Politicos have a long, long history of using leaks as a political weapon.