MythBuster: The 2017 'Youthquake'

By Campaign Agent Joe Perry 

Whilst the 2017 British General Election may seem like a lifetime ago, the events of 8th June 2017 remain at the forefront of the public mind. One particular area that has interested politicians and the media alike has been the importance of young people in facilitating Labour’s better-than-expected result – something that has retrospectively been nicknamed the 2017 ‘Youthquake’. Britain’s youth have a history of political apathy, usually tabling the lowest voter turnout of any age demographic. But for many, 2017 was the turning point.

Moving into 2018, the intrigue of the election has endured, leading to subsequent revisions to the ‘Youthquake’ narrative. Last week, the British Election Study (BES) – the so-called (and self-proclaimed) ‘gold standard measure of electoral behaviour in Britain’, and the largest face-to-face survey of voting habits in the country – delivered a metaphorical bombshell. Writing in an article for the BBC, the authors of the report categorically ruled out the possibility of a ‘Youthquake’ at the 2017 General Election, stating that their results show ‘very little change in turnout by age group between the 2015 and 2017 elections’. Over the next few days, The Guardian and other media outlets went on to produce similar articles dismissing the notion of the ‘Youthquake’ altogether.

Yet the story does not end here. Writing a day later for Prospect magazine, political commentator Peter Kellner attempted to resurrect the possibility of the ‘Youthquake’ by interrogating the British Election Study’s methodology. Importantly, Kellner correctly points out that of the 2,194 voters interviewed, only 109 were under 25 years old. For Kellner, the BES ‘skate on thin ice when they seek to draw precise conclusions from small sub-groups’, since conclusions drawn from ‘such small subsamples are subject to large margins of error’. For most individuals who are not familiar with the methodology of political science, the debate surrounding the ‘Youthquake’ seems utterly unintelligible.

So, was there a ‘Youthquake’ at the 2017 British General Election? The honest answer (disappointingly) is twofold: a) we don’t know and b) we can’t actually tell. Given this conclusion, an article in this light may appear futile; however, there is an important trend to chart. This article will examine the rise and fall of the ‘Youthquake’ notion and seek to explain why political scientists cannot – and, in all probability, never will – prove or disprove a surge in youth votes.

To begin, it is vital to understand where the ‘Youthquake’ notion came from. Arguably, its genesis can be traced to only a few hours after the votes had been counted, when David Lammy (Labour MP for Tottenham) took to Twitter to announce: ‘72% turnout for 18-25 year olds. Big up yourselves’. Although Lammy’s figure was quickly dismissed as Chinese whispers, talk of a youth surge began to spread – no doubt compounded by Labour’s fortunes that night. A few days later, the term ‘Youthquake’ was planted in the public mind – a reference to a 1960s fashion movement in which young people suddenly began to have more influence on culture and the arts.

However, as I have written in a previous article, reality gradually took over as the major polling companies began to present more modest figures. Ipsos MORI, for instance, estimated that only 54% of 18–24 year olds had voted in the 2017 election. Similarly, YouGov found that, out of over 50,000 participants, only 57% of the 18–24 year olds had voted. Whilst both reports found youth turnout to have increased between 2015 and 2017 (Ipsos MORI estimated that only 43% of 18–24 year olds voted in 2015), 18–24 year olds still polled as the age group least likely to turn out to vote. Progress, yes. A ‘Youthquake’? Perhaps not.

Nevertheless, for many, Ipsos MORI’s and YouGov’s reports were the appetisers to the main course: the British Election Study. Accordingly, analysis of voter turnout subsided until January 2018, when the BES published its results (shown below). Here, one can see the major difference between Ipsos MORI’s and YouGov’s polling and that of the BES. For Ipsos MORI, for example, voter turnout among 18-24 year olds had risen by 11 percentage points since the 2015 election (43% to 54%). In contrast, the BES calculated that turnout among 18-24 year olds had fallen by 5.6 percentage points.

The British Election Study’s results (Image: Prospect)

Why the big difference? The contrast is best explained by illustrating the polling methodologies used. The first point to note is that there is no exact method of assessing how many young people voted, or how they voted. Voters (due to the nature of the secret ballot) never have to declare who they voted for, nor are they required to confirm whether they turned up on the day. The only official figures available are how many votes were cast and how many people were on the electoral register. So, to judge voter turnout more accurately, polling companies contact individuals and simply ask whether they voted and for whom.

The second point is that researchers cannot contact every voter in the country. This (aside from being almost impossible) would cost far too much time and money. Instead, researchers choose what is called a ‘random sample’ of the population. These individuals are then contacted in the hope that their responses will be representative of the population as a whole. To aid this process, polling companies often ‘weight’ their results. In layman’s terms, ‘weighting’ means adjusting the figures to fit the population. To take an extreme example, if 20% of the UK are aged between 18 and 24 but only 10% of the sample interviewed fall in this age bracket, researchers will ‘weight’ the results to make up for the deficit: a formula is applied to ‘fine-tune’ the results.
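For the curious, the arithmetic behind weighting can be sketched in a few lines of Python. All the figures below are invented purely for illustration – real pollsters weight on many variables at once, not just age:

```python
# Toy illustration of survey weighting (all shares invented).
# If a group makes up 20% of the population but only 10% of the
# sample, each of its respondents is counted twice.

population_share = {"18-24": 0.20, "25+": 0.80}   # assumed population shares
sample_share     = {"18-24": 0.10, "25+": 0.90}   # shares actually interviewed

# Each group's weight is its population share divided by its sample share.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}
print(weights)  # {'18-24': 2.0, '25+': 0.888...}

# Suppose 40% of the sampled 18-24s said they voted, and 70% of the 25+.
reported_turnout = {"18-24": 0.40, "25+": 0.70}

# The weighted overall turnout counts each group at its population share.
weighted_turnout = sum(reported_turnout[g] * sample_share[g] * weights[g]
                       for g in reported_turnout)
print(weighted_turnout)  # 0.64, i.e. 64% estimated overall turnout
```

Note that multiplying a group’s sample share by its weight recovers its population share, which is exactly the ‘fine-tuning’ described above.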

Significantly, the surveys noted so far vary considerably in sample size. YouGov, for example, used a sample of around 50,000 participants; Ipsos MORI used 7,505; and the British Election Study used 2,194. Does this make YouGov the most reliable? Most would answer no. Why? Because YouGov generates its large number of participants by using online survey forms. Whilst this method has had a fair amount of success in recent times, it is usually considered the least reliable: because participants fill in surveys at their own leisure, some argue they have little obligation to tell the truth, thus affecting the accuracy of the results.

For Ipsos MORI’s report, polling was conducted by telephone interview – numbers are randomly generated and called by interviewers. As a consequence, there are fewer participants; nonetheless, political scientists tend to hold this method in higher regard than internet polling, as participants are expected to be more honest when speaking to a researcher.

The question which remains is why the BES is hailed as the ‘gold standard’ even though it has the smallest sample. The simple answer is that the BES strives for quality, not quantity. For the BES, participants are chosen from the Post Office’s record of addresses and actively pursued for interview (rather than the more voluntary basis of ‘picking up the phone’). Interviews are then conducted face-to-face (thus explaining the long wait for the survey’s results), and responses are ‘validated’ against the electoral register to confirm whether each participant was in fact registered to vote.

More importantly, such a method tends to give the best view of voter turnout because those who fill in online questionnaires or answer the phone to a polling company tend to be more politically engaged – a factor noted on Ipsos MORI’s website. As a consequence, whilst the BES may have interviewed only 109 under-25s, these results are of the ‘highest quality’ and are weighted to be representative of the country as a whole. Is this sample size too small (as Kellner suggests)? The answer largely comes down to personal perspective and the ‘margin of error’.
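Kellner’s ‘margin of error’ point can be made concrete with a rough sketch. Treating the subsample as a simple random sample (a simplifying assumption – the BES design is more sophisticated, so the real uncertainty differs somewhat), the standard 95% margin of error on a proportion is 1.96 × √(p(1−p)/n):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random
    sample of size n (z = 1.96 for a 95% confidence level)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for the 109-person under-25 subsample
# versus the full BES sample of 2,194.
print(round(100 * margin_of_error(0.5, 109), 1))   # ~9.4 percentage points
print(round(100 * margin_of_error(0.5, 2194), 1))  # ~2.1 percentage points
```

On these assumptions, a turnout estimate from 109 respondents carries a margin of roughly ±9 points either way – considerably larger than the 5.6-point fall the BES reported for 18-24 year olds, which is the nub of Kellner’s objection.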

The truth is, we do not know whether there was a ‘Youthquake’ or not – hopefully this article has shed some light on why that is. To me, the most important point is that young people are still (whether you prefer Ipsos MORI, YouGov or the BES) the age group least likely to turn up to vote. Youthquake or not, Britain’s youth still have a lot of work to do.

Sources and Further Reading:

Image: secretlondon123 @Flickr
