
Who answers panel surveys every time, and who doesn’t?

Understanding attrition helps make sure our sample is representative


One of the strengths of panel studies like Understanding Society is that the data are longitudinal, so researchers can study the same people over time, seeing what changes and what stays the same. With each wave, the data become more valuable for longitudinal analysis.

One of the challenges of running a panel survey, though, is that participants leave. They may simply stop responding, or move house and not be contactable. Others will move out of the scope of the survey – that is, a change in their circumstances might make them no longer eligible – and some, inevitably, die.

This is known as panel attrition, and it’s important because it reduces the size of the survey’s sample, which could make it less representative of the population as a whole. And, if some groups – ethnic minorities, perhaps, or people on low incomes – are more likely to drop out than others, that could mean they’re under-represented. If we can understand who ‘leaves’ and who ‘stays’, it might be possible to target the people most likely to leave to encourage them to keep taking part.

Existing research

The existing research on attrition suggests that the people least likely to respond to surveys are young, male, single, or students, and tend to live in urban areas, in rented properties or flats. Those more likely to take part tend to be older, married, more highly educated, homeowners, and on higher incomes.

There are other factors, too. Some studies show that people with children are more likely to respond, but in others, balancing work and family life makes them less available for an interview. By contrast, retired people are more likely to have time – and to be at home when the interviewer calls.

More vulnerable groups – those with lower levels of education or income, the unemployed, and people in poor health – may be harder to find, and less likely to respond. They may also find the survey’s questions about these aspects of their lives intrusive.

A different approach

The existing research tends to think of someone not responding to one wave of a survey as attrition – but patterns are more complicated than that. Rather than dropping out completely, people may come and go over a number of waves.

Also, we can’t be sure that someone who’s left has done so because of their interactions with the survey. It may be down to something else. The more we know, the more we can see whether some groups are likely to drop out completely, or to respond intermittently – and who might be easier to encourage back.

Using the data

To find out more – and specifically about Understanding Society – I looked at 9,912 people who were in the British Household Panel Survey when it started in 1991, and transferred to Understanding Society soon after it launched in 2009, looking at a total of 26 waves of data. I used a technique called latent class analysis, which helps to group data – and, in this case, to identify and categorise patterns in the way people respond to the survey.
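To give a flavour of how latent class analysis groups response patterns, here is a minimal sketch in Python: it fits a latent class model to binary ‘responded / didn’t respond’ indicators using the expectation-maximisation algorithm. This is an illustrative toy under simplifying assumptions (the function name and data are invented for the example) – the actual model in the paper is richer, covering 26 waves and respondent characteristics.

```python
import numpy as np

def lca_em(X, n_classes, n_iter=200, seed=0):
    """Fit a latent class model to binary response indicators with EM.

    X is an (n_people, n_waves) 0/1 array, where 1 means the person
    responded at that wave. Returns the class shares, per-class response
    probabilities, and each person's posterior class probabilities.
    """
    rng = np.random.default_rng(seed)
    n_people, n_waves = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)               # class shares
    theta = rng.uniform(0.25, 0.75, (n_classes, n_waves))  # P(respond | class, wave)
    for _ in range(n_iter):
        # E-step: posterior class probabilities given each response pattern
        log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)    # numerical stability
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: re-estimate class shares and response probabilities
        pi = np.clip(post.mean(axis=0), 1e-6, None)
        pi /= pi.sum()
        theta = (post.T @ X) / post.sum(axis=0)[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)             # avoid log(0)
    return pi, theta, post
```

With data like ours, each row would be one panel member’s attendance record across waves; the fitted classes then correspond to patterns such as ‘loyal’ (high response probability at every wave) or ‘attrition by wave k’ (probabilities that fall to zero partway through).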

Because Understanding Society asks such a range of questions, I was able to look at factors such as age, gender, ethnicity, education, employment, self-rated health, and whether respondents had a partner or not. I could also consider household characteristics, including net income, number of children, what kind of housing they lived in, and whether they rented or owned. I also looked at population-wide death rates to estimate how many people who had stopped responding to the survey were likely to have died.
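The mortality adjustment works, in essence, by applying age-specific death rates to people who stopped responding. A toy sketch of that calculation – with a hypothetical rate function, not the actual population life tables the study draws on – looks like this:

```python
# Toy sketch of a mortality adjustment: given each non-respondent's age
# when they stopped responding, apply annual age-specific death rates to
# estimate how many are likely to have died since. The rate function is
# hypothetical, not the official life tables used in the study.
def expected_deaths(ages, annual_rate, n_years):
    total = 0.0
    for age in ages:
        p_alive = 1.0
        for t in range(n_years):
            p_alive *= 1.0 - annual_rate(age + t)  # survive one more year
        total += 1.0 - p_alive  # probability this person died in the window
    return total

# Example: 100 people aged 60 at drop-out, a flat 1% annual death rate,
# followed for 10 years.
print(expected_deaths([60] * 100, lambda age: 0.01, 10))
```

Subtracting the expected number of deaths from the count of non-respondents separates people who could not continue from people who chose not to.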

Who leaves – and who stays?

I identified seven classes of respondents:

  1. ‘Loyal’ – 34% of the sample (3,366 people)
  2. Attrition by Wave 8 – 21%
  3. Attrition by Wave 22 – 17%
  4. ‘Stayers’ – 12%
  5. Attrition by Wave 16 – 12%
  6. ‘Abruptly nudged’ – 2%
  7. ‘Gradually nudged’ – 1%

…and plotted their trajectories on this chart:

[Chart: trajectories of the groups who stay loyal and/or leave the study]

The ‘loyal’ group have taken part in every wave since the beginning of BHPS. The three ‘attrition’ groups stopped taking part, but at different rates over time. The ‘stayers’ were very loyal throughout the BHPS, but their numbers started to fall after the change to Understanding Society.

In the two smallest groups – the ‘nudged’ ones – we see rates fluctuate over time, dropping off and then coming back up. Class 6 begins to decline after Wave 8 of BHPS, but picks back up significantly when invited to join Understanding Society. They decline again, but, unlike the attrition groups, their rate doesn’t drop to zero – many keep responding. Class 7 declines early on in the lifetime of BHPS, but something encourages them to re-engage, especially after BHPS Wave 8.

Make-up of groups

I’ve created a table to show some of the details of how each class is made up. As you can see, classes 1-6 are 50-59% female, but 7 is just 42% female, meaning that the ‘gradually nudged’ class has more men than women. Also, numbers of ethnic minorities in classes 1-6 are mostly low, at 2-6%, but class 7 is 12% ethnic minority respondents.

[Table: percentage of male, female, and ethnic minority respondents in each class, and mean age]

The classes who leave the survey completely – 2, 3 and 5 – add up to exactly half of the sample, but follow different patterns. It’s interesting, too, that classes 3 and 4 both stick with the survey until the changeover to Understanding Society – while for class 6, that’s their cue to begin responding again.

Conclusions

Overall, I found that loyal respondents tended to be older, especially pensioners, more highly educated, and from smaller households. They had also moved house less often – and all of these findings are consistent with current attrition research.

The classes who have remained relatively loyal, but not responded every time (‘stayers’ and the ‘nudged’ classes) have more in common with the ‘loyal’ than they do with the other classes. The nudged classes represent atypical patterns of response not seen in previous research. We need to take account of these classes and their characteristics to avoid bias.

I found that ethnic minorities were more likely to be in the attrition classes than the loyal one. They were also unlikely to be in the ‘abruptly nudged’ class – the one which dropped off, but re-engaged after the change to Understanding Society. However, they were more likely to be ‘gradually nudged’ than loyal, suggesting a growing interest in the survey as it goes on. This suggests that people from ethnic minorities may respond to encouragement to take part, but that the process may take time.

In the future, research could use these classes to test different interventions – incentives and reminders – to see if they encourage more participants to respond to the survey, and if so, which is best. Future research could also look at whether response patterns are different for people who are difficult to contact and those who don’t want to respond. All of this could help us to understand more about attrition and increase participation.

Read the original paper

Authors

Nicole James

Nicole James is Survey Data Officer at Understanding Society, and a PhD Student in Survey Methodology at the Institute for Social and Economic Research at the University of Essex
