

Getting informed consent to link data

New research looks into what people understand about data linking consent and how they decide whether to give it


Linking different datasets boosts social science and policy research, because gaps in one can be filled by another, giving researchers a more complete picture. As a result, survey methodologists want to maximise consent, but also want to make sure people’s decisions on consent are informed.

Experiments

Our project used Understanding Society’s Innovation Panel of 2,900 people, plus three surveys of between 2,000 and 5,700 respondents from the online PopulusLive panel, to examine how people think about linkage and to test ways of increasing levels of consent.

We carried out a range of experiments, including:

  • different kinds of wording on consent questions – does it help to make the wording easier to read?
  • asking about consent either at the beginning of the questionnaire or the end
  • offering more information about consent and linkage
  • emphasising the trustworthiness of the organisation we wanted to link with – the National Health Service (NHS) or HM Revenue & Customs (HMRC), for example
  • asking if people are willing for their data to be linked to several other datasets (multiple consents)

Where we had permission, we also listened to recordings of face-to-face interviews (which took place in 2018) to see how interviewers and respondents interact when asking and answering questions about consent for data linkage.

People’s understanding

We found that people with a better understanding of what they’re being asked are more likely to say yes to having their data linked. However, increasing their understanding by making the consent question easier to read did not make them more likely to consent. If the wording was complicated, fewer people agreed to data linkage when the question came at the end of the questionnaire than when it came at the beginning. The position of the question, however, didn’t affect people’s understanding.

There are some potential explanations for this, one of which is cognitive fatigue – people being bored and/or tired at the end of an interview. But if this were the case, placing the question at the end would reduce consent regardless of how the question was worded, and it doesn’t. Could it be ‘giving fatigue’ – people feeling they’ve already handed over enough information about themselves? Again, we would then expect the position of the question to have a consistent effect whether the wording was easy or hard, and it doesn’t.

That leaves us with ‘risk fatigue’ – people being less willing to accept the risk/uncertainty of consenting when they’re tired/bored at the end of a questionnaire. Why this happens and whether we can reduce the effect is a question for future research.

Individual characteristics

There was no social class or demographic group that was more likely to consent, although people who are more highly educated and/or already share their data (by using social media and apps, and buying or banking online, for example) were more likely to understand the request. Those who did consent tended to have positive attitudes to data sharing, trust in the organisations we asked about linking with, and knowledge of what they do.

Fast and slow decisions

We also found that people make decisions in different ways. Around 30-40% of people are reflective, considering the issue before they decide, but the rest make quicker decisions based on habit or gut feeling. A quicker decision is more likely online, and a reflective one more likely in a face-to-face interview.

The more reflective process is more likely to lead to understanding and consent, so we might expect to find that providing more information would increase levels of consent – but we did not find this to be the case. This may mean that, if people decide quickly, they are unlikely to incorporate the extra information into the process, so it might be more effective to try another way to shift them towards more reflection. This, again, is an avenue for further investigation.

Face-to-face v. online

As with previous research in this area, we found that there was more consent in face-to-face interviews – and easy or hard wording made no difference here. Web respondents were more likely to be concerned about security, to have less understanding, and to make quicker, habit-based decisions.

However, we also found that in face-to-face interviews, respondents rarely ask for more information about data linkage, and interviewers rarely give it – so this is not what’s driving consent. The most likely explanation is that people find it more difficult to say no when there’s someone in the room with them.

Multiple consent requests

We tested various ways of asking for consent to link several different datasets in the same survey, including:

  • asking about each dataset individually, on a separate page of the online survey
  • asking about each one individually, but all listed on one page
  • listing the datasets, but asking one question – yes or no to linking to all of them

We also varied the order of the different datasets – with either the NHS at the top of the list, or HMRC first.

Our overall finding here was that the format of multiple consent requests had little effect on average consent rates. The order did matter: if the NHS was at the top of the list, consent levels were higher than if HMRC was first. Interestingly, asking one question about all five datasets led to a higher ‘yes to all’ rate than asking about them separately – but it also led to a higher rate of ‘no to all’.

Our conclusions

We’ve learnt a number of things from this research – perhaps most importantly that easy-to-read consent questions can increase informed consent. Emphasising the trustworthiness of the organisations involved in the linkage can, too – and when asking for multiple consents, it helps to start the list with an organisation for which consent rates are typically higher.

We also know that survey respondents make their decisions in different ways. Acknowledging this and tailoring the information provided accordingly may help increase informed consent. Future research in this area could build on related work on consent forms for clinical trials.

Looking ahead, if we can shift respondents towards reflective decision making, for example by telling them before we ask for consent that we are also going to ask if they understood the request, this might increase informed consent.

In web surveys, we could highlight the security of data transfers and the fact that neither the survey nor the linked data are stored anywhere on the web. Emphasising the social desirability of giving consent might increase consent in web surveys as well.

This research was funded by the Nuffield Foundation with co-funding from the ESRC.


Authors

Annette Jäckle

Annette Jäckle is Professor of Survey Methodology at the Institute for Social and Economic Research at the University of Essex

Jonathan Burton

Jonathan Burton is Associate Director, Surveys at the University of Essex

Mick Couper

Mick Couper is Research Professor at the Survey Research Centre, University of Michigan

Thomas Crossley

Thomas Crossley is Professor of Economics at the European University Institute and University of Essex

Sandra Walzenbach

Sandra Walzenbach is a Research Assistant at the University of Konstanz

