Each year the Innovation Panel is used to test survey methods and content. The results of these experiments help to develop the main Understanding Society study and provide new insights for survey practitioners around the world.
The Innovation Panel follows the same protocol and questionnaire content as the main Understanding Society Study. Over the years that the panel has been active, additional experiments and tests have covered a variety of topics, including how participants respond to incentives, question wording, how participants give consent for data linkage, and what additional information people are willing to share with the Study. Some experiments run in just one wave of the Study, while others take place over several waves.
You can read about the results of all Innovation Panel experiments in the series of Working Papers produced by the Innovation Panel research team. Below you can find examples of some of the experiments carried out in recent waves of the Study.
Event triggered data collection
In each annual interview, Understanding Society collects data about life events, such as new partnerships, moving house, changing job, having a new baby and the onset of health problems. Asking for this information only once a year limits the amount and type of information the Study can collect, as participants might not remember exact details, and feelings, expectations and subjective wellbeing can change over time. Collecting information about life events as they happen can be more reliable and give more detail about these life changes. But collecting information more regularly during the year can be a burden to participants and may affect whether they decide to take part in their annual Understanding Society interview. As a longitudinal study, Understanding Society is careful to minimise respondent burden, to ensure that we maintain high response rates.
This experiment looked at whether participants would be prepared to send the Study information about their life events on a monthly basis and whether taking part in monthly event triggered data collection had an impact on participation in the main annual survey. When the research team looked at attrition rates, they found no difference between the randomised group who were invited to the event triggered data collection and the control group who were not invited, suggesting that asking for more regular information does not impact on participation in the main survey. The research team also asked respondents for permission to send them occasional survey questions by text message. This would be a simple way of sending a monthly question to ask whether respondents had experienced one of the life events of interest. Overall, 67% of respondents gave permission to be sent survey questions by text message.
You can read the full details of this experiment in this Working Paper, and you can read about the first phase of developing and testing Event Triggered Data Collection in this Working Paper.
Mapping questionnaire instruments designed to measure health-related quality of life
The EQ-5D is a short questionnaire instrument designed to measure health-related quality of life. It is used to measure health benefits in many studies, and data from the EQ-5D is used to provide evidence for the National Institute for Health and Care Excellence in England and other policymaking bodies. The original instrument (EQ-5D-3L) has been redesigned to include more detail (EQ-5D-5L), but much of the existing cost-effectiveness evidence is based on the older, shorter, version. Statistical mapping from one version to the other is used, but little is known about the effects of including both instruments in the same survey. This experiment explored whether the inclusion of both versions of the EQ-5D at different stages of the interview gives a reliable picture of the relationship between health measures from the two instruments.
When the results were analysed, this experiment found that the sequencing and timing of the instruments within an interview may affect responses, so there is a strong case for survey designs that randomise the ordering of the two instruments. The question ordering also affects the estimated mapping models.
Read the research paper: Mapping between EQ-5D-3L and EQ-5D-5L: A survey experiment on the validity of multi-instrument data, Mónica Hernández-Alava and Stephen Pudney
Consent for data linkage
This experiment was designed to provide insights into how survey participants decide whether to consent to linking their survey responses to administrative data records. It also aimed to help researchers understand why participants are less likely to consent to linkage if they answer the request online, rather than in a face-to-face interview. The experiment made use of the mixed-mode design of the Innovation Panel, where part of the sample are first allocated a face-to-face interview, with the rest of the sample being allocated a web survey. At Wave 11 all participants were asked for consent to link their survey data to HMRC tax records. Participants were randomly allocated either a ‘standard’ version of the consent question, which had previously been used in the main Understanding Society sample, or an ‘easy’ version of the question, which used shorter sentences and words, avoided passive constructions and was broken up using bullet points, making it easier to read.
The consent request was followed up by a series of questions asking about how the participant made their decision, what they had understood about the consent request, how much they trusted the organisations involved and to what extent they perceived the request as sensitive.
You can see the results of this experiment in this presentation from Associate Director for Innovations, Professor Annette Jäckle:
This experiment was part of a broader project on understanding consent to data linkage. You can read about this project on the ISER website.
Fieldwork compression
Fieldwork for the main Understanding Society survey currently takes over two years per wave of the Study. The sample is issued as 24 monthly batches and each sample month is in the field for five and a half months. To reduce the time between data collection and the release of the data for research, this experiment considered whether the fieldwork period could be compressed by issuing the sample over 12 months, instead of 24. One issue with compressing fieldwork is that half the sample would miss the rotating content from a particular wave. To avoid this loss of content, this experiment used Wave 13 of the Innovation Panel to look at different ways to collect an additional set of modules. One group in the sample were given a longer, continuous interview, having received an advance letter noting that their interview would be longer that year. This group was also given an increased, unconditional incentive. A second group were given a ‘break point’ in their interview, where they had the option to answer the additional questions once they had reached the end of their normal interview. Completing the additional questions earned participants a further incentive. A final group received their normal Innovation Panel questionnaire and no additional incentive.
This experiment found that informing participants of the longer interview in advance and giving them a larger incentive positively affected the response rate, compared to groups who were not given this information. In the group with the break in the survey, around a fifth of those who responded to the survey opted out of the additional content. The overall take-up of the additional content was highest for the group who were told about the extra questions in advance. The results of this experiment suggest that if Understanding Society were to compress fieldwork for the main survey and wanted to add rotating content to the standard interview, participants would need to know this before they start their survey.
You can read about the full experiment in this Working Paper.
Mental health question comparisons
The main survey questions about the diagnosis of health conditions remained largely unchanged for the first nine waves of the survey. At Wave 10 and Wave 14 of the main survey, the wording of the question about whether a doctor had ever diagnosed a mental health condition changed. A review of the data shows some evidence that the changes in wording have resulted in changes in measured prevalence. This experiment in Wave 16 compares the three versions of the question: the pre-Wave 10 wording, the Wave 10 wording, and the Wave 14 wording.