Innovation Panel Competition

Have you got a research idea and need certain data to carry out your work? Are you a researcher wanting to develop an experimental or methodological element? The Innovation Panel could help you.

Why enter?

It’s an opportunity to field your study, with support from the Understanding Society Innovation Panel team.

Access rich data

The Innovation Panel has rich household data you wouldn’t get through other research channels.

Test your ideas

The Innovation Panel is a testbed for new and existing projects, without the associated cost.

Get valuable support

Understanding Society will provide you with a high level of support throughout your research.

There are no costs to successful applicants, unless the proposal includes non-standard survey elements (e.g., task-related incentive pay-outs or additional mailings).

How to apply

The Innovation Panel Competition is open.

The Competition runs each year, so if your project is not ready now, sign up to our mailing list to find out when future rounds of the Competition open.

Researchers can submit two types of proposals:

Experiments and methods tests

Proposals for survey methods experiments and evaluations, and other studies that use experimental methods. Methodological studies could relate to the design of survey instruments (e.g. question wording, item order, etc.) or to survey design features (e.g. procedures intended to reduce non-response or to improve fieldwork efficiency).

Experiments and methods tests application (Microsoft Word)

New survey questions

Proposals for new survey content that will be used to address innovative research questions. The proposed survey questions can be on any relevant topic, and are limited to one minute of questionnaire time.

New survey questions application (Microsoft Word)

Template for specifying questions

Proposals must be accompanied by a draft specification of all questions being proposed. If you would like, you can use this spreadsheet-based template to help specify them and calculate question timings.

Spreadsheet for specifying questions (Microsoft Excel)

In preparing proposals, applicants are encouraged to consult with Jim Vine (jim.vine@essex.ac.uk) about how their study could be designed to meet the Innovation Panel criteria.

Completed proposal forms must be accompanied by:

  • the draft specification of questions or other required text, along with details of what each question is intended to measure, whether using the spreadsheet template or not,
  • a CV (maximum 2 pages) for each named proposer, and
  • a list of any references cited in the Case for Support.

Entry criteria

For all studies

Proposals of all types must meet the following mandatory criteria:

  • The research questions must be within the general remit of Understanding Society.
  • The proposal must include a draft specification of all proposed questions and other required text. (Proposers may use the spreadsheet template to provide this specification, if they would like.)
  • Descriptions must be provided of the constructs or concepts that questions are intended to measure (or other details of question objectives), along with descriptions of how the questions measure them. The spreadsheet template includes a column for this purpose.
  • The expected sample size must be sufficient to meaningfully investigate the proposed research questions. See the FAQs below for information on the estimated number of respondents in this wave.
  • The design must be within the resources of the Innovation Panel, in terms of questionnaire time, development time, and system costs. The Innovation Panel questionnaire is around 40 minutes on average, including all the standard Understanding Society questions that need to be asked. See the FAQs below on how to calculate expected question timings.
  • The proposal must not pose a threat to the future of the panel.
  • The proposed questions must be free of copyrights, licence fees or any other intellectual property concerns that would hinder Understanding Society from using the questions.
  • The proposed questions must be justifiable to an ethics committee.
  • The proposers must plan to analyse the resultant data and seek publication of the findings.

For this wave’s competition (IP20), there is an additional mandatory criterion:

  • The proposal must not contain any additional data collection from panel members outside of the annual questionnaire. See the FAQs for further information.

For experiments and methods tests

Proposals must meet the following additional mandatory criterion:

  • Proposals must be for projects contributing to the survey methodology discipline, such as through the design of survey instruments (e.g. question wording, item order, etc.) or through survey design features (e.g. procedures intended to reduce non-response or to improve fieldwork efficiency); or they must use experimental methods as a central part of their design.

For new survey questions

Proposals related to new survey content may be on any topic within the broad remit of Understanding Society, but must meet the following additional mandatory criterion:

  • The new questions must be limited to about one minute of expected questionnaire time. See the FAQs below on how to calculate expected question timings.

Additional judging criteria

Proposals that meet the mandatory criteria will be judged on their scientific merit and value for money based on the following:

  • The proposal has significant scientific value, with innovative, well-formulated research questions that clearly articulate the topic of the study, and communicate its importance. The proposal demonstrates its potential to contribute to scientific knowledge, including by addressing research questions on topics for which there are important gaps in existing knowledge or possibly (for methodological studies) by proposing innovative methods with promising potential.
  • The research design is well thought through and appropriate for investigating the research questions, including the specification of the study design and implementation, adequacy of sample sizes, and strength of the analysis plan.
  • The proposed survey questions are feasible: respondents should be able to understand them, know or be able to recall the answers to them, and generally be willing to answer them. The constructs / concepts / objectives of the questions should be well-defined, and the questions should be likely to measure what they are intended to measure.
  • The proposal demonstrates strong potential for the findings to be published in the scholarly literature.
  • The proposal represents good value: its potential contribution justifies the amount of questionnaire time (or other resources) it will require.

Preference will be given to proposals that exploit the longitudinal or household nature of the Innovation Panel data.

In addition to assessing the quality of individual proposals according to the criteria above, we will aim for a balance of topics and methods across proposals. This is to ensure that the Innovation Panel interview remains interesting and reasonable for respondents.

Decisions and next steps

Proposers will receive preliminary notification of the review panel’s decisions in March 2026. Final acceptance will be conditional on fully establishing the feasibility of the proposed study.

Successful proposers will be expected to work with the Innovation Panel survey team to develop and finalise the details of the implementation of the study. This will include refining the questions to be asked, checking the questionnaire specification or other relevant documents, and testing the computerised questionnaire script. The Understanding Society team will provide guidance and support successful proposers in finalising the question specification, and where appropriate will conduct pre-testing of draft questions.

Once data are available, proposers will be expected to analyse and report the main outcome(s) in a summary form appropriate for inclusion in a Working Paper.

Successful proposers will be asked to sign a memorandum of agreement indicating their willingness to carry out these development and analysis activities.

Proposers are expected to publish their research based on the resultant data. Proposers will be given early access to the data, as soon as possible and in advance of general release via the UK Data Service.

Background to the Innovation Panel

Understanding Society: the UK Household Longitudinal Study is a major research study designed to enhance understanding of life in the UK and how it is changing. The Study, funded primarily by the UK Economic and Social Research Council, takes a sample of 40,000 households containing around 100,000 individuals and attempts to interview all household members annually. A large boost sample of ethnic minority persons is also included. The first wave began in January 2009 with interviewing spread over 24 months.

The Innovation Panel is an important and integral part of the design of Understanding Society. It consists of the original sample of around 2,500 persons clustered within households, plus refreshment samples of around 700 persons added in 2011, 2014, 2017, 2018, 2019, 2021, and 2025. Interviews have been attempted with all adult sample members at annual intervals starting with wave 1 in 2008. Wave 18 went into the field in spring 2025, and featured a larger-than-usual refreshment. Subsequent waves of interviewing will continue to be carried out at annual intervals (subject to funding), replicating the household panel survey design employed by Understanding Society, whereby attempts are made to re-interview all sample members regardless of changes in household composition or geographical location, as well as any other (non-sample) members of the current households of sample members.

The Innovation Panel uses mixed mode data collection, whereby some sample members complete the interview online, some with a face-to-face interviewer, and a few with a telephone interviewer. The survey instruments contain the same questions in each mode, with only minimal differences. The survey also collects interviewer observations (where the interview is conducted face-to-face), call records (where the interview is conducted face-to-face or by telephone) and time stamp data for questionnaire sections.

To date, a range of experiments and tests have been incorporated into the Innovation Panel. These are documented in the Innovation Panel User Guide and in a series of Understanding Society Working Papers summarising findings from experiments and other studies.

Timetable

The Innovation Panel Competition for wave 20 is taking place between now and summer 2026. The key milestones are summarised below.

Phase | Action and events | Date
Deadline for entries | Ensure proposal meets mandatory criteria | Friday, 12 December 2025
Notification of decisions | Proposers notified by email | March 2026
Development work | Work with implementation team to finalise the details of your study | March-June 2026
Fieldwork | Interview IP households | April-October 2027
Early data available to proposers | Report on key outcome(s) for a Working Paper | Early 2028
Data released through UK Data Service | Publish your findings | Summer 2028

Tips for experimental designs

Randomised allocations to treatments

The randomised allocations can be created at the household or the individual level. We tend to favour household level randomisation, whereby all members of a household receive the same treatment. Since the sample households are all known in advance, we can calculate the random allocations with stratification in advance of fieldwork, reducing the likelihood of accidental imbalance across characteristics.

In contrast, not all individuals are known in advance of fieldwork, because new members of sample households are eligible to complete the survey. Consequently, we cannot pre-allocate individual randomisations and have to randomise within the questionnaire script. That is, the randomisation variable is created while the respondent completes the survey. This prevents stratification; allocations made within the questionnaire script are completely at random.
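The household-level pre-allocation with stratification described above can be sketched as follows. This is a minimal illustration only, not the actual allocation procedure: the field names (`hh_id`, `region`), the round-robin balancing scheme, and the two-arm design are all assumptions for the example.

```python
import random

def allocate_households(households, strata_key, treatments=("A", "B"), seed=12345):
    """Pre-allocate treatments at the household level, stratified so that
    allocations are balanced within each stratum (e.g. region).
    Illustrative sketch only; `households` is a list of dicts and
    `strata_key` names the stratification field."""
    rng = random.Random(seed)  # fixed seed: the allocation is reproducible
    by_stratum = {}
    for hh in households:
        by_stratum.setdefault(hh[strata_key], []).append(hh)
    allocation = {}
    for stratum, members in by_stratum.items():
        rng.shuffle(members)  # random order within the stratum
        # Deal treatments round-robin for near-exact balance per stratum.
        for i, hh in enumerate(members):
            allocation[hh["hh_id"]] = treatments[i % len(treatments)]
    return allocation

sample = [
    {"hh_id": 1, "region": "North"}, {"hh_id": 2, "region": "North"},
    {"hh_id": 3, "region": "South"}, {"hh_id": 4, "region": "South"},
]
alloc = allocate_households(sample, "region")
# Each region ends up with one household in each treatment arm.
```

Because the whole sample is known before fieldwork, this kind of pre-allocation can be computed once and attached to the sample file, which is what makes stratification possible; within-script individual randomisation cannot be balanced this way.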

If you have a strong justification for requiring allocation at the individual level, please provide it within your proposal. Similarly, if you require the randomisation to be responsive to the answer to an earlier question, please state that.

Question order experiments for duplicate information

In the past, we have sometimes carried out experiments where two questions aiming to gather equivalent information were asked at different points in the questionnaire, using a randomised question-order design: some respondents were asked Version A first and Version B later, while others received Version B first and Version A later.

In principle, these permit both between-subject and within-subject analyses of the resultant data. In practice, we have found a sizeable proportion of respondents notice the second question is asking for essentially the same information again, and many of them view it as a problem. Consequently, both from a respondent-experience perspective, and because we are sceptical of the data quality of the second question each respondent answered, we will not normally accept designs where variant questions seeking equivalent information are asked twice. Proposers should normally consider between-subjects designs, where each respondent is asked only one variant of the question.

Question order experiments are normally fine in cases where the questions being asked are seeking different information such that they would not be viewed as duplicates by respondents.

FAQs

What studies have previously been carried out?
The studies implemented in the Innovation Panel are summarised each year in an Understanding Society Working Paper and in the Innovation Panel User Guide.

I’m not sure whether I meet the criteria. Who can I talk to?
Please get in touch with Jim Vine (jim.vine@essex.ac.uk) to discuss your ideas in the first instance.

How much questionnaire time can my proposal take?
Proposals for new survey questions have a limit of one minute of estimated questionnaire time.

Proposals for experiments and methods tests do not have a hard limit but do need to fit within the overall time available. All else being equal, a shorter proposal will be favoured over an equally strong longer proposal, as it will leave more space for other proposals.

As IP20 is a biomarker collection wave, we expect there to be less questionnaire time available for allocation through the competition than at other waves. Proposers may wish to take particular care over ensuring their proposed questions represent good value for the amount of questionnaire time they would take.

In all cases, the amount of estimated questionnaire time should factor in the proportion of respondents who will be asked a given question, which can be less than 100% if certain questions are not asked of all respondents. See below for details.

How do I calculate expected question timings?
We provide a spreadsheet template that includes the rules of thumb listed below and helps proposers to specify questions for their proposals. Please feel free to use it if you find it helpful.

  • yes-no questions: 10 seconds
  • ‘select one’ single-choice questions: 15 seconds (allow longer for long lists of response options)
  • ‘grid-like’ questions, selecting one option per row from consistent choices: 10 seconds per item
  • ‘select all that apply’ multi-choice questions with a moderate number of options: 18 seconds (allow longer for long lists of response options)
  • open numeric questions: 16 seconds
  • open text questions: 71 seconds

The ‘yes-no’ estimate of 10 seconds only applies to questions with only those two options or very similar alternatives. If a question has variants on this (e.g., 1. “Yes”, 2. “No because of ABC”, 3. “No because of XYZ”) then the ‘select one’ estimate should be used.

Expected question timing calculations should take into account the proportion of the sample asked the question and the timing per question. For example, for a “single choice” question asked of women only, the expected question time is the proportion of female respondents (0.55) × the timing of a single choice question (15 seconds) = 8.25 seconds.
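The calculation above can be sketched in a few lines of Python. The timings table below simply restates the rules of thumb listed earlier; the category names and the helper function are illustrative, not part of the official template.

```python
# Rules-of-thumb per-question timings in seconds, as listed above.
TIMINGS = {
    "yes_no": 10,
    "select_one": 15,
    "grid_item": 10,      # per row of a grid-like question
    "select_all": 18,
    "open_numeric": 16,
    "open_text": 71,
}

def expected_time(question_type: str, proportion_asked: float = 1.0, items: int = 1) -> float:
    """Expected questionnaire time for one question:
    per-question timing x number of grid rows x proportion of respondents asked."""
    return TIMINGS[question_type] * items * proportion_asked

# The worked example above: a single-choice question asked of women only,
# taken here to be 55% of respondents.
example = expected_time("select_one", proportion_asked=0.55)  # 15 s x 0.55 = 8.25 s

# A proposal's total is the sum over its questions, e.g. one yes-no question
# asked of everyone plus a 3-row grid asked of the 40% who answered yes:
total = expected_time("yes_no") + expected_time("grid_item", proportion_asked=0.4, items=3)
# 10 + (10 x 3 x 0.4) = 22 seconds
```

Summing expected times per question in this way keeps a proposal honest about routing: a long module asked of a small subgroup can cost far less questionnaire time than a short module asked of everyone.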

Do I need to include Understanding Society questions in my timing estimate?

If your study needs data from existing Understanding Society questions, you should check whether it is a question asked at each wave.

For each variable you want to use, please use the variable search tool to check whether it is collected at each wave.

If you plan to use data from existing Understanding Society questions that are asked as standard each wave, these do not add to your timing estimates. (You should still detail them in your proposal in case they might be affected by any other proposals.)

Understanding Society questions that would not otherwise be carried in this wave should be included in your timings estimates.

How should my questions handle ‘don’t know’ / ‘prefer not to say’ responses?
The standard approach in Understanding Society is that questions are initially displayed with only the substantive responses, and if the respondent tries to go to the next question without providing an answer, the question will be re-displayed with the non-substantive options added.

To ensure consistency of respondents’ experience, proposals should normally use the standard approach. We will automatically implement this on any successful proposals by default.

If, exceptionally, your research requires a different handling of non-substantive responses, please specify this within your proposal and provide a justification for why it needs to deviate from the standard approach.

What is the sample size of the Innovation Panel?
We estimate the sample size in wave 20 will be about 4,000 individuals. However, proposers should note that patterns of attrition / non-response have been quite variable in recent waves. Our estimates for wave 20 include panel members whose households joined in the large refreshment at wave 18 but whose first individual interview will be in wave 19, which has not yet gone into the field. Our estimates also attempt to take into account possible attrition at both wave 19 and wave 20. Consequently, there is a higher-than-normal degree of uncertainty about the number of completed adult interviews at wave 20.

Where can I find out about the content of the Innovation Panel?
Please see the Innovation Panel questionnaires and the User Guide.

Who judges the competition?
All eligible proposals are reviewed for feasibility and considered by a panel of reviewers, mostly drawn from Understanding Society’s Co-Investigators, Executive Team, topic champions and oversight board members.

Can I submit more than one proposal?
Yes – there is no set limit to the number or proportion of proposals that will be accepted. We will accept as many good proposals as can reasonably be carried out in conjunction with one another.

Can I ask for data to be collected in more than one wave?
Proposals are normally accepted for a single wave only. You can reapply in a later year to re-include the same or similar questions in a subsequent wave, but such a re-application would be considered on its own merit in the new competition.

If, exceptionally, your proposal has a strong requirement for inclusion of questions over multiple waves, please provide the justification within your proposal.

When will I find out if my idea has been accepted?
We expect to notify proposers in March 2026. Final acceptance is conditional on fully establishing the feasibility of the proposed study with the fieldwork agency.

Do I have to pay for anything?
Standard data collection costs will be borne by Understanding Society and there will be no cost to successful proposers. Costs for non-standard elements of data collection (e.g. additional mailings or task-related incentive payouts) will be borne by proposers.

When will the data be made available?
The data are made available to competition proposers as soon as possible and in advance of general release via the UK Data Service.

What happens next if my proposal is accepted?
Successful proposers work with the Innovation Panel survey team to develop and finalise the details of the implementation of the proposal. This will include addressing comments received from the assessment panel and addressing any clashes with existing (similar) questions already in the survey. Once the data have been collected, researchers are expected to analyse and report on the main outcome(s) in a summary form appropriate for inclusion in a Working Paper and to publish their findings based on the resultant data.

Can I propose a study that will collect data from participants outside of the annual questionnaire?

Not at IP20, but probably in future competitions.

Several previous studies on the Innovation Panel have included the collection of additional data outside of the annual questionnaire. These have included things like time-use diaries, app studies, and data donation projects.

Because IP20 is a biomarker collection wave, respondents are already being asked to complete an additional task outside of the annual questionnaire. In addition, the timelines for ethical approval of the biomarker data mean it would not be feasible to complete the stages required for additional data collection tasks. Consequently, proposals to collect additional data from participants are out of scope for IP20.

This reflects the specific practical requirements of IP20 and does not reflect a decision to exclude these studies indefinitely. Subject to future reviews of the competition, the ability to propose such studies is likely to return in future waves.

If you are not sure whether your study would count as requiring data collection outside of the annual questionnaire, please get in touch with Jim Vine (jim.vine@essex.ac.uk).
