5 tips for a successful researcher survey

By Andrea Chiarelli and Rob Johnson

Policymakers, university managers and academic publishers all have something in common: they have to work out what researchers want. Sometimes an educated guess is good enough, but in many cases knowing what your stakeholders, staff or authors really think will lead to better decisions.

Much of our work at Research Consulting depends on getting this information as quickly and efficiently as possible. We tend to take a mixed-methods approach, drawing on a combination of literature searches, data analytics, stakeholder interviews and our own knowledge to get to grips with issues. However, we often find that the easiest way to obtain feedback or gather information from a large number of people is to run a survey (if you are interested, our platform of choice is SurveyGizmo).

The problem is, designing a good survey is far from easy at the best of times – and designing a survey for researchers is a bit like cooking for Gordon Ramsay or singing for Simon Cowell. Fall even slightly short of the highest standards and you’ll soon know about it. Social scientists will scrutinise your approach to ethics, statisticians will critique your analysis, computer scientists will question your choice of software. Get these things right, however, and you can be confident that you’ll gain valuable insights into your target audience.

Since we have been doing this for a while now, we thought it worth sharing a few tips…

1. Speak the right language
Researchers are used to technical jargon and complicated concepts. However, when designing a survey, you need to provide them with simple (yet accurate) formulations of questions and answers. This means that your questions should be precisely worded, go straight to the point, and follow a logical order.

At the same time, you should be careful not to patronise them, or they may feel you are wasting their time and leave the survey unfinished. It is also important to use the right terminology, so that your questions and answers are tailored to your audience and feel familiar to them.

What to avoid: vague questions, complex jargon, asking more than one question at a time, using inconsistent options throughout the survey

2. Prioritise, prioritise, prioritise
A survey has to serve a specific purpose and, thus, you must learn how to prioritise. For every question, you should ask yourself “How would my analysis change if I didn’t know this?” If the answer is “Not much”, then you should drop the question. Remember: the longer the survey, the lower the response rate.

Generally speaking, closed-ended questions (e.g. multiple choice, Likert scale) result in easier analysis and higher response rates. Even if you feel that free text could yield better insights, always consider whether a closed-ended alternative exists. Often, free-text questions are non-specific and do nothing more than confuse the reader. Used judiciously, however, they can provide valuable information, allowing respondents to expand on the information provided by closed-ended questions.

In addition, we find it best to make most survey questions optional. Otherwise, error messages will simply frustrate respondents and make them inclined to give up altogether. If your researchers or authors care enough to complete the survey at all, you should trust them to answer the vast majority of the questions properly.

What to avoid: excessively detailed questions, too many free text questions, setting questions that aren’t key to your analysis as mandatory
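To illustrate the point about closed-ended questions being easier to analyse, here is a minimal sketch (with made-up Likert-scale responses): tallying fixed options is a one-liner, whereas free-text answers would need manual coding before any counting could happen.

```python
from collections import Counter

# Hypothetical responses to a closed-ended Likert-scale question
responses = [
    "Agree", "Strongly agree", "Neutral", "Agree",
    "Disagree", "Agree", "Strongly agree", "Neutral",
]

# A fixed set of options makes tallying and reporting trivial
tally = Counter(responses)
for option, count in tally.most_common():
    print(f"{option}: {count} ({count / len(responses):.0%})")
```

The same exercise with eight paragraphs of free text would require reading, interpreting and categorising each answer by hand before any numbers could be produced.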

3. Protect the survey data
Researchers are generally very aware of data protection and ethical considerations, so you should do your best to reassure them on these points. You should have a clear data protection and ethics policy, explaining exactly where the survey data will be stored, who will have access to it, and if/how it will be disseminated to the general public.

You should also provide clear statements on anonymisation and about your intention to seek respondents’ permission when quoting them. Ideally, you should follow a formal ethical approval process prior to dissemination of the survey. In our surveys, we usually provide a data protection statement and a link to a survey information sheet detailing all the information described in this section.

What to avoid: neglecting to provide a data protection statement or writing a sloppy one

4. Don’t overlook the value of testing
Before you disseminate your survey, always pre-test it! However much you think you have perfected it, a second, third or fourth pair of eyes will nearly always pick up something you have missed. You can ask your colleagues or even members of your target audience. This is an essential step of survey design, as it will allow you to spot glitches, issues, or problems with the interpretation of your questions.

What to avoid: rushing the survey into the dissemination stage, failing to listen to testers’ feedback

5. Send reminders and offer incentives
To maximise your response rate, it is always worth sending appropriately timed reminders, preferably excluding those who have already responded. After your first invitation to complete a survey, many potential respondents will simply forget about it. When you send reminders, you trigger new waves of responses (see the figure below).

Figure: researcher survey response activity – an initial wave of responses and a secondary wave induced by targeted reminders.

In addition, you can offer incentives to your respondents. Traditionally these might be vouchers or charitable donations, but never be afraid of thinking outside the box. For example, we have offered additional compute time on a supercomputing facility in one case, and paid trips to an academic conference in another. Sometimes it is enough just to appeal to researchers’ sense of community, particularly where the subject matter aligns closely with their professional or disciplinary interests.

Perhaps most importantly, though, let respondents know what will happen to the information they provide, and how they can access the final results. You’d be surprised how many are genuinely interested to see what comes of their contribution.

What to avoid: being afraid or embarrassed to send email reminders, picking generic incentives when better options are available, offering incentives when they aren’t necessary
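The reminder advice above can be sketched in a few lines (with hypothetical email addresses): before each reminder round, drop anyone who has already responded, so that only outstanding invitees are contacted again.

```python
# Hypothetical invitation and response lists for a survey reminder round
invited = {"ana@example.org", "ben@example.org", "cal@example.org", "dee@example.org"}
responded = {"ben@example.org", "dee@example.org"}

# Set difference leaves only those who have not yet completed the survey
reminder_list = sorted(invited - responded)
print(reminder_list)
```

Most survey platforms offer this filtering built in; the point is simply that reminders should target non-respondents, not the whole original list.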

