How to run a successful survey and raise response rates

The HEA runs three surveys: the Postgraduate Research Experience Survey (PRES), the Postgraduate Taught Experience Survey (PTES) and the UK Engagement Survey (UKES). Surveys Officer Ioannis Soilemetzidis here suggests some tips for making surveys in general more successful. 

A question that comes up again and again when discussing surveys is: how do you run a successful survey and raise response rates?

This simple yet encompassing question has no easy answer; success is influenced by a number of factors. Communication is one key aspect which often does not receive the attention it deserves. The golden rule to remember when thinking about communication is the Aristotelian "triptych" (well analysed by John Baldoni in 2012):

1. ‘Tell’ them what you will tell them.

2. Then ‘Tell’ them.

3. After that, ‘Tell’ them what you just told them.

To begin with, when working with surveys, one needs to bear in mind that ‘raising response rates’ does not mean aiming for 100%, or even anything remotely close to that figure. That is simply not realistic, as relevant research has also concluded. For example, Porter and Whitcomb (2005, p. 138), reviewing a number of surveys on a range of subjects, concluded that “in most populations a significant proportion of people never participate in a survey”.

A realistic target is a response rate that you think (based on the nature, scope and desired outputs of the survey) will give you a methodologically sound representation of the particular student population’s viewpoint. This ensures, as far as possible, that the results are robust and that you can base your institutional decision-making on them with a realistic degree of confidence.
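To make “a realistic target” more concrete, here is a rough, purely illustrative sketch (the cohort size, confidence level and margin of error below are assumptions for the sake of the example, not HEA guidance): the number of responses needed to estimate a proportion with a given margin of error can be approximated with the standard sample-size formula plus a finite population correction.

```python
import math

def required_responses(population: int, margin: float = 0.05,
                       z: float = 1.96, p: float = 0.5) -> int:
    """Estimate the responses needed to report a proportion at a given
    margin of error, using the standard sample-size formula with a
    finite population correction. p = 0.5 is the most conservative
    assumption about the underlying proportion."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

# Hypothetical cohort of 2,000 students, 95% confidence, +/-5% margin:
needed = required_responses(2000)
print(f"{needed} responses, i.e. a response rate of about {needed / 2000:.0%}")
```

For this hypothetical cohort the answer is roughly 323 responses, around a 16% response rate. Bear in mind that reaching such a number guards against sampling error but not against non-response bias; who responds matters as much as how many do.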

The length of the survey is also something that can influence completion rates and the quality of comments (if open-ended questions are included). Bear this factor in mind especially when adding institutional questions to a standard template (e.g. PRES or PTES): tempting as it might be to add every question whose answer could be useful to know, restraint is a must, and less is more!

Consider developing a plan in advance that covers the different aspects of your survey: for example, how you will deal with bounced emails, and how you will reach (and motivate) those who cannot be contacted via email, or who do not respond to it, by using an alternative mode of communication (Dillman et al. 2008). In other words, consider other means of promotion and communication, and perhaps alternative survey completion arrangements. Some might think that setting up a number of email reminders (the more the merrier) is job done, but nothing could be further from the truth.
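What might such a plan look like in practice? Purely as a sketch, and assuming a simple in-house setup (the days, modes and message names below are invented for illustration, not taken from Dillman et al.), a tailored-design-style contact schedule could be laid out like this:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    day: int        # days relative to the survey launch
    mode: str       # "email", "sms", "in-class", ...
    template: str   # message variant; vary the wording across reminders

# Illustrative contact plan: pre-notification, invitation, then reminders
# that change mode and wording, with a non-email fallback for bounces.
plan = [
    Contact(day=-3, mode="email",    template="pre_notification"),
    Contact(day=0,  mode="email",    template="invitation"),
    Contact(day=7,  mode="email",    template="reminder_benefits"),
    Contact(day=14, mode="sms",      template="reminder_deadline"),
    Contact(day=18, mode="in-class", template="final_call"),
]

for c in plan:
    print(f"day {c.day:+d}: send '{c.template}' via {c.mode}")
```

The point of writing the plan down, in whatever form, is that every contact has an agreed purpose, channel and wording before the survey launches, rather than being improvised mid-fieldwork.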

Porter and Whitcomb (2005, p. 128) reinforce that very simple message: “The issue here is simple: no matter what efforts are made to elicit cooperation from survey non-respondents in a follow-up survey, large numbers of these non-respondents will also refuse to participate in the follow-up survey.”

Another point to think about is the target audience: for example, one student survey found that postgraduate students’ response rates are higher than those of undergraduate students responding to the same survey (Sills and Song 2002). Other key issues are the use of appropriate promotional material, developing good relationships with key contacts and stakeholders who can help your effort, the language and tone of emails and other messaging, offering incentives, securing endorsements, and so forth.

Exploring the effects of key design features, such as wording, personalisation, incentives and timing, Sauermann and Roach (2013, p. 273) suggested that “personalization increases the odds of responding by as much as 48%, while lottery incentives with a high payoff and a low chance of winning increase the odds of responding by 30%. Furthermore, changing the wording of reminders over the survey life cycle increases the odds of a response by over 30%, while changes in contact timing (day of the week or hour of the day) did not have significant benefits.” At times it is about conflicting priorities and timing. After opening an email inviting them to complete a survey, students might think ‘I will deal with this later’ and keep moving it down their to-do list until they forget about it, assume it is too late to take part, or have their attention taken over by other priorities, until it really is too late to respond.
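A small caveat on the Sauermann and Roach figures quoted above: they report increases in the odds of responding, which is not the same as adding percentage points to the response rate. A quick worked example, assuming a purely hypothetical 20% baseline rate, shows the difference:

```python
def apply_odds_increase(baseline_rate: float, odds_increase: float) -> float:
    """Convert a relative increase in the odds of responding into a new
    response probability, where odds = p / (1 - p)."""
    odds = baseline_rate / (1 - baseline_rate)
    new_odds = odds * (1 + odds_increase)
    return new_odds / (1 + new_odds)

# Hypothetical 20% baseline; +48% odds from personalisation:
print(f"{apply_odds_increase(0.20, 0.48):.1%}")   # ~27.0%
```

So a “48% increase in the odds” lifts a 20% response rate to about 27%: a worthwhile gain, but smaller than the headline figure might suggest.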

Now, a simple question: if you receive, at the same time, an email from a senior manager outside your department and one from the Vice-Chancellor (both requesting a response), which of the two will you give more attention to and respond to first? Students will usually do the same!

So, perhaps, having the Vice-Chancellor, Pro-Vice-Chancellor (or equivalent) sign the initial survey notification email could make a significant difference, not only to how students react after receiving your survey invitation but also in securing cooperation, active support and participation from other internal stakeholders (e.g. tutors, heads of department, the students’ union, etc.).

Informing students that they will receive a survey, and that their input is vitally important and much appreciated, can also help. This can be done through a variety of channels: posters on campus, personal tutors, student reps, electronic notice boards, flyers, the students’ union magazine and website, social media, school- or course-based activities, etc. Also, as in other projects, one needs to be organised and take a flexible approach, but most importantly one will need other people’s help. So, knowing your key people (tutors, student reps, students’ union officers, key admin staff, IT support colleagues, etc.) and building good long-term professional relationships is key.

Do not try to build networks only just before you launch the survey, and when talking to people about what you do, do not focus on why it is important for you. Develop good professional relationships continuously, and let colleagues and students know how they will benefit by helping you. Let them know why the survey is important for them! Get people on board well before you need their help; think and plan in advance, because solid professional working relationships take time and a lot of effort.

Other factors that can affect response rates include the discipline of study, the characteristics of the higher education institution, the “makeup of the student body” and the availability of resources (e.g. the number of computers per student), all of which can have “a strong positive effect for web survey response rates” (Porter and Umbach 2006).

So, in the end, there are no easy solutions and no one-size-fits-all recipe: no quick fixes exist. Practice makes one better; learning from other people’s mistakes, working collectively, trying and failing, persistence and resilience are ‘the keys’ to both running a successful survey and raising response rates.

Discussion

What is your opinion? What has been your institution’s experience with raising response rates? What other aspects might affect student participation?

References:

Baldoni, J. (2012) Give a Great Speech: 3 Tips from Aristotle, Inc. Magazine, Published on 4 May 2012, available online at: http://www.inc.com/john-baldoni/deliver-a-great-speech-aristotle-three-tips.html

Dillman, D. A., Smyth, J. D. and Christian, L. M. (2008) Internet, mail, and mixed-mode surveys: The tailored design method, 3rd edition, Hoboken, NJ: Wiley

Porter, S. R. and Umbach, P. D. (2006) Student Survey Response Rates across Institutions: Why Do they Vary?, Research in Higher Education, Vol. 47, Issue 2, pp. 229-247

Porter, S. R. and Whitcomb, M. E. (2005) Non-response in student surveys: The Role of Demographics, Engagement and Personality, Research in Higher Education, Vol. 46, Issue 2, pp. 127-152

Sauermann, H. and Roach, M. (2013) Increasing web survey response rates in innovation research: An experimental study of static and dynamic contact design features, Research Policy, Vol. 42, pp. 273-286

Sills, S. J. and Song, C. (2002) Innovations in Survey Research: An Application of Web-Based Surveys, Social Science Computer Review, Vol. 20, Issue 1, pp. 22-30
