One of the challenges we face in running national surveys is the tension between wanting to make improvements and the need for consistency.
We provide benchmarks to providers, comparing them to groups of other providers. We can adjust for subject and demographic factors. We can dig in to look at relationships between engagement and perceptions of course delivery. Yet the best benchmark is always previous years within the same organisation. Whilst context varies widely between institutions, many of the same structures and agents will be in place within an institution from one year to the next. That leaves the central variables as change within the institution and the new student cohort. Thus, genuine improvement can often be readily identified and random noise discarded.
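To make that concrete, here is a minimal sketch of the kind of year-on-year check this benchmark allows. The figures are entirely hypothetical and this is not our actual methodology; it simply asks whether an apparent rise in agreement on one question, within one institution, is larger than random noise would produce.

```python
# Hypothetical year-on-year comparison within one institution.
# Figures below are invented for illustration; this is not our methodology.
import numpy as np
from scipy.stats import norm

agree_2022, n_2022 = 312, 400   # hypothetical: 78% agreement in year 1
agree_2023, n_2023 = 340, 420   # hypothetical: 81% agreement in year 2

p1, p2 = agree_2022 / n_2022, agree_2023 / n_2023
pooled = (agree_2022 + agree_2023) / (n_2022 + n_2023)
se = np.sqrt(pooled * (1 - pooled) * (1 / n_2022 + 1 / n_2023))
z = (p2 - p1) / se
p_value = 2 * (1 - norm.cdf(abs(z)))   # two-sided two-proportion z-test

print(f"2022: {p1:.1%}, 2023: {p2:.1%}, z = {z:.2f}, p = {p_value:.3f}")
# A large p-value suggests the apparent 'improvement' may be noise;
# a small one suggests something within the institution really changed.
```

With these invented numbers the rise from 78% to 81% is well within what noise alone could produce, which is exactly the kind of false peak a stable year-on-year trend helps to expose.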
Yet, we cannot preserve our surveys in aspic. Priorities change and so must the measures we use to estimate and stimulate enhancement. Every year we have bright ideas to reflect current trends, feedback from HE providers calling for tweaks or improvements, or changes in survey platforms that allow us to do more. But even what seems like a minor change can disrupt hard-won trends.
In a prior revision of the Postgraduate Taught Experience Survey (PTES), the section on student motivations was moved from the front of the survey to sit after the attitudinal questions. The thinking was that the priority was for students to answer the questions around learning and teaching, and that those attitudinal questions should therefore be placed first on the survey. Yet, despite there being no other changes, the responses to the motivations questions changed. Not a radical change, but one that was noticeable in the survey's trends. This should have been expected, but to understand why, we need to look at how surveys actually work.
One of my favourite papers on this is by Zaller and Feldman, “A simple theory of the survey response”. As they put it, ‘individuals typically do not develop “true attitudes” of the type that opinion analysts routinely assume, but possess a series of autonomous and often inconsistent reactions to the questions asked by pollsters’. In other words, what is in the head of the respondent at the time of answering a question dictates how they will answer it, and that includes what was asked before. Context matters, and if you move a survey question, you will get a different answer.
For our motivations questions, students became less likely to give just one motivation for taking their course, and slightly more likely to give three or four. It was a subtle shift that only affected some respondents, but for those students the process of reflecting on their course for five minutes before answering the motivations questions appeared to change their response. The risk is that such a change is interpreted as a change in the student experience, rather than in the way we are attempting to measure it.
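A hedged sketch of how such a shift might be checked, again with invented counts rather than real PTES responses: compare the distribution of the number of motivations selected in the year before and the year after the questions were moved.

```python
# Hypothetical distribution of the number of motivations selected per student,
# before and after the motivations section was moved to follow the attitudinal
# questions. Counts are invented for illustration only.
import numpy as np
from scipy.stats import chi2_contingency

#                        1    2    3   4+ motivations selected
before_move = np.array([520, 410, 180,  90])
after_move  = np.array([450, 415, 230, 125])

chi2, p, dof, expected = chi2_contingency(np.vstack([before_move, after_move]))
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
# A small p-value signals a distributional shift, but the cause here would be
# the question order, not a real change in why students take their courses.
```

The statistics alone cannot distinguish a genuine shift in motivations from an artefact of question placement; only knowledge of what changed in the survey itself can do that.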
If that change was minor, the revisions to the Postgraduate Research Experience Survey (PRES) in 2013 and to PTES in 2014 were major. Trends were lost, or bucked, in response to changed question positions and scales. Efforts can be made to contextualise the shifts, but with every change that most exact of benchmarks, the year-on-year measurement within the same organisation, is lost. In the discussions about revisions to the NSS, there has been a big push from HE providers to retain much of the format. Whilst there is an army of scholars who would love to change the format of the NSS, losing the wider context of trends risks starting again from ground zero. With any change, we need to ensure that the knowledge built up in a previous incarnation is not lost, and that we understand that just a word, or a shift of position, can change how the respondent experiences the survey and thus their response.

Most of all, trends help show that glorious peaks and miserable troughs were nothing but random fluctuations and cohort effects. They help remove the temptation for knee-jerk panic and broaden out the picture to show what really impacts on experience: which initiatives have worked; which changes have had a negative impact; and what was just an artefact of that autonomous and inconsistent being, the respondent.