Below, Stephen Scott discusses how UKES has helped St Mary’s University develop a better understanding of their students’ experiences, providing valuable data on engagement across the institution.
Motivations for participating in HEA Surveys
We began thinking about running UKES at St Mary’s at the end of 2014 when the HEA announced that it would be offering the survey to all institutions for the first time in 2015 (after pilots in 2013 and 2014). Our primary reason was that UKES would fill a significant gap for us as we did not have a programme-level academic survey for our undergraduate students who were not eligible to complete the NSS. We had run both PTES and PRES at postgraduate level for a number of years, so running UKES for undergraduates seemed a natural step.
I had taken a particular personal interest in the HEA's pilots of UKES in 2013 and 2014 as I had researched the US National Survey of Student Engagement, on which UKES is based, as part of my own postgraduate studies. As a result, I became an advocate of student engagement surveys within the University!
Luckily the idea of running UKES was met with full support from the Director of Learning and Teaching and the then Pro Vice-Chancellor for Academic Strategy. Following approval from our relevant committees, we decided to run UKES as a pilot in 2015. This gave us an opportunity to test UKES and the usefulness of the data it generated, plus it gave me the chance to run a HEA survey for the first time without the pressure of rolling it out to over 2,500 students!
The 2015 pilot, run with Year 1 students on a small number of programmes, was a moderate success. The response rate was extremely low, but the responses provided an interesting set of data on students’ engagement with their studies, peers and tutors. The results identified a potential issue relating to the amount of time our students spent in independent study compared with students at other institutions, which we planned to monitor further.
Following the pilot, we decided to roll UKES out across the University in 2016. All Year 1 and 2 students, except Foundation Degree students eligible for the NSS, were surveyed. For a first attempt, the running of the survey went very well. Support from our Students’ Union and many of our academic staff was encouraging as they saw the potential usefulness of the data from the survey.
The response rate was also one of the highest among participating institutions. I have the HEA to thank for this, as I implemented some of the ideas for improving survey response rates that I heard at one of the HEA's Survey Officers meetings. The most effective strategy was sending personalised e-mails to students every 10 days or so, alongside regular prize draws for Amazon vouchers or Summer Ball tickets. While organising and sending these e-mails could be time consuming (and on one occasion led to me embarrassingly blocking the University's entire e-mail system on a Tuesday afternoon!), the impact on the response rate was worth the effort.
What have the findings of the survey highlighted?
St Mary’s results for UKES were very encouraging, and our students were placed in the upper quartiles in the majority of the benchmarking data we received from the HEA. The issue identified in our 2015 pilot concerning the time our students spent in independent study again emerged as an area where our students rated lower than the sector average. Work is currently ongoing to understand this result further through interviews with students whose programmes participated in both the 2015 pilot and the 2016 survey.
An additional bonus in 2016 compared to 2015 was that the HEA had received permission from HEFCE to allow institutions to use the NSS questions in UKES. We saw this as an opportunity to start tracking certain NSS sections across all year groups. Rather than using the entire NSS within UKES, we decided to use the questions from the ‘Assessment and feedback’ and ‘Academic support’ sections. These were chosen because of the link to our institutional enhancement theme for 2016. By coincidence, these two sections of the NSS have since become core metrics in the TEF, and we were able to use data generated from the NSS questions in the 2016 UKES to support narrative included in our recent TEF 2 Provider Submission. Our results were in line with the equivalent sections of the 2016 NSS, showing an encouraging consistency of experience across all year groups.
In relation to acting on UKES results, we undertook a review of our annual monitoring report template for programmes during 2016, and this gave us a great opportunity to embed UKES as a key component in the annual monitoring process. As we had no previous data for UKES, we prompted programme teams to comment on how they performed against the benchmarking data provided by the HEA and, where applicable, identify actions for the coming year. Going forward, we intend to shift this focus so that programmes begin monitoring trends in their UKES results over a two- or three-year period.
What has the impact of participating in the survey been?
It is early days in terms of seeing the demonstrable impact of UKES on the student learning experience at St Mary’s. However, UKES has generated programme-level data which we did not have before, so even after one year we have a much better understanding of students’ engagement with their studies. Future developments include Learning and Teaching staff supporting programme teams in using UKES results to understand students’ engagement with aspects of their programmes and to inform future developments in modules and programmes.
Overall, I believe UKES is a valuable resource from which institutions can benefit in different ways depending on their needs. This stems from the flexibility offered by the HEA in terms of when exactly to run UKES, which optional questions to include and which students to survey. Anecdotal evidence from academic staff I’ve spoken to at St Mary’s and elsewhere indicates that they find UKES more interesting than the NSS or module evaluations, so tapping into such interest is vital to (yes) engaging students in UKES.
If you would like to find out more about the surveys you can visit our webpages, email us at email@example.com or call us on 01904 717500.