Striking a balance between internal and external pressures for using survey data

Many of you joined the #HEAchat #LTHEchat on 26th October 2016 at 20:00 GMT, hosted by the HEA surveys team #HEAsurveys, discussing the use of data to drive teaching decisions. We have compiled a Storify of the conversation for you to review and share.

If you would like to add to the conversation and share your views and experiences please tweet using the hashtag #HEAchat. We look forward to hearing about your experiences and sharing ideas.

Within the highly competitive higher education sector, the pressure placed upon institutions to promote themselves is great. Achieving good scores in the National Student Survey [NSS], high levels of employment after graduation, and top positions within national league tables are likely to affect an institution’s ability to attract students. However, public presentation of such student data is often unhelpfully simplified and generalised (Velden, 2012). This blog and the associated Twitter chat discuss the potential challenges of balancing internal and external pressures when attempting to use survey data for enhancement in teaching and learning.

Conducting student surveys is only the beginning; the real value is in the ways the results are used. In order to drive enhancement within teaching and learning it is important to explore students’ experiences to develop an understanding of areas of provision which are in need of attention, and areas which can be celebrated. Such understanding and enhancement needs to take place as part of a continuous cycle, in order to ensure development happens as and when needed.

External pressures on institutions come from potential and current students, their families, wider government, and other institutions both in the UK and globally. League tables and survey results enable institutions to promote areas of strength and highlight ways in which they compare to competitor institutions in order to maintain reputations, attract students, and ultimately achieve an income. Such information increases the visibility and accountability of institutions. An example is the UK-wide NSS which, since its introduction in 2005, has increased the visibility of students’ perceptions within institutions (Velden, 2012). An aim of the NSS is to provide potential students with information in order to support their ability to make informed decisions about where to study. Through the public dissemination of results, institutions are able to compare themselves to others in order to better understand their own provision within the context of the sector.

The danger of these external pressures is seen when too much focus is placed on simplified data out of context. A focus on improving scores, rather than on enhancing provision, can lead to frustration for staff members who are placed under pressure to tackle problems in isolation. Triangulation of findings is important to ensure a more holistic understanding of the data and of potential areas for enhancement (Zaitseva & Stewart, 2014, p. 2). By combining survey data and metrics with other data, institutions can explore the context and reasons behind the scores. In addition, the analysis of qualitative data enables further understanding of how to address issues and provides context for low scores. Through doing so, institutions can develop a greater understanding of how to enhance their provision, whilst teaching staff can be helped to excel in their teaching practice.

A recent QAA research report found that those institutions highest in the league tables appeared to focus more on improving the student experience than on NSS scores, whilst those in the bottom 25% of the league tables were more focused on promoting the surveys in order to improve their league table position (Williams & Mindano, 2015). Therefore, as Kandiko and Matos (2013, p. 9) state, “A balance needs to be struck between collecting meaningful data that is used for institutional improvement and pedagogical enhancement and for external comparisons and marketing”.

For the Twitter chat we would like to hear about the tensions that exist between measurement for improvement and measurement for judgement. It would be great to hear how you use data to make decisions in your teaching practice and, of course, your perceptions of how the TEF is influencing the use of data within your institution.

Join our combined #HEAchat and #LTHEchat on 26th October, 20:00–21:00 GMT

You can read more about how to take part in the chat here, but it is very simple: just log on to Twitter on 26th October at 20:00 and look for the hashtags #HEAchat and #LTHEchat to join the discussion. We look forward to hearing about your experiences and sharing ideas.

Background material to help your thinking
A wealth of evidence exists on effective teaching practice, including the HEA’s work on flexible pedagogies which should provide direction and focus for ways to strive for excellence and teaching quality. These resources can be used in conjunction with our surveys, including the HEA’s UK Engagement Survey [UKES] which provides evidence on how students are supported and engaged throughout their studies. UKES has been designed to look further at students’ experience through exploring their interaction with their courses, moving away from the idea of students as consumers who should be satisfied, towards students playing an active role in their learning. By using student survey data alongside evidence of effective pedagogies, institutions can develop efficient enhancement cycles within their institutions.

The HEA has a number of case studies on using surveys for enhancement and offers consultancy to institutions that are interested in exploring how they can act upon their survey data to develop their teaching excellence. If you would like to find out more, please contact us at surveys@heacademy.ac.uk

Reference list

Kandiko, C. & Matos, F. (2013) King’s College London case study. In A. Buckley (Ed.) Engagement for enhancement: Institutional case studies from a UK survey pilot. (York: Higher Education Academy). Accessed 04.10.2016 from https://www.heacademy.ac.uk/resource/engagement-enhancement

Velden, G. (2012) Foreword. In A. Buckley Making it count: Reflecting on the National Student Survey in the process of enhancement (York: Higher Education Academy).

Williams, J. & Mindano, G. (2015) The role of student satisfaction data in Quality Assurance and Enhancement: How providers use data to improve the student experience (QAA). http://www.qaa.ac.uk/en/Publications/Documents/Subscriber-Research-Role-of-Student-Satisfaction-Data-15.pdf

Zaitseva, E. & Stewart, M. (2014) Triangulation in Institutional Qualitative Data Analysis: clarity from viewing through multiple lenses? Accessed 04.10.2016 from http://www.heirnetwork.org.uk/wp-content/uploads/2014/01/OPS2-Elena-Zaitseva-and-Martyn-Stewart.pdf
