Using data for enhancement: Don’t let proving trump improving
Alexander C. McCormick, Ph.D.
Indiana University Bloomington, USA
Around the globe, higher education faces unprecedented demands to provide evidence of quality and value for money, often in the form of accountability and transparency regimes that aim to demonstrate quality and provide consumer information. At the same time, new tools provide university leaders, instructional staff, and others with detailed information to inform quality enhancement efforts. In this climate, “proving” — demonstrating worth and responding constructively to unfavorable results or ill-considered comparisons — can take priority over improving: using actionable diagnostic information to enhance university performance. Student engagement surveys offer an opportunity to examine this tension and imagine approaches that prioritise enhancement.
Using student voice, feedback and data to assist educational developments through targeted intervention
Jane Collings, Professor Pauline Kneale and Pollyanna Magne
This session aims to share how Plymouth University uses data about programmes and the student experience to target interventions and support programme enhancement activity.
Session outline: The 20-minute presentation will outline the types of data Plymouth University draws together as an evidence base. It will explain how this evidence base underpins the offer of support to departments and how Educational Developers use it to begin the conversation with colleagues. The session will also share some of the lessons learned about how those approaches are made, and stories that highlight the process and impact of successful enhancement activity.
Making their voice count: ensuring ANU student surveys enhance the student experience
The Australian National University
The survey team at the Australian National University (ANU) aims to improve the student experience by providing the university community with customised information about student learning and the student experience, drawn from a suite of institutional and national quality assurance surveys, focus group results, and course and teacher evaluations.
This presentation will review the range of survey reporting mechanisms that have been introduced at ANU to assist faculty and executives to identify survey themes and implement enhancement initiatives. It will reflect upon the successes and challenges of distilling and disseminating survey results from a growing suite of Australian higher education surveys.
Enhancing NSS promotion – findings from research into NSS promotion at 30 HEIs
University of Birmingham
This presentation will highlight promotional activities that can lead to higher NSS response rates. It will focus on themes drawn from research into NSS promotion at 30 HEIs, giving examples of particularly effective activity at those institutions. The session aims to give delegates ideas for new ways to improve NSS response rates while decreasing the burden of promotion on staff. It will also include an opportunity for counterparts at different HEIs to discuss and contribute further ideas and comments.
‘How surveys make a difference’
New ways of improving response rates
City University London
Student surveys play an increasingly important role in gathering student feedback about the overall experience at university. To obtain valid and representative data, it is essential to achieve high response rates. The ‘Your Voice, Our Action’ campaign at City University London, which encompasses internal surveys, the NSS, PTES and PRES, is more than just a way to get students to fill out a survey: it aims to engage them in feedback on a broader level and to show how what they say contributes to improving the student experience for them and others. City achieved a response rate of 73.9% in this year’s NSS, with a population of nearly 2,000 students. This is higher than the national average and was achieved in a variety of ways through its innovative ‘Your Voice, Our Action’ campaign.
Using PTES to improve the Post-Graduate Taught experience at The University of Nottingham
The University of Nottingham
This session will explore how The University of Nottingham has used PTES to improve its range of Post-Graduate Taught courses. We will consider three areas: how we marketed PTES and encouraged responses, how we analysed PTES data, and how we used this analysis to inform Schools across the University.
Development of the National Student Survey
Charlotte Lester and Ross Hudson
How institutions are using data from the Irish Survey of Student Engagement
Irish Survey of Student Engagement
A national pilot of the Irish Survey of Student Engagement (ISSE) was undertaken in 2013, leading to full implementation in 2014 and 2015. Key stakeholders within participating institutions are striving to make efficient use of an emerging information source on student engagement. This session explores some of the actions taken, and challenges faced, as they seek to provide effective feedback to students and to wider staff, and as they evaluate potential uses of engagement data to inform enhancement activities.
The session will highlight examples of practice from a report published in January 2015, ‘Effective feedback and uses of ISSE data: an emerging picture’, and use these to pose a number of questions for discussion with participants.
A new measurement and ranking system for the UK National Student Survey
Dr John Canning
University of Brighton
This presentation outlines a new system for identifying absolute (the Weighted Student Satisfaction Score (WSSS)) and relative (the Weighted Student Satisfaction Quotient (WSSQ)) performance, improvement and decline in the National Student Survey (Canning 2015). The system enables stakeholders to quickly identify excelling (130+) and problematic (<70) courses whilst offering a nuanced quantitative differential across subjects and within institutions. It is also possible to measure year-on-year improvements across the whole sector, and because the quotient is scaled to a mean of 100, the distance from the mean for any individual course can be easily calculated.
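As a rough illustration of how the quotient described above might be read in practice: the actual WSSS/WSSQ formulas are defined in Canning (2015) and are not reproduced in this abstract, so the sketch below only assumes a quotient already scaled so the sector mean is 100, and applies the two thresholds quoted (130+ excelling, below 70 problematic). The function names are hypothetical.

```python
# Illustrative sketch only. The WSSQ computation itself is not shown in the
# abstract; we assume a quotient centred on a sector mean of 100 and band
# courses using the thresholds quoted above (130+ excelling, <70 problematic).

def classify_course(wssq: float) -> str:
    """Band a course by its Weighted Student Satisfaction Quotient."""
    if wssq >= 130:
        return "excelling"
    if wssq < 70:
        return "problematic"
    return "typical"

def distance_from_mean(wssq: float, sector_mean: float = 100.0) -> float:
    """Because the quotient is centred on 100, distance from the mean
    reduces to a simple subtraction."""
    return wssq - sector_mean

print(classify_course(135.0))    # excelling
print(distance_from_mean(88.5))  # -11.5
```

The mean-of-100 scaling is what makes cross-subject and within-institution comparison straightforward: any course's standing relative to the sector is visible at a glance.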
Using UKES results and institutional award marks to explore the relationship between student engagement and academic achievement
Alan Donnelly and Nathaniel Pickering
Sheffield Hallam University
This presentation will share the key findings of quantitative research conducted to explore the relationship between student engagement in learning and academic attainment. This was achieved by analysing Sheffield Hallam University's results of the UK Engagement Survey (UKES) pilot 2014 against institutional award marks. This presentation will discuss the results of the research and consider the usefulness of merging survey data with institutional data in order to enhance the University's understanding of student engagement.
Data Training 101: From Data to Information to Action
Dr Judith Ouimet
Are you drowning in data and need a life preserver? Do you have survey data vying for your attention but just do not know where to start? Are you looking for campus collaborators to help interpret your results and take action? Learn how to engage students, faculty, and staff to link your survey data to various campus analytics to improve student success.
Using survey data to inform and target curriculum improvement
Vicky Marsh and Rebecca Galley
The Open University
Student survey data is systematically collected at The Open University (OU) and used throughout annual quality enhancement cycles to inform best practice and drive curriculum change. Currently, satisfaction is considered as an independent student outcome measure.
An advantage of the OU data is the size of the datasets: data is collected from the majority of modules, so the number of responding students is large. We propose to summarise how, at the OU, we are combining this rich source of survey data with student performance data and using it to systematically inform principles of good practice in module learning design.
It’s not just about the NSS: Engaging staff and students with all of our survey results
Bethan Foweraker, Nina Di Cara and Alastair Masterton
By the end of this presentation, participants will have learnt how Cardiff University has developed a centralised approach to its wide range of student surveys through the creation of a standardised dashboard for all. The dashboards provide both our academic staff and Student Academic Representatives with easily digestible survey outcomes from our local and national surveys at School, College and University level. Participants will have an opportunity to see the range of dashboards we produce, as well as learning how these form a core part of the evidence base for our quality processes, such as Annual Review and Enhancement and Periodic Review, ensuring that survey data is properly incorporated and leads to enhancement.
What are students ‘saying’? Reflections on the politics of surveys for enhancement
Within universities, students’ unions, the media and the sector, the NSS has led a highly politicised existence, perhaps more so than any other data-gathering exercise in the country. This session will explore and reflect upon the many political frames within which the survey sits, in particular considering the multiple interpretations of ‘what students think’ and the power that such assertions can carry in the modern university.