1. University of Aberdeen Undergraduate Term Time Employment
Cath Dennis, University of Aberdeen
This session will explore the outcome of a survey on term time employment administered to undergraduate students at the University of Aberdeen.
2. Teaching Academic Employability: A Discipline-Specific versus Holistic Approach
Helen Standage and Carol Faiers, University of Essex
A particular challenge for current careers educators is to market employability as an integral part of Higher Education (HE). One strategy for achieving this is to embed employability in the intellectual heart of university life rather than present it as an extra-curricular feature for secondary consideration by students and academics alike. To encourage the perception of employability as a discipline in its own right with intellectual substance, the University of Essex has introduced credit-bearing employability modules that either a) adopt a generalised approach to teaching employability with an emphasis on career-related theory, or b) adopt a degree-specific approach whereby intellectual links are made between employability content and the home degree subject. Based on survey data and wider metrics, the two teaching models are compared statistically across three outcome measures: student engagement, achievement and satisfaction.
3. Learning from the Best: Identifying the bright spots in university teaching
Craig Bartle and Caroline Wilson, Coventry University
A student’s evaluation of their experience is already critical for attracting prospective students, and is about to become a metric linked directly to student funding via the new Teaching Excellence Framework (TEF).
This research uses the ‘Bright Spots’ approach to identify and disseminate good practice across a UK university. Bright Spots is a problem-solving technique which focuses on discovering ‘successful efforts worth emulating’ (Heath & Heath, 2010).
Qualitative feedback from classes with consistently strong satisfaction levels has been analysed to reveal the ‘bright spots’ as perceived by students and is reported in this paper.
The goal is to use the insights uncovered to engage staff in future development work, led by the academics themselves.
4. Student Engagement: Surveying the impact on Learning Gain
Sally Dixon and James Price, Manchester College
Distance travelled is a measure currently used within further education; however, the higher education sector has not used this type of measure to assess the quality and impact of its provision.
HEFCE has devised the term 'Learning Gain' to 'measure the improvement in knowledge, skills, work-readiness and personal development made by students during their time spent in higher education'. HEFCE has approved funding for a range of projects to test potential methodologies for measuring Learning Gain. The Manchester College is the project lead, working with 15 other colleges.
The research uses mixed methods, drawing on a range of indicators to track improvement in performance through a combination of entry-level qualifications, student grades, and student and staff surveys. The research is in its very early stages, and the methodology is currently being verified through pilot surveys.
5. A preliminary evaluation of the Collegiate Learning Assessment as an instrument for understanding Black and minority ethnic undergraduate attainment disparity
Richard Hillier and Christine Broughan, Coventry University
This presentation will detail the processes that Coventry University has undertaken to establish a baseline from which learning gains can be objectively measured.
As well as the identification and quantification of attainment disparity between White and Black and Minority Ethnic [BME] students, contributing demographic factors will be presented. Adopting logistic regression, the impact of these factors will be investigated.
For the assessment of learning gains among Coventry University undergraduates, the Collegiate Learning Assessment was deployed. The experience of implementing this assessment will be discussed, as will the provisional findings of this three-year longitudinal study.
6. Transforming Teaching: Making Feedback Work for Students
Helen Shiels, Ulster University
This ongoing research project examines enablers of and barriers to the effective co-creation of content between students and tutors, in pursuit of an effective online collaborative learning environment within online postgraduate programmes of study.
This poster presentation will outline the research aims and objectives and provide an insight into the findings of an initial survey administered to business- and science-based students on online postgraduate programmes at Ulster University.
7. Developing a shared understanding of quality through a survey
Ellayne Fowler, University of Bristol
This poster explores the role of surveys in an attempt to challenge and develop a shared understanding of good-quality educational research in a medical faculty. The medical faculty spans a range of hospital settings in the South West of England, where Clinical Teaching Fellows develop educational research projects. As part of a University Teaching Fellowship, the author reports on the use of surveys to follow up an initial group exploration of quality as a means of achieving both change and consensus.
8. Using data to enhance the quality of student learning experience: Aston University’s Learning Development Centre
Chinny Nzekwe-Excel, Debbie De and Ellen Pope, Aston University
Aston University’s Learning Development Centre (LDC) is an academic initiative focused on meeting the learning requirements of students; it works with academic staff to enhance academic outcomes for the University’s diverse student body. The centre provides practical support to develop students’ learning skills and abilities. This study therefore reviews the centre’s services, seeking feedback to improve its services and delivery and subsequently to contribute to enhancing student retention, progression and success. A survey was specifically employed to identify students’ needs and expectations of the centre, measure their satisfaction with the LDC’s services, and identify possible areas for enhancing the centre’s practice. The study discusses how the main outcomes from the surveys provided insights for further research using the focus group method. It also shows how the outcomes from the surveys and focus groups are used by the LDC to strategically plan and structure its services and offerings to address students’ learning needs, as well as to devise ways to reach students who are not currently using the LDC.
9. Using student surveys to enhance the learning experience of part-time blended learning students: a practical approach
Janice Koistinen-Harris, Julia Neal and Amanda Andrews, Education for Health
The advantages of blended learning are widely recognised, but so too are the challenges students can face when undertaking such learning, particularly part-time adult learners. This study explored the use of semi-structured telephone interviews to gain a richer understanding of the experience of part-time students on a blended learning programme delivered by a UK organisation, building on the outcomes of earlier surveys of students’ learning experiences. Telephone interviews were held with programme students recruited through convenience sampling. Further surveying of the students’ experiences was time consuming and therefore did not afford the benefit generally associated with surveying as an efficient method of gathering data. However, the qualitative approach provided a rich set of additional data which has been invaluable in informing our approach to enhancing online resources as well as opening lines of communication with our community of non-traditional online learners.
10. Measuring future impact: Using Institutional questions within PRES to obtain baseline student feedback data
Gabrielle Milson, University of Strathclyde
This poster aims to highlight how the University of Strathclyde has utilised the Postgraduate Research Experience Survey (PRES) Institutional questions to define a baseline of student opinion and satisfaction with the Researcher Development Programme (RDP) and the novel Postgraduate Certificate in Researcher Professional Development (PG Cert RPD). Not only will this enable measurement of the impact of the RDP programme and the PG Cert RPD on the developmental skills and future career expectations of Strathclyde’s postgraduate researcher population, it will also provide data for business cases, Research Excellence Framework engagement and feedback for supervisors.
11. ‘How was it for you?’ – Student Evaluation of Curricular Change
David Wilson, Cardiff University
This study aimed to assess perceptions of student evaluation and to gain views on current School practice. Secondly, the study asked: is there too much evaluation? Finally, it assessed the extent to which students felt that their feedback was listened to and acted upon. Two focus groups were run with current 3rd- and 4th-year volunteers, and the outputs were used to design an online questionnaire. Results were subjected to thematic analysis. There were 139 respondents: 91 in 3rd year and 48 in 4th year, a response rate of 22.9% overall (30.5% and 15.6% respectively). The majority of respondents (77%) saw the purpose of evaluation as course improvement in subsequent years. Students had mixed feelings, with words such as ‘excessive’ and ‘tedious’ used to describe evaluations. Students felt that the feedback loop was not closed (102/139 respondents), with suggested improvements including mid-block ‘snapshots’, face-to-face feedback and short presentations indicating what changes had resulted from student feedback.
12. Transitioning to HE: Students, Surveys and Enhancement
Lucy Dumbell and Rosie Scott-Ward, Hartpury College
A student’s higher education learning journey begins the moment they start to research progression into Higher Education. These early encounters with the institution, its campus, staff, programmes and fellow students inform judgements in a way that seems highly resistant to change. As such, they provide a valuable opportunity to co-construct expectations and provide an outstanding student experience that supports a student during the transition to both undergraduate and postgraduate study.
Hartpury asks new enrolees to reflect on their student journey, from first coming into contact with information in the public domain to an induction period covering the first three weeks of semester one. This survey approach has been used consistently for over ten years to refine the approach taken to supporting student transitions.
This poster will reflect on the advantages of the survey approach used and suggest adaptations that may support future developments.
13. Adapting surveys to improve the experience of pathways students
Victoria Wilson-Crane and Paul Lafferty, Kaplan International Colleges
Kaplan International Colleges recently reviewed its method of gaining feedback from students about their experience on pathways programmes, to support the planning of initiatives to improve the offering.
Whilst historically in excess of 70% of students completed the End of Programme survey, there has been uncertainty regarding the value of the data and its ability to support improvements to the student experience. The review, undertaken by a diverse group of stakeholders, resulted in a survey which aligns more closely with the National Student Survey (NSS) and is relevant to students in transition to higher education. During the process, feedback was gained from stakeholders, including college alumni. Data from the new survey are accessible via a dashboard, enabling colleges and stakeholders to investigate the data. The poster will inform others about this work, including the development process and the emerging outcomes from the April 2016 completers of the survey.
14. Data driven decision making for quality assurance purposes - maximising the teaching and assessment opportunities for higher education students
Simon Bedford, Rodney Vickers, Jan Sullivan and Emma Purdy, University of Wollongong
It has become increasingly important to collect institutional data to measure and evaluate teaching and assessment improvements and to evidence quality assurance for both internal policy obligations and external review. However, how these data are presented, reported and targeted to individuals at various levels is of equal importance in ensuring that the correct decisions are made to maximise the student learning experience. The primary aim of this work was to establish how best to provide analytic data on subjects and courses at the University of Wollongong to staff and committees for monitoring and quality assurance improvement. This poster presentation explains how effective this has been and what lessons others can learn from the experience.