Responding to the Student Voice: How University of Hull Aims to Improve Student Satisfaction on Feedback Response

We’re delighted to publish this guest blog from evasys client Joanna Carter at the University of Hull, who manages and provides analysis of Module Evaluation Questionnaires for the university.

Listening to the student voice and providing a satisfactory response to it is a key commitment of the University of Hull.

In this blog Joanna highlights:

  • A national lack of clarity amongst students on how their feedback is acted on

  • Why a more complex approach to communicating action, or non-action, is needed

  • The importance of communication between module leaders and administrative staff during survey periods

  • The processes used within University of Hull to maximise opportunities for student and staff engagement

  • How critical it is to close the feedback loop with students to ensure they feel their feedback is valued

  • Plans to take closing the loop further at University of Hull

A clear message from students 

The increasing emphasis placed on the student voice in higher education is apparent in both the policies and practices of UK higher education institutions. A Google search for ‘university student voice’ describes it as being about the value of student opinions, equal contribution, everybody being represented, and improving policy and course delivery.  The QAA’s 2013 guidance on responding to feedback from students outlined good practice and recommendations for the use of student feedback, all of which, if implemented, present a range of tasks and processes for university staff to administer.

Students responding to the National Student Survey (NSS) between 2017 and 2020 have sent a clear message: their rating of how clearly feedback has been acted on is much lower than their rating of the opportunities they have had to provide feedback.

Table 1. National Student Survey sector-average results (% agree), 2017 to 2020, filtered to show higher education institutions only

| Year | Student Voice Scale result | 23. I have had the right opportunities to provide feedback on my course | 24. Staff value students’ views and opinions about the course | 25. It is clear how students’ feedback on the course has been acted on | Response rate (%) |
|------|------|------|------|------|------|
| 2017 | 69.2 | 83.9 | 75.5 | 60.2 | 68 |
| 2018 | 69.2 | 84.0 | 75.4 | 60.6 | 70 |
| 2019 | 73.7 | 84.7 | 75.6 | 60.7 | 72 |
| 2020 | 73.6 | 84.6 | 75.6 | 60.6 | 69 |
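
The size of that gap is straightforward to quantify from Table 1: in every year, sector-average agreement with question 25 trails question 23 by roughly 23 to 24 percentage points. The short Python sketch below is illustrative only; the variable and key names are my own, while the figures are those shown in Table 1.

```python
# NSS sector-average agreement (%) for questions 23 and 25, taken from Table 1.
# Variable and key names are illustrative, not part of any NSS dataset.
nss_sector_averages = {
    2017: {"q23_opportunities": 83.9, "q25_acted_on": 60.2},
    2018: {"q23_opportunities": 84.0, "q25_acted_on": 60.6},
    2019: {"q23_opportunities": 84.7, "q25_acted_on": 60.7},
    2020: {"q23_opportunities": 84.6, "q25_acted_on": 60.6},
}

for year, scores in nss_sector_averages.items():
    gap = scores["q23_opportunities"] - scores["q25_acted_on"]
    print(f"{year}: gap between opportunities (Q23) and acted-on (Q25) = {gap:.1f} points")
# Output shows a persistent gap of roughly 23-24 percentage points in each year.
```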

Optimising communication to encompass all students

Giving students the opportunity to complete various surveys during their course, perhaps through email notifications, staff messages and posters on campus, is an overt strategy; whether they perceive that their feedback has been acted on is a much more personal experience.

Communicating how feedback has been acted on requires a more complex approach: it needs to convey the aspects of course delivery where improvement is justified and likely to have wide-ranging benefits, and to manage expectations about feedback that is not taken forward.

These messages need to reach all students who were invited to give feedback, whether they did so or not, and also the next cohort of students, as part of a feedback cycle.  Students should be made aware at the start of their courses of the enhancements that have been made as a result of previous feedback, whilst also having opportunities to give feedback during delivery and at the end.

Maximising opportunities to encourage student and staff engagement

In my experience, the communication between module leaders and administrative staff during survey periods becomes an important factor in supporting student participation and staff engagement.  The external support provided by Ipsos MORI to achieve NSS response rates of around 70% is unlikely to be replicable for institutions’ internal surveys, so the push for response rates and buy-in is a continuous effort.

At the University of Hull, we have tried to maximise our opportunities to encourage student and staff engagement when it comes to our Module Evaluation Questionnaires (MEQs).  Research by Bennett and Nair (2009) proposes a communication strategy with three distinct phases: a pre-survey phase, an active phase and a post-survey phase. This underpins our approach.

In the pre-survey phase, opportunities are taken to engage with module leaders towards the start of the trimester, not only to confirm they are correctly assigned to modules but also to provide them with a video on the module feedback cycle.  This makes them aware of good feedback practices and of important markers during the module for engaging students.  It also signposts them to the range of guidance materials available on our virtual learning site and advises staff to check access to their instructor accounts in evasys+ so that they can monitor and access their MEQs.  This reduces some of the queries and administrative burden that historically occurred at the end of the MEQ process, when engagement opportunities had also been lost.

Our MEQs run towards the end of the trimester (the active phase) and are launched with an e-Bulletin announcement, banners in the virtual learning environment and email notifications to staff and students.  We have designed tailored documents that are attached to each of these emails to provide more information.  The email to students includes a link to a video produced by our Teaching Excellence Academy on how to give good feedback.  The MEQs also link through to the virtual learning environment to increase their visibility, and an interactive guide to MEQs is available there as well.  We have found that close to 50% of our responses occur on the first day and on the scheduled reminder days within the MEQ period.

The post-survey phase involves closing the feedback loop with our students by providing them with a response to their feedback.  Module leaders write a reflective narrative on the positive areas of practice they can identify from their feedback and the areas where improvements have been suggested, followed by the actions they intend to take.  Messages and guidance on this process gain momentum through the phases.  Mindful that the greatest impact comes from speed of response, the reflections are disseminated through evasys+, where all students registered on the module receive a report containing the reflective narrative and the quantitative question results.

The benefits of closing the student feedback loop

Closing the loop aims to demonstrate to students that their feedback has been acknowledged, and contributes to the sense that staff value students’ views and opinions.  However, at this stage closing the loop does not necessarily evidence that the proposed actions were carried out, or their effect on subsequent student experiences.  Therefore, taking closing the loop further is something that we are exploring and analysing in more detail at the University of Hull.

There is the potential for a wider communication strategy to convey to our students the collective view of the enhancement activity being undertaken.  Closing the loop with our staff, by communicating the common areas they have identified for enhancement, could support their continued engagement.  Building our year-on-year data picture to take into account the areas where improvements were proposed will support our evaluation of impact.  Ultimately, we are working towards our staff and students sharing mutual recognition of the benefits of a two-directional feedback process.


About Joanna Carter

Joanna has been the Student Insight and Sector Policy Manager at the University of Hull since November 2017. She works within the Strategic Planning and Business Intelligence Service to draw together and evaluate information in a meaningful way and communicate it effectively to staff, students and stakeholders.  She is committed to improving the use of data to support the enhancement of teaching and learning.
