The impact of closing the feedback loop – Bruce Johnson

Released at the end of July 2021, in the wake of COVID-19’s impact on higher education in the UK, the National Student Survey (NSS)¹ results recorded the lowest percentage agreement for the theme of Student Voice. Down almost nine percentage points on 2020, these results clearly indicate that current students in the UK do not feel their feedback is sufficiently heard at institutional level.

Couple the most recent NSS results with policy drivers such as the UK Government’s ‘Students at the Heart of the System’² and the UK Quality Code for Higher Education (2018)³, and there is a pressing need to improve the way that UK institutions approach, manage and respond to the student voice.

It is clear how students’ feedback on the course has been acted on

Actually, it’s not. The heading above is the exact wording of Question 25 in the 2021 National Student Survey. Only 51% of participating students, nationally, indicated any agreement at all with this statement.

It was the lowest percentage agreement of the twenty-seven core questions asked in the 2021 NSS. Students don’t feel they are being heard, at a time when it has never been more critical to listen to and act upon the student voice – and to make sure students know their voices are being heard and responded to.

So, what can be done? 

Surveys alone aren’t the answer

While surveys at module, course and institutional level are an important tool in harvesting student insights as efficiently as possible, it’s what happens after each survey closes that has the potential to make a tangible difference.

In a blog⁴ last year on this very topic, I noted that there was a growing pressure for students to be seen as partners, leading to increased focus on universities adopting a more customer-focused approach. While it is already common practice to seek students’ views to inform improvement within institutions, it is less common for universities to close the feedback loop and let students know how views will or will not be acted upon.

Surveys provide data that a university can analyse and act on where appropriate, but if results and intended actions aren’t communicated, many students may feel disenfranchised, disconnected and less inclined to participate in the future.

The impact of closing the loop

When the recent HEPI Report 140⁵, ‘What is the Student Voice? Thirteen essays on how to listen to students and how to act on what they say’, was published with support from evasys, I commented:

“The way a university approaches and acts upon the student voice can make a tangible difference to future outcomes for the institution – we see this time and time again across our customer base. Those who consistently close the student feedback loop by offering reflection and proposed action to issues raised evidence positive pull through in sector-wide results such as the NSS. Student surveys are a valuable tool for capturing the student voice, but a critical element is in the response to that voice. This is what facilitates true partnership working with students to enhance their learning experiences.”

And for her essay in the same report, ‘The Virtuous Loop: capturing the student voice through course and module evaluation’, Dr Helena Lim of evasys spoke to Ismail Ali, President of the Student Union at the University of the West of Scotland, who said:

“Closing the student feedback loop makes students active partners in ensuring that teaching and learning delivery not only works well but continues to improve over time. The dialogue between students, module leaders and the wider university is an ongoing project and it is imperative not just to close the feedback loop, but to stay in the loop too.”

Ismail Ali, President of the Student Union, University of the West of Scotland

Engaged students who know that their opinions are valued, and acted on, are more likely to participate in future surveys.

Helena also sought the opinion of Luke Humberstone, VP of Welfare and Wellbeing at the University of the West of Scotland. He commented:

“In my experience, capturing the student voice as a mechanism of the quality process is something that should be done with the genuine intent of improvement. Sometimes academics might use the process as a box-ticking exercise or are too quick to say that something cannot be changed or improved because of the capacity of the individual or team. Even that being fed back to the students can be useful in illuminating them to the real-life pressures on academics.”

Luke Humberstone, VP of Welfare and Wellbeing, University of the West of Scotland

It seems obvious, but as well as taking positive action on student feedback, explaining why the institution CAN’T act on a suggestion can provide that same sense of being listened to.

At the University of Hull, where the Student Insight and Sector Policy team have implemented a robust approach to module evaluation and closing the feedback loop, results have highlighted the impact of the Module Feedback Cycle on improving the student experience of teaching and learning.

For example, between 2017/18 and 2020/21 there was an incremental increase in student agreement that marking criteria are clear in advance of assessment, an overall uplift of 7.6%. This is the result of a sustained focus on improving this area after it ranked low in earlier module evaluations.

Further information on the university’s evaluation process and results, including the evaluation of blended teaching and learning in the wake of COVID-19, can be found on the institution’s website in the article ‘Closing the feedback loop for Module Evaluation Questionnaires: Evaluating the conversion of feedback into enhancement of teaching practice’⁶.

And the impact of closing the feedback loop isn’t just felt by students. During a recent meeting with an evasys client, academic staff shared their views on using evasys to close the loop:

  • “It was not fair to expect students to respond to a large number of surveys if we do not provide feedback in reply”

  • “Showing students the dashboard of responses in real-time seemed to motivate more to complete the survey”

  • “More student responses seemed to be received because students heard that there would be feedback provided”

  • “The process of closing the loop forced me – in a positive way – to consider student feedback more deeply and to reflect on my own practice”

  • “[It can be] time challenging, but makes you a better teacher”

The easiest way to close the loop

The easiest way to close the student feedback loop at module level is to use a software solution that speeds up the process, provides consistency for staff and students, and reduces the administrative burden of managing it. A recent evasys blog, ‘What is the simplest way to close the student feedback loop at module level?’⁷, provides further information on the benefits of a digital system for module evaluation.

In addition to the digital tools used to gather survey data and provide responses to students, institutions use various processes to maximise staff and student engagement and provide timely feedback to students.

A recent blog on the AUA website entitled ‘Taking closing the loop further at the University of Hull’⁸, written by the University of Hull’s Student Insight and Sector Policy Manager, Joanna Carter, outlines the institution’s approach to managing surveys and closing the feedback loop.

In her experience, communication between module leaders and administrative staff during survey periods becomes an important factor in supporting student participation and staff engagement.

At Hull, the team maximise opportunities to encourage staff and student engagement with Module Evaluation Questionnaires (MEQs) by deploying the Bennet and Nair (2009) three-stage approach to communications: pre-survey, active and post-survey. This approach allows the institution to inform and educate module leaders during the pre-survey phase, launch a coordinated communications strategy during the active phase and close the student feedback loop effectively in the post-survey phase.

Why close the student feedback loop?

Universities should close the feedback loop at module level to ensure that students understand that their voices are being heard. Not only is this beneficial to student engagement, but it should also have a positive impact on wider survey results, such as the NSS.

As Dr Christine Couper, former Director of Strategic Planning at the University of Greenwich, explained in a guest blog for evasys entitled ‘How to have structured conversations with students post survey’⁹, it’s not just about ticking boxes to improve results.

The university shares clearly presented survey results with each cohort, enabling students to judge their own experience against that of their peers and see whether their views are typical. Module leaders digest the outcomes and then provide their own commentary and reflections back to their students, making the conversation two-way and giving students much greater ownership of the whole process.

It doesn’t stop there. The University of Greenwich also carries out comparative analysis of outcomes across modules: while the analytics are managed centrally, the reviews are done within faculties, first by programme teams and then by student experience committees, allowing additional layers of feedback to students as part of the annual teaching and learning review process.

Christine succinctly sums up the reason why universities should close the student feedback loop at module level:

“By developing conversations with all our students, we hope to provide evidence that there is direct benefit to them in continuing to complete the module evaluations, thus reducing the risk of survey fatigue – and increasing future response rates as students evidence that their voice is being heard.  We are also building positive behaviours that will impact on other surveys that staff and students are involved in, like the NSS.”

Dr Christine Couper, former Director of Strategic Planning, University of Greenwich

¹Office for Students (2021), National Student Survey – NSS accessed on 24 August 2021 at https://www.officeforstudents.org.uk/advice-and-guidance/student-information-and-data/national-student-survey-nss/

²BIS (2011), Higher Education: Students at the Heart of the System accessed on 24 August 2021 at https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/31384/11-944-higher-education-students-at-heart-of-system.pdf

³QAA (2018), The UK Quality Code for Higher Education accessed on 24 August 2021 at https://www.qaa.ac.uk/en/quality-code/advice-and-guidance

⁴evasys (2020), Why Universities need to Close the Student Feedback Loop at Module and Course Level accessed on 24 August 2021 at https://evasys.co.uk/why-universities-need-to-close-the-student-feedback-loop/

⁵HEPI (2021), What is the Student Voice? Thirteen essays on how to listen to students and how to act on what they say accessed on 24 August 2021 at https://www.hepi.ac.uk/wp-content/uploads/2021/08/What-is-the-student-voice_HEPI-Report-140_FINAL.pdf

⁶University of Hull, Closing the feedback loop for Module Evaluation Questionnaires: Evaluating the conversion of feedback into enhancement of teaching practice accessed on 25 August 2021 at https://www.hull.ac.uk/choose-hull/study-at-hull/teaching-academy/news/closing-the-loop-evaluating-the-use-of-student-feedback-to-enhance-teaching

⁷evasys (2021), What is the simplest way to close the student feedback loop at module level? accessed on 25 August 2021 at https://evasys.co.uk/simplest-way-to-close-the-student-feedback-loop-at-module-level/

⁸AUA (2021), Taking closing the loop further at the University of Hull accessed on 25 August 2021 at https://aua.ac.uk/taking-closing-the-loop-further/

⁹evasys (2020), How to have structured conversations with students post survey accessed on 25 August 2021 at https://evasys.co.uk/how-to-have-structured-conversations-with-students-post-survey/

Bruce Johnson is the Managing Director at evasys and has 20+ years of experience working within both large and small UK universities, including 14 years heading Student Systems at a Russell Group University. 