
Digital programme and module evaluation system

The digital programme and module evaluation system is a centralised, digital system for module evaluations. 

It replaces current evaluation practices, which vary between departments and programmes, with a single, uniform method for collecting and analysing feedback.

The goal is to standardise the process and improve data collection and analysis.

It’s vital that student input is heard and acted upon promptly - amplifying the student voice and demonstrating that we listen to their feedback.

Professor Tracy Lightfoot, Pro-Vice-Chancellor for Education and Students

Benefits

Key benefits of the new system include:

  • Enhanced student experience: a consistent feedback mechanism across all modules will ensure students’ input is gathered and acted upon promptly.
  • Improved resource efficiency: full automation and streamlining of the evaluation process will help to free up staff time.
  • Greater consistency and standardisation: the system will provide institutional oversight and accurate, timely data to support accreditations and other external requirements.

How it works

Evaluations must be conducted online using the new system.

Modules must be evaluated every time they are delivered, near the end of teaching and before the final summative assessment.

The survey window will be ten working days.

The University has selected Evasys as our digital module evaluation platform provider. 

Evasys has a partnership with Advance HE for PTES/PRES comments and works with other higher education institutions.

Access the staff portal

Closing the feedback loop

A summary of evaluation results and the department’s initial response must be provided, within ten working days of the evaluation closing date, to all students who were eligible to complete the evaluation.

Phase | Recipient | Content received | Timeline
Feedback delivery | Participating students | Quantitative (numerical) results only | Immediately after the two-week survey window closes
Reflection period | Module Leader | Full quantitative report, all raw open comments, and a Word Cloud visualisation | Two-week period immediately following the survey window closure
Reflection response | All students on module | Module Leader’s reflective written response to the feedback | After the Module Leader publishes their reflection (within the two-week reflection window)

Confidentiality and data access

Student confidentiality will be protected. 

Data will be fully anonymised, except in extremely rare circumstances where anonymity may be removed to address serious concerns, such as discriminatory, harassing or threatening content; disclosure of serious harm or illegal activity; or explicit self-identification by a participant. In such cases, access will be restricted to senior members of staff in Academic Quality and/or the Academic Registrar.

Access to data is restricted by role and responsibility, and module leaders must share results with all staff who taught on the module.

Key dates

The system is scheduled to go live for module evaluations in Semester 1 of the 2025/26 academic year.

Surveys will launch at the start of Week 9, closing at the end of Week 10. 

Quantitative results will be available from the beginning of Week 11.

Closing the feedback loop will start at the beginning of Week 11 (Monday 9 December) and finish on Monday 22 December.

Policy and procedure

Orientation sessions

Module Leader orientation sessions

Watch a recording of the Module Leader orientation session

https://youtu.be/osRKWeEBr0c 

Drop-in sessions during the survey window:

  • Wednesday 3 December, 1pm to 2pm

Closing the feedback loop orientation:

  • Friday 5 December, 2pm to 3pm
  • Thursday 11 December, 10am to 11am

Drop-in session:

  • Monday 15 December, 3pm to 3.45pm

Sign up for closing the loop orientation sessions

Frequently Asked Questions

About the system

What questions are included in the module evaluation surveys?

To ensure consistency and enable institution-wide comparisons, all modules use a standard set of core questions.

These questions are based on existing department question sets and are aligned with the University's learning objectives and NSS goals. 

The questions will cover key areas, including:

  • Module organisation and structure
  • Teaching and learning
  • Assessment and feedback
  • Workload and challenge
  • Overall experience

Can I modify the survey questions for my module?

To ensure consistency and allow for reliable, institution-wide comparisons, the system does not offer the option to modify the core survey questions. 

All modules use a standard set of questions approved by the University Education Committee (UEC).

This approach provides better data quality and helps the University identify broad trends and areas for improvement across different departments.

What is the format of the questions?

The questions use a combination of Likert scales (eg Strongly Agree to Strongly Disagree) for quantitative data and open-text boxes for qualitative comments.

This approach gives us measurable data and rich, detailed feedback from students.

There is more than one member of staff teaching the module. Can we all be added as module leader?

There can only be one module leader assigned to the survey in Evasys.

The assigned module leader can share the QR code with other teaching staff on the module, and can also share the survey’s response rates and results with them.

In this case, teaching staff may also want to discuss a shared reflection in response to the results.

How do the surveys mitigate the dangers of unconscious bias?

Guidance is linked from the top of each survey, advising students on how to give feedback professionally and explaining the dangers of unconscious bias.

What does 'confidential, not anonymous' mean?

Students complete module evaluations confidentially and their name/student ID will not be included in any reports distributed. 

Student confidentiality must be protected at every stage of the evaluation process, unless comments are expressed that lead to acute concern for the wellbeing of the student or others, or where an evaluation response presents a prima facie case for a breach under Regulation 5.7 (academic misconduct) or Regulation 7 (student discipline). 

In these instances, and where the Head of School/Department and the Pro-Vice-Chancellor for Education and Students give authority, designated staff will be able to link specific comments to a student identification number to enable follow-up actions as appropriate.

Are the surveys completely anonymous? How is offensive or crisis feedback handled?

The surveys are confidential, not totally anonymous.

Students are warned that the system has the capability for de-anonymisation if required.

  • Handling offensive feedback (eg directed at staff): Offensive or inappropriate comments should be immediately reported to the Academic Quality Office via student-surveys@york.ac.uk. De-anonymisation requires high-level authorisation and is reserved for serious incidents.
  • Handling crisis feedback (student welfare concern): If Module Leaders or Departmental Users identify any comments suggesting a student is in crisis, they should immediately follow departmental escalation practice by contacting student-surveys@york.ac.uk and requesting de-anonymisation.

Will the new digital module evaluation system add to academic workload?

The new system is designed to reduce and streamline administrative tasks associated with module evaluations, not add to them. 

While Module Leaders have always been expected to conduct evaluations, the current fragmented process is inefficient. 

This new system, provided by Evasys, automates many of the manual steps, such as sending emails, tracking responses, and compiling reports. 

This will free up time, allowing staff to focus on the most important part of the process: reviewing feedback and implementing improvements. 

The goal is to make the process more efficient, transparent, and ultimately, more valuable for both staff and students.

What is the expected time commitment for students to complete the survey?

Students are expected to complete the survey in five to ten minutes.

The survey is short, typically consisting of ten Likert-scale questions and one open comment box. 

What do I see as a front end user?

The Staff Portal gives module leaders an overview of the surveys for which they are responsible.

Log in using your University username and password.

Within the Portal, you can:

  • Access information on current evaluations (eg open/close dates and number of participants)
  • View a copy of the survey
  • View a QR code for the student module evaluation portal (which can be displayed to students)
  • Monitor live response rates
  • Download a copy of the evaluation report once the survey has closed
  • Close the feedback loop with students (see section below)
  • Download reports for historical surveys

To encourage student engagement, module leaders are advised to display the QR code to students during teaching. Students can scan this QR code to access their evaluation.

How do I correct a mistake with the module I've been assigned?

Information on who leads modules is based on the records held in SITS.

A data quality check was performed and records were updated when necessary. 

If you have not been assigned a module that you lead, or have been mistakenly assigned one, please contact student-surveys@york.ac.uk.

Corrections to the data will be made at the end of the survey window, once students have responded.

The survey window

How are surveys distributed to students?

Students will be sent an email each time they have module evaluations ready to be completed.

This email includes a link to the student module evaluation portal, where a student will find all module evaluations which are open to them.

A QR code also gives access to the portal, where students can see all of their modules and select the relevant survey from the list.

From Semester 2 2025/26 it will also be possible to access surveys via the VLE.

Will I be notified when a survey opens and closes?

Module leaders receive an email when a survey opens and another when it closes.

How long will my survey stay open?

Normally, evaluations are open for a period of two weeks.

How are the start and end dates agreed?

The start and end dates for evaluations are agreed between the Pro-Vice-Chancellor for Education and Students, the Academic Registrar and Academic Quality, after consultation with the Students’ Union.

How should we handle surveys for modules that run outside of the standard semester dates?

For non-semesterised modules (eg year long, summer), the survey window should be scheduled to open towards the end of the teaching delivery for that module, following the same two-week duration.

Please contact student-surveys@york.ac.uk if you need additional survey windows set up.

Survey closure

What do I need to do once my survey ends?

Once a survey closes, students who have completed the module survey will automatically receive a PDF of the quantitative survey results.

Module Leaders will then complete the reflection questions on the Staff Portal. 

They will then be able to send their reflections to all students; this is known as closing the feedback loop.

What is closing the feedback loop?

Closing the feedback loop is a core policy requirement to support the continuous enhancement of the student learning experience and maintain the academic quality of the University’s educational provision. 

This step is vital to amplify the student voice by ensuring student input is heard and acted upon promptly, directly countering the feeling that feedback has little to no effect.

To achieve this, a summary of the evaluation results and the school/department's response must be provided to all eligible students within ten working days of the evaluation closing date, transforming evaluation data into clear, visible and accountable action. 

It’s important that we demonstrate to students that we have listened to their feedback, so it is crucial that we close the feedback loop following an end-of-module evaluation.

Why are quantitative results released immediately after survey closure?

The system automatically releases quantitative results to students who completed the survey immediately after it has closed.

There are two reasons for this. 

The primary intention is to build trust with students: they get a prompt acknowledgment that their feedback has been received, and a summary of their cohort's results.

In addition, administrative tasks are streamlined and staff time is prioritised for the moderation phase: providing the reflections and action plans to address student feedback.

How is the Module Leader report structured?

The Module Leader's report provides three components to support data analysis and reflection:

  1. Raw open comments: The complete text comments from all respondents.
  2. Quantitative data: The numerical scores.
  3. Word cloud: A visual representation highlighting frequently used words in the open comments.

We recommend Module Leaders:

  • Read all of the open comments. The word cloud should guide you toward potential themes (eg assessment, clarity, resources), but your final reflection must be based on the full text to capture nuance, context and single-instance critical feedback that may not appear in the cloud.
  • Balance with quantitative scores: The reflection should correlate themes from the qualitative data with the numerical scores. If scores are low in a specific area, the reflection should specifically address the comments related to that score.