Building in feedback

The conversational framework emphasises the need for a design in which students can be:

  • challenged and motivated to produce ‘outputs’ that articulate their knowledge or allow them to apply it in some way (eg responses to questions, reflections or recommendations on case-study scenarios, summaries of arguments, critiques, verbal or written presentations).
  • provided with an opportunity to improve knowledge articulation or application through reflection and feedback (from peers as well as staff).

The challenge for staff is to create and facilitate an environment in which students are motivated to keep engaging with tasks, explanations, demonstrations and models of success, to keep producing and improving their outputs.

Feedback opportunities are an integral part of this process. These need to be built in so that students are provided with information to support improvement in the form of models, criteria, and opportunities to reflect on outcomes, as well as direct feedback on their efforts.

Feedback from teaching staff has an essential role, but there are strong pedagogical and practical arguments for designing a mix of approaches to feedback within your module. Potential approaches include:

  • Individual and group feedback.
  • Peer feedback.
  • Use of models and exemplars.
  • Support and encouragement to record and revisit feedback as part of ongoing reflection on progress.

These activities can be underpinned by active use of assessment criteria.

(For a more detailed description of Laurillard's framework see: Laurillard, D. (2009). The Pedagogical Challenges to Collaborative Technologies. International Journal of Computer-Supported Collaborative Learning, 4(1), 5-20.) UoY Library Permalink.

Some specific options for designing for feedback-rich activity include:

Active learning tasks and adaptive exercises

  • Build in regular ‘active learning’ tasks such as pause points for (individual and peer) response and reflection during synchronous or asynchronous lectures, or when engaging with set texts or written materials posted to the VLE.
  • Create online exercises that incorporate feedback and give learners guidance and opportunities for action based on their individual needs, for example:
    • Use the VLE test tool or Google Forms to create multiple-choice quizzes with automatic feedback for specific responses.
    • Use survey tools such as Google Forms or Qualtrics to create ‘branching’ activities which provide further questions or feedback depending on the choices made or responses given.
    • Set ‘release conditions’ for content within the VLE so a particular item or file is made available only once students have met certain criteria, such as a particular score in a test or other gradable item.
    • Use authoring tools such as Xerte to create more complex interactive learning materials.

‘Micro’ tasks

  • You can incorporate focused tasks as a routine element of your design, and build in peer feedback by setting up opportunities for students to share and learn from the outputs. These ‘outputs’ might include:
    • Short reports and responses posted to a discussion board or submitted via VLE short answer tests.
    • Responses to questions or points for discussion in the classroom and gathered via classroom polling tools or collaborative documents.
    • Student presentations which can be in-person or recorded.
    • Collaboratively created digital documents (eg Jamboards, Padlets, Google Docs).
  • Tasks can be used as preparation for active participation in classroom discussion, or to extend engagement outside of staff/student contact sessions. These can be designed to involve structured responses to peers, with reference to outcomes and assessment criteria. Salmon’s (2013) framework for designing online learning activities (which she refers to as e-tivities) can be helpful in providing a structure for designing asynchronous tasks. She suggests that e-tivities should include:
    • An introduction to the task(s) with a clear and enticing title and an outline of the purpose;
    • A summary of the task incorporating a 'spark' - a question or challenge to begin the dialogue;
    • An invitation to make a clearly-defined individual contribution, avoiding any approach that 'closes' the topic by, for example, having a single correct answer or a limited number of possible approaches;
    • An invitation to respond (feedback), again clearly defined in terms of the length, timing and type of posting;
    • A clear outline of what the tutor role will be (typically designed to gradually increase student autonomy);
    • A clear outline of timescales for the e-tivity as a whole leading to a final outcome and a connection made to the next steps. 

      Salmon, G. (2013). E-tivities. Abingdon: Routledge. UoY Library Permalink.

‘Multi-stage iterative tasks’

A series of related tasks connected in sequence can support the development of learning through your module, eg a sequence designed to support students towards assessment.

  • Discussion boards, wikis, blogs or digital documents (eg Jamboards, Padlets, Google Docs) can be used to support collaborative approaches to these tasks by providing a space for student groups to record the outputs in a logical sequence. 
  • Learning journals or blogs can support (individual or group) reflection on these activities.

Maximising opportunities to learn from practice

Blended learning design affords opportunities for students to prepare for, or review, in-person practice-based sessions. This can enable them to maximise focus and time on the learning task itself:

  • “Walkthroughs” can be as simple as a photo set and descriptive text of, for example, a field-trip site or, where technology, digital skills and funding allow, something as advanced as a bespoke digital simulation.
  • 360 video tours can help orientate students to specific working environments (eg a lab).
  • Screen or video recordings can be used to demonstrate how to use equipment or tools.
  • Structured reflective activities and peer-feedback processes (supported by criteria).

Practical tasks can also be designed to maximise the value of ‘intrinsic’ feedback, which is provided through the task itself rather than directly by staff or students.

  • This is often built into ‘authentic’ activities and assessment involving real-world application, eg: Does the algorithm have the desired effect? Was the experiment successful? Was the investment portfolio profitable? Did the intervention save the simulated patient? The benefits can be maximised through tasks (perhaps structured along similar lines to the ‘e-tivities’ approach above) that encourage students to compare approaches and results and/or carry out structured reflection on the outcomes, leading to plans for future action.
  • Providing an environment rich in intrinsic feedback can be more challenging for abstract conceptual topics, but one possible approach is the use of models (model answers or exemplars for comparison), which can support comparison, interpretation and improvement without direct intervention from teaching staff.