Assessment literacy and scorer reliability

Enhancing assessment literacy amongst PGT students and scorer reliability amongst PGT staff

Team Members: Claudia Rosenhan, Farah Akbar

Abstract

Background: The proposed study investigates student and staff responses to the updated College PG Assessment Criteria used across the MSc TESOL and Language Teaching programmes at MHSE. These criteria are broken down into level descriptors, and the language of these descriptors forms the basis of the feedback that markers give on assignments.

Level descriptors support scorer reliability on open-ended responses and guide students' interpretation of their performance. When these descriptors are ambiguous, students frequently complain of a disconnect between their mark and their feedback, and scorer reliability suffers across large cohorts. Staff on the MSc TESOL programme have observed such issues in previous years, and a working group has updated the language of the level descriptors to a) enhance reliability across larger groups of markers, who tend to interpret the level descriptors based on individual experience, and b) improve the assessment literacy of PGT students, many of whom come from international backgrounds. Assessment literacy is an important graduate attribute because it makes students autonomous and critical analysts of their own learning.
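The abstract does not specify how scorer reliability will be quantified; one standard measure of agreement between two markers is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. The sketch below is purely illustrative, using hypothetical grade-band data rather than anything from the study:

```python
# Illustrative sketch only: the study does not name a reliability
# metric. Cohen's kappa is one common measure of agreement between
# two markers assigning grade bands to the same scripts.

from collections import Counter

def cohen_kappa(marker_a, marker_b):
    """Agreement between two markers, corrected for chance."""
    assert len(marker_a) == len(marker_b)
    n = len(marker_a)
    # Observed agreement: proportion of scripts given the same band.
    p_observed = sum(a == b for a, b in zip(marker_a, marker_b)) / n
    # Expected agreement if both markers assigned bands independently,
    # based on how often each marker uses each band.
    freq_a, freq_b = Counter(marker_a), Counter(marker_b)
    p_expected = sum(freq_a[band] * freq_b[band] for band in freq_a) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical grade bands assigned by two markers to ten scripts.
marker_a = ["A", "B", "B", "C", "A", "B", "C", "A", "B", "C"]
marker_b = ["A", "B", "C", "C", "A", "B", "B", "A", "B", "C"]
print(f"kappa = {cohen_kappa(marker_a, marker_b):.2f}")  # kappa = 0.70
```

A kappa near 1 would indicate that markers interpret the descriptors consistently; values well below 1 on a pilot marking exercise would signal the kind of divergence the revised descriptors aim to reduce.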

Aims: The aim of this study is to evaluate the changes made to the descriptors. For that purpose, a number of evaluations will be carried out, such as piloting the new descriptors and surveying student and staff responses. These evaluations will serve to validate the changes and enable the new descriptors to be recommended across PGT courses in the College.

Final project report

Download the final project report (PDF)