RM Compare™

Adaptive comparative judgement with RM Compare™

RM Compare™ is an effective solution for educational institutions, awarding organisations and professional bodies that want to improve their formative, summative and peer-to-peer assessment and moderation processes.

Introduction to RM Compare™

The RM Compare™ software is designed to improve and streamline the process of formative and summative assessment and moderation.

Developed by academics and education technology experts, it uses proven comparative judgement techniques to facilitate more meaningful and collaborative assessment and feedback, whilst improving attainment and helping to reduce the workload of teachers.

RM Compare™ uses the methodology of Adaptive Comparative Judgement (ACJ). It is based on the Law of Comparative Judgement conceived by the psychometrician L.L. Thurstone, which holds that people make more reliable relative, paired judgements than absolute ones.

ACJ involves assessment through comparison of two anonymised pieces of work. The assessor is simply asked to use their professional judgement to select which of the two pieces is better.

Through repeated comparisons, assigned by an iterative, adaptive algorithm, a scale can be created to measure the quality of the pieces. Instead of demanding an absolute, isolated judgement using set criteria, ACJ uses comparative, paired judgements, giving a forensic understanding of what a ‘good’ piece of work looks like.
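As an illustration of how repeated paired judgements can yield a measurement scale, the sketch below fits a simple Bradley-Terry model (a close statistical relative of Thurstone's formulation) to a handful of hypothetical judgements. This is a generic illustration of the underlying idea, not RM Compare's actual algorithm, and the item names are invented.

```python
import math

# Illustrative sketch only: one standard way (a Bradley-Terry model) to
# turn "which of these two is better?" judgements into a quality scale.
# It is not RM Compare's published algorithm.

def fit_scale(items, judgements, iters=200, lr=0.1):
    """Estimate a quality score per item from (winner, loser) pairs."""
    score = {item: 0.0 for item in items}
    for _ in range(iters):
        for winner, loser in judgements:
            # Probability the current scores assign to the observed outcome.
            p_win = 1.0 / (1.0 + math.exp(score[loser] - score[winner]))
            # Nudge both scores towards explaining the judgement.
            score[winner] += lr * (1.0 - p_win)
            score[loser] -= lr * (1.0 - p_win)
    return score

# Three anonymised pieces of work; assessors judged A > B (twice), A > C, B > C.
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B")]
scale = fit_scale(["A", "B", "C"], judgements)
ranking = sorted(scale, key=scale.get, reverse=True)  # ["A", "B", "C"]
```

No single judgement assigns a mark; the scale emerges from the collective pattern of wins and losses, which is what allows many assessors' judgements to be pooled.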

RM Compare™ is suitable for organisations across the educational spectrum: schools, colleges, higher education institutions, professional bodies and awarding organisations can all benefit from the software.

Its innate flexibility means it can be used at any scale, from a single school or organisation through local, regional and national levels, and even supports seamless international collaboration.

Users can collaborate on assessments wherever, however and whenever works for them, offering an unprecedented level of control over the assessment and moderation process.

How Does RM Compare™ Help?

Improve Student Outcomes

Significantly improve attainment with ACJ, which has been proven to positively impact the performance of students by unlocking a greater understanding of what exemplar work contains, aiding their approach to future work and learning.

Reduce Teacher Workload

Avoid overwork and marker fatigue through a collaborative, more natural and dynamic assessment process. Because judgements are faster than traditional marking, educators can redirect the time saved into planning and delivering high-quality, tailored learning.

Increase Assessment Flexibility

Assess whenever and wherever is convenient for you. The cloud-based service ensures that your assessments will always be close to hand, meaning the assessment process can take place around a busy schedule.

Straightforward Collaboration

Collaborate at any level, from local to international. Joint assessment between institutions brings parity of marking and grading and a collective raising of standards across wider communities. This broader sense of what constitutes high-level work across multiple institutions also supports educators' continuing professional development.

Strengthen Formative Feedback

Consistent formative assessment from a broad range of perspectives throughout the year keeps students always aware of where they need to improve and allows educators to sculpt their curriculums to meet the specific needs of their students.

Leverage Professional Judgement

All work is assessed independently, using the professional judgement of multiple assessors, which offers an improved level of reliability and consistency.

How can adaptive comparative judgement be used?

Formative Assessment

Formative assessment has traditionally been used to monitor students’ learning, providing ongoing feedback to improve and enhance teaching and learning.

The strength of this assessment method, in terms of its overall impact on learning and students’ progress, can be variable. The quality of the feedback also depends on the quality of the evidence gathered, and it can be time-consuming for educators to thoroughly examine student work.

Formative assessment is most effective when educators have a robust method that reduces the time needed to mark work, brings teachers together for moderation, and provides a forensic understanding of each student's strengths and areas for improvement.

RM Compare allows formative assessment to be delivered at scale: work from a class, a group of schools, or a national or even international cohort of students can be marked by a pool of teachers. One of the major benefits of Adaptive Comparative Judgement (ACJ) is that work can be compared across a wider cohort, so educators gain a greater breadth of understanding of what ‘exemplar’ work looks like within a given subject context. Additionally, because student work is assessed anonymously by a larger group of teachers, the effect of any unintentional bias is reduced.

Summative Assessment

Summative assessment can be particularly challenging in the context of open-ended student work where there is no single correct answer, common in subjects that require creativity such as drama, music, or English. A common issue in marking this work is unintentional assessor subjectivity.

This can be a result of subconscious reference by the assessor to previously assessed work, or differing interpretations of complex assessment rubrics, both of which can create disparate preconceptions of what the best work should look like. Additionally, assessor workload and fatigue can exacerbate those issues. The result is that the reliability of the marking process can diminish over time.

RM Compare addresses these issues through Adaptive Comparative Judgement (ACJ). It harnesses the power of collective professional judgement: assessors compare two pieces of work side by side and decide which is better, rather than marking each individual piece against a set of marking criteria. This comparison process is repeated by multiple assessors to ensure a high level of reliability – each piece of work is typically seen and judged at least 14 times.

RM Compare’s built-in adaptivity makes this process even more efficient. An intelligent algorithm adapts how often each piece of work needs to be seen and judged, and which work to display it alongside, based on previous judgements made by all of the assessors in the process. This minimises the potential for any one assessor to influence the final grade achieved by each student, reducing the effects of potential subjective bias or assessor fatigue.
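To illustrate the idea of adaptivity described above, the sketch below shows one common heuristic for choosing the next pair: take the least-judged piece of work and pair it with the piece whose current estimated quality is closest, since near-equal pairs discriminate the scale most sharply. RM Compare's actual algorithm is not published here, and all names in this sketch are invented.

```python
# Hypothetical sketch of adaptive pair selection; not RM Compare's
# proprietary algorithm.

def next_pair(scores, seen_counts):
    """Pick the next two pieces of work to show an assessor."""
    # Start from the piece that has been judged the fewest times so far,
    # so every piece accumulates enough judgements.
    focus = min(seen_counts, key=seen_counts.get)
    # Pair it with the piece whose estimated quality is nearest its own:
    # judgements between near-equal pieces are the most informative.
    others = [work for work in scores if work != focus]
    partner = min(others, key=lambda w: abs(scores[w] - scores[focus]))
    return focus, partner

scores = {"A": 1.8, "B": 0.4, "C": 0.3, "D": -1.2}
seen = {"A": 5, "B": 4, "C": 2, "D": 4}
pair = next_pair(scores, seen)  # C is least seen; B is nearest in quality
```

Because the pairing decision depends on the pooled judgements of every assessor, no single assessor's preferences can dominate the pairs any one piece of work is shown in.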

The result is a process that minimises assessor workload, removes the need for a separate moderation process, and ensures highly consistent and reliable summative judgements of students’ open-ended work.

Peer-to-Peer Assessment

Peer-to-peer assessment, or learning by evaluation, through Adaptive Comparative Judgement (ACJ) has proven to be particularly powerful and transformative in certain forms of assessment. When assessing open-ended responses - such as tasks where students are asked to evaluate or produce subjective, open-ended material - ACJ can be used as a learning intervention for a significantly positive impact on learning outcomes.

The use of ACJ to constructively evaluate the work of peers has been proven to improve the performance of learners. Educational researchers have carried out tests with students across a wide age range. Their results consistently show that every student who used ACJ as a learning intervention tool achieved significantly higher levels of attainment, outperforming students that used a more traditional approach to peer assessment.

RM Compare makes the peer-to-peer assessment process seamless – students compare two pieces of work side by side to gain a much better understanding of what the best work includes, enhancing their understanding of the goals and how to achieve them.

“We see it as a great opportunity to help our students learn through evaluation – it turns the assessment process into a learning experience. The students collaborate with peers and with teachers to successfully improve their grades.”

Scott Bartholomew, Assistant Professor, Engineering/Technology Teacher Education
Purdue University

“We found four big reasons why we believe adaptive comparative judgement is very successful for students.”

Scott Bartholomew, Assistant Professor, Engineering/Technology Teacher Education
Purdue University

“Interestingly, the use of ACJ technology did not just benefit either the highest or lowest achieving students. It actually boosted the attainment of all the students who used RM Compare as a learning intervention.”

Scott Bartholomew, Assistant Professor, Engineering/Technology Teacher Education
Purdue University

"RM Compare helps to boost the attainment of learners and improves the teacher’s judgements, as they have a clearer view of what good quality writing looks like, leading to better professional conversations between teachers at school."

Steve Dew, Headteacher
Church Cowley St James Primary School, Oxford, UK

“We are able to use this software to have more frequent assessments, across every year group and every child in the school. Which is different from the national assessment that we have to do for their writing, where everything comes to a head at the very end of the year.”

Rhiannon Wilkie, Teacher and Local Authority Moderator
Church Cowley St. James Primary School, Oxford, UK

“The process of moderation is designed to quality assure against things like bias in the traditional assessment setting. With this, there wouldn't be any need for that. It quality assures the judgements you're making by putting them against the judgements of many other teachers.”

Rhiannon Wilkie, Local Authority Moderator and Teacher
Church Cowley St. James Primary School, Oxford, UK

University: How Purdue University improved student attainment using RM Compare for peer-to-peer assessment. Enabling students to engage in a collaborative peer-to-peer assessment experience as part of their learning process.

Schools: How Oxfordshire schools benefited from RM Compare. Oxfordshire schools successfully pioneered a new method of collaborative assessment and moderation.

© 2019 RM Education Ltd. All rights reserved.