What happened when 300 candidates in 19 colleges took a real live exam on computers for the first time?
While we may have well-established, trusted systems for e-authoring and e-marking, the suggestion of actually delivering exams and assessments on a computer is enough to bring most educators out in a cold sweat. What if the technology fails? What if there's a power cut? What if the data gets lost? There are so many potential pitfalls that most countries still have e-testing in the 'too difficult to deal with' pile, with only the most tentative, exploratory noises starting to be made.
Undeterred by this seemingly daunting challenge, in 2011, the Infocomm Development Authority (IDA) of Singapore and the Singapore Examinations and Assessment Board (SEAB) made a public commitment to pilot e-testing. Approximately 300 students across 19 junior colleges would take their national Mother Tongue Language (MTL) tests in Chinese, Malay or Tamil, using a computer. The gauntlet had been thrown down and RM Results won the tender to deliver the pilot.
Failure is not an option
The objectives of the pilot were: to explore how e-testing could be implemented, to establish what systems and processes needed to be in place to make it work, and to pave the way for more innovative types of assessment in the future. From the outset, the team in Singapore took a rigorous and detailed approach. Failure was not an option. They were committed to learning from the process, knowing that mistakes would be made but embracing the opportunity those mistakes presented to fine-tune every aspect of delivering e-assessment.
With this clarity of purpose in mind, RM Results developed onscreen testing software that was trialled in 2012 in a non-live environment at a number of schools to help refine the process. Following feedback from this initial trial, we improved key areas of the software solution, including security, the back-up strategy and general usability. As we moved closer to live testing, further fine-tuning of the code was completed in close collaboration with the SEAB and IDA teams.
Dealing with the ‘what ifs’
In tandem with the software improvements, the SEAB team focused on developing procedures to deal with any eventuality that might arise during the testing process. For example, what happens if a student finishes the test, puts their head on the desk and accidentally holds down one or more keys? What happens if a student needs to move to a new machine in the event of hardware failure? Every possible scenario was worked through and a plan of action put in place, documented as standard operating procedures, so everyone knew exactly what to do. Teachers and support staff were trained in how to configure the computers for the test and how to deal with any issues that arose, including escalation routes.
Finally, in November 2013, after many trials and dry-run sessions, the team was ready to deliver the live tests. 300 students across 19 junior colleges sat at individual laptops to take their national MTL tests. In the pilot and practice sessions, SEAB had tested what would happen if the laptops failed, so that in such an event during a live test, rigorous processes would be in place to prevent any students from being adversely affected. Should a laptop fail mid-test, students could be moved to alternative machines, with no loss of data, within minutes.
Overall, the pilot was deemed a complete success: delivered on time, to budget and with good feedback from students.
The future of e-assessment
Despite this successful pilot in Singapore, we are still a long way from e-testing becoming the standard, but if all journeys start with a single step, then this pilot was a significant one. It proved that live testing can be implemented successfully, even with multiple languages to support. In large part this was down to the passionate commitment of the teams in Singapore, RM Results' years of experience delivering software for high-stakes testing environments, and the close working partnership between the SEAB and RM Results teams. All of this made onscreen testing in Singapore schools a reality.