The development and application of an expert marker system to support ‘High Stakes’ examinations
Conference contribution, posted on 24.05.2006 by Patrick Craven, S. Long
CLAIT has been used as the benchmark assessment for vocational IT skills since its introduction. It currently attracts over 300,000 entries per year across nine application areas, equating to more than one million scripts per year, all of which require marking, moderation and processing through to the award of results. Over the last five years OCR has worked with the University of East Anglia to develop an automated ‘expert’ marking system for four of the application areas within CLAIT (Core, Word Processing, Spreadsheet, Database). This system is now operational; during the first pilot phase it successfully supported 30 Centres and processed over 2,000 script submissions. From the initial concept, the system set out to extend the scope of computer-assisted assessment beyond simple skill tests, as it was not desirable to reduce the CLAIT assessment to a series of atomised function tests or to place software simulations within testing Centres (over 3,500 in the UK). It was also important that the candidate experience of the assessment remained relatively untouched, so an off-line delivery/on-line processing solution was proposed. This paper provides an overview of the modular approach taken by OCR and reflects on the successes and challenges faced during implementation. The presentation will include a demonstration of the system, with a brief explanation of the intelligent workflow processing and the application of expert marking rules to ensure valid and reliable assessment outcomes.
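The abstract does not specify how the expert marking rules are represented. As a purely illustrative sketch (the rule names, document fields, and marking criteria below are hypothetical, not taken from the OCR/UEA system), rule-based marking of a submitted script can be thought of as applying a list of per-criterion checks to properties extracted from a candidate's document:

```python
# Minimal sketch of rule-based expert marking. All rules and field names
# here are hypothetical examples, not the actual CLAIT marking scheme.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class MarkingRule:
    criterion: str
    check: Callable[[dict], bool]  # returns True if the criterion is met

# Example rules for a word-processing task (illustrative only).
RULES = [
    MarkingRule("heading present", lambda doc: bool(doc.get("heading"))),
    MarkingRule("heading is bold", lambda doc: doc.get("heading_bold", False)),
    MarkingRule("word count >= 100", lambda doc: doc.get("word_count", 0) >= 100),
]

def mark_script(doc: dict) -> Dict[str, object]:
    """Apply every rule to the extracted document properties and
    report the per-criterion outcome plus a total error count."""
    outcomes: Dict[str, object] = {r.criterion: r.check(doc) for r in RULES}
    outcomes["errors"] = sum(1 for v in outcomes.values() if v is False)
    return outcomes
```

In a workflow like the one described, each script submission would pass through such checks automatically, with human moderation reserved for borderline outcomes.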
- University Academic and Administrative Support
- Professional Development
- CAA Conference