Posted on 2006-05-24, 14:11. Authored by Patrick Craven, S. Long.
CLAIT has been used as the benchmark assessment for vocational IT skills since its introduction. It currently attracts over 300,000 entries per year across nine different application areas, equating to over one million scripts per year, all of which require marking, moderation and processing before results can be awarded. Over the last five years OCR has worked with the University of East Anglia to develop an automated 'expert' marking system for four of the application areas within CLAIT (Core, Word Processing, Spreadsheet, Database). This system is now operational and, during the first pilot phase, successfully supported 30 Centres and processed over 2,000 script submissions.
From its initial concept, the system set out to extend the scope of computer-assisted assessment beyond simple skills tests: it was not desirable to reduce the CLAIT assessment to a series of atomised function tests, or to place software simulations within testing Centres (of which there are over 3,500 in the UK). It was also important that the candidate experience of the assessment remained relatively untouched, so an off-line delivery/on-line processing solution was proposed.
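To picture the off-line delivery/on-line processing split, the following is a minimal, hypothetical sketch: candidates work locally at a Centre and completed scripts are later uploaded to a central service that marks them automatically, routing low-confidence outcomes to a human moderator. All names, stages and the confidence threshold below are illustrative assumptions, not details of the OCR/UEA system.

# Hypothetical sketch of an off-line delivery / on-line processing
# workflow; stage names and the confidence threshold are illustrative
# assumptions, not details of the actual OCR/UEA system.
from enum import Enum, auto


class Stage(Enum):
    SUBMITTED = auto()        # script uploaded by the Centre
    AUTO_MARKED = auto()      # processed by the expert marker
    MODERATION = auto()       # routed to a human moderator
    RESULT_AWARDED = auto()   # outcome released to the candidate


def auto_mark(script: dict) -> tuple[float, bool]:
    """Placeholder for the expert marker: returns (confidence, passed)."""
    return 0.95, True


def process_submission(script: dict) -> Stage:
    """Route one uploaded script through the marking workflow."""
    confidence, passed = auto_mark(script)
    if confidence < 0.90:
        # The system cannot be sure of the outcome, so a person decides.
        return Stage.MODERATION
    return Stage.RESULT_AWARDED


print(process_submission({"centre": "0001", "unit": "Word Processing"}))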
This paper provides an overview of the modular approach taken by OCR and reflects on the successes and challenges faced during implementation. The presentation will provide a demonstration of the system with a brief explanation of the intelligent workflow processing and the application of expert marking rules to ensure valid and reliable assessment outcomes.
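To make the idea of expert marking rules concrete, here is a minimal, hypothetical sketch of a rule-based marker for a word-processing task. The rule set, the error tolerance and all names are assumptions for illustration only, not the rules described in the paper.

# Hypothetical rule-based marker; rules and tolerance are illustrative
# assumptions, not the OCR/UEA system's actual marking scheme.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Rule:
    description: str
    check: Callable[[str], bool]   # returns True if the criterion is met


@dataclass
class MarkResult:
    errors: List[str]
    passed: bool


def mark_script(text: str, rules: List[Rule], max_errors: int = 3) -> MarkResult:
    """Apply each marking rule to a candidate script; CLAIT-style
    schemes typically tolerate a small number of accuracy errors."""
    errors = [r.description for r in rules if not r.check(text)]
    return MarkResult(errors=errors, passed=len(errors) <= max_errors)


# Example word-processing rules (illustrative only).
rules = [
    Rule("Title 'Annual Report' present", lambda t: "Annual Report" in t),
    Rule("No double spaces", lambda t: "  " not in t),
    Rule("Document is non-empty", lambda t: bool(t.strip())),
]

result = mark_script("Annual Report\nSales rose in Q4.", rules)
print(result.passed, result.errors)

A moderation route for failed or borderline scripts, as in the workflow sketch above, is one plausible way such a marker could support valid and reliable outcomes.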
School: University Academic and Administrative Support
Department: Professional Development
Research Unit: CAA Conference
Citation
CRAVEN, P. and LONG, S., 2002. The Development and Application of an Expert Marker System to Support 'High Stakes' Examinations. IN: Proceedings of the 6th CAA Conference, Loughborough: Loughborough University.