Most, if not all, students have to learn or improve their IT skills at university. Many universities in the United Kingdom give credit to students who take such courses, so the students must be assessed; a requirement that consumes a substantial number of examiner hours. This paper describes the techniques and outcomes of a project to computerise the assessment of word-processing examinations, itself part of a larger project to computerise the assessment of a range of IT skills. The approach taken is to compare the candidate's output with the correct solution generated by the examiner and, from this comparison, to categorise and report errors by type. A preliminary version of the software generated a number of useful suggestions for improvement, mainly of a managerial nature, and the final version is now in use in a number of universities and colleges.
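The abstract's core idea, comparing a candidate's output against the examiner's solution and tallying errors by type, can be sketched in a few lines. This is only an illustrative approximation, not the paper's actual method: it uses Python's standard `difflib` to align the two texts word by word, and the error categories (`missing`, `extra`, `changed`) are assumptions chosen for the example.

```python
import difflib

def categorise_errors(candidate: str, solution: str) -> dict:
    """Align the candidate's text with the examiner's solution word by
    word and count errors by type (illustrative categories only)."""
    errors = {"missing": 0, "extra": 0, "changed": 0}
    matcher = difflib.SequenceMatcher(None, solution.split(), candidate.split())
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "delete":       # in the solution, absent from the candidate
            errors["missing"] += i2 - i1
        elif tag == "insert":     # in the candidate, absent from the solution
            errors["extra"] += j2 - j1
        elif tag == "replace":    # wording differs between the two
            errors["changed"] += max(i2 - i1, j2 - j1)
    return errors

report = categorise_errors("The quick fox jumps over lazy dog",
                           "The quick brown fox jumps over the lazy dog")
print(report)  # two solution words have no counterpart in the candidate
```

A real marking system would of course go further, e.g. weighting formatting errors differently from textual ones, but the alignment-then-categorise shape above is the essence of the approach the abstract describes.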
Number of pages: 10
Journal: Information Services and Use
Publication status: Published - 1 Jan 1998