
Automated assessment of non-objective textual submissions.


Authors

James Rennie Christie



Contributors

H.I. Ellington
Supervisor

S. Earl
Supervisor

R. Butler
Supervisor

Abstract

The author sought to explore the methodology of automated marking of non-objective textual submissions. In doing so, he set out to discover how far automated assessment of text could be applied across the range of submissions, from a single word to a multi-sentence, multi-paragraph piece of text. In the course of his research, the author had to determine the extent and severity of any problems associated with the manual marking of such submissions. The thesis describes the myriad problems associated with, and generated by, the subjectivity inherent in the manual marking process, and indicates to what extent automated marking can remove or reduce these problems. In parallel, the author had to establish whether any automated assessment of this type of submission already existed. The literature survey showed that some work had been done in this subject area, but that most of the work on style marking dates from the 1960s, and only recently has work on content marking been published.

The author devised algorithms and implemented them in a software system, called SEAR (Schema Extract Assess and Report), intended to process word-processed submissions and award marks. A lack of suitably marked data sets prevented the full development of the style algorithm. However, the author was able to demonstrate that content marking is possible across a range of submissions. The proposed style algorithm is based on a novel idea: determining a set of metrics that could be applied to all textual submissions. The data structure developed for the marking of content is unique. The author also compiled a set of potential criteria for use in evaluating the methodology of automated marking of textual submissions; these criteria were applied to his software system.
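The abstract does not detail SEAR's algorithms, so the following is only a minimal Python sketch of the two ideas it names: a set of style metrics that can be computed for any textual submission regardless of length, and schema-based content marking in which a marker's schema of required concepts is matched against the submission. All function names and the synonym-list schema representation here are illustrative assumptions, not SEAR's actual design.

    import re

    def style_metrics(text: str) -> dict:
        """Compute simple style metrics that apply to any submission length."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text.lower())
        return {
            "sentence_count": len(sentences),
            "avg_sentence_length": len(words) / max(len(sentences), 1),
            "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
            "type_token_ratio": len(set(words)) / max(len(words), 1),
        }

    def content_mark(text: str, schema: dict) -> float:
        """Award credit for each schema concept whose synonym list is matched.

        'schema' maps a concept name to a list of acceptable synonyms;
        this hypothetical structure stands in for SEAR's (unpublished) one.
        """
        words = set(re.findall(r"[A-Za-z']+", text.lower()))
        hits = sum(1 for synonyms in schema.values()
                   if words.intersection(synonyms))
        return hits / len(schema)

    # Example: a one-sentence answer marked against a three-concept schema.
    answer = "Photosynthesis converts light energy into chemical energy."
    schema = {
        "process": ["photosynthesis"],
        "input": ["light", "sunlight"],
        "output": ["chemical", "glucose", "sugar"],
    }
    print(style_metrics(answer))
    print(f"content mark: {content_mark(answer, schema):.2f}")

Under these assumptions, a submission is scored on content by the fraction of schema concepts it mentions, while the style metrics yield a feature vector that a marking scheme could weight; any such weighting would need the kind of marked data sets whose absence the abstract notes.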

Citation

CHRISTIE, J.R. 2003. Automated assessment of non-objective textual submissions. Robert Gordon University, PhD thesis. Hosted on OpenAIR [online]. Available from: https://doi.org/10.48526/rgu-wt-1585045

Thesis Type: Thesis
Deposit Date: Feb 3, 2022
Publicly Available Date: Feb 3, 2022
Keywords: Text recognition; Automation; Artificial intelligence; Marking; Examinations; Assessments
Public URL: https://rgu-repository.worktribe.com/output/1585045
Publisher URL: https://doi.org/10.48526/rgu-wt-1585045
Award Date: Feb 28, 2003
