That was one of the key findings from a new study of Automated Essay Scoring (AES) tools produced by eight commercial vendors and by Carnegie Mellon University.
While writing is an essential part of the educational process, many instructors find it difficult to incorporate large numbers of writing assignments into their courses because of the effort required to evaluate them. Yet the ability to convey information in writing is an important educational achievement in its own right, and one that other kinds of tests do not assess well. In addition, essay-based testing is thought to encourage better conceptual understanding of the material and to reflect a deeper, more applicable level of knowledge on the part of students. Grading and critiquing written work is therefore important not only as an assessment method but also as a feedback device that helps students learn both the content and the skills of thinking and writing. Nevertheless, essays have been neglected in many computer-based assessment applications because few techniques exist for scoring essays directly by computer. In this paper we describe a method for automated scoring of the conceptual content of essays. Based on a statistical analysis of the essays and of content information from the domain, the technique can provide scores that prove to be an accurate measure of essay quality.
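The passage above does not spell out how a statistical analysis of essay content yields a score. One common family of approaches (of which the Intelligent Essay Assessor's LSA-based method is a sophisticated example) compares a new essay against a set of pre-graded essays and assigns a similarity-weighted score. The sketch below is a deliberately minimal illustration of that idea using plain bag-of-words cosine similarity, not the actual system's algorithm; `score_essay`, `vectorize`, and `cosine` are hypothetical helpers introduced here for illustration only.

```python
from collections import Counter
import math


def vectorize(text):
    # Bag-of-words term frequencies: a crude stand-in for the
    # semantic vectors a real system would derive statistically.
    return Counter(text.lower().split())


def cosine(a, b):
    # Standard cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def score_essay(essay, graded_essays, k=3):
    """Score an essay as the similarity-weighted mean score of the
    k most similar pre-graded essays (hypothetical illustration)."""
    v = vectorize(essay)
    sims = sorted(((cosine(v, vectorize(text)), score)
                   for text, score in graded_essays), reverse=True)[:k]
    total = sum(sim for sim, _ in sims)
    if total == 0:
        # No overlap with any graded essay: fall back to the mean score.
        return sum(s for _, s in graded_essays) / len(graded_essays)
    return sum(sim * s for sim, s in sims) / total


# Tiny worked example with invented training data.
graded = [
    ("the heart pumps blood through arteries and veins", 5),
    ("blood moves in a circle", 3),
    ("plants use sunlight to make food", 1),
]
new_essay = "the heart pumps blood around the body"
print(round(score_essay(new_essay, graded, k=2), 2))
```

A new essay resembling the top-scored reference receives a score near 5, while one resembling the low-scored reference scores near 1. Real systems replace the raw word counts with a dimensionality-reduced semantic space so that essays can match on meaning rather than exact vocabulary.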
Automated Essay Scoring in Review
Landauer, T. K., Laham, D., & Foltz, P. W. (2003). Automated scoring and annotation of essays with the Intelligent Essay Assessor. In M. D. Shermis & J. C. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective (pp. 87-112). Mahwah, NJ: Lawrence Erlbaum Associates.