COMPUTER ASSISTED LANGUAGE LEARNING, vol. 37, no. 4, pp. 961-985, 2024 (AHCI)
Feedback is generally regarded as an integral part of EFL writing instruction. However, giving individual feedback on students' written products can be a demanding, if not insurmountable, task for EFL writing teachers, especially in classes with large numbers of students. Several Automated Writing Evaluation (AWE) systems that provide automated feedback on written texts have been developed to reduce the time and effort teachers need to give individual feedback on students' writing. Employing a quasi-experimental research design, this study aimed to examine how automated feedback affected students' writing scores and writing accuracy. The data were collected from 75 Turkish EFL university students. The experimental group received combined automated-teacher feedback, while the control group received full teacher feedback. Both quantitative and qualitative data were collected through pre-test/post-test writing tasks, Criterion error analysis reports, and student reflections. The results revealed that the students who received combined automated-teacher feedback improved their analytic writing scores as much as the students who received full teacher feedback. Moreover, combined automated-teacher feedback was more effective than full teacher feedback in reducing the students' grammar and mechanics errors. The qualitative findings from the student reflections on the Criterion feedback shed light on its impact on writing improvement. The study offers implications for the effective use of AWE in EFL writing classrooms.