IELTS Writing Revision Platform with Automated Scoring and Feedback
Analysis
This paper addresses the limitations of traditional IELTS preparation by developing a platform with automated essay scoring and personalized feedback. It traces the iterative development process from rule-based to transformer-based models and the resulting gains in scoring accuracy and feedback effectiveness. The study's focus on practical application, and its use of Design-Based Research (DBR) cycles to refine the platform, are noteworthy.
Key Takeaways
- The platform uses an Automated Essay Scoring (AES) system and provides targeted feedback based on the IELTS writing rubric.
- Development progressed from rule-based to transformer-based models, significantly improving scoring accuracy.
- Adaptive feedback implementation showed statistically significant score improvements, though effectiveness varied.
- Automated feedback is best used as a supplement to human instruction, particularly for surface-level corrections.
“Findings suggest automated feedback functions are most suited as a supplement to human instruction, with conservative surface-level corrections proving more reliable than aggressive structural interventions for IELTS preparation contexts.”
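To make the rule-based starting point concrete, a scorer of that kind might combine a few surface features of an essay into a band-like estimate. The sketch below is purely illustrative: the feature choices (word count, lexical diversity), the thresholds, and the weighting are hypothetical and are not taken from the paper.

```python
def rule_based_band_estimate(essay: str) -> float:
    """Toy rule-based scorer: combines essay length and lexical
    diversity into a rough IELTS-style band estimate.
    All thresholds and weights here are hypothetical."""
    words = essay.lower().split()
    if not words:
        return 0.0
    word_count = len(words)
    # Type-token ratio as a crude proxy for lexical resource.
    type_token_ratio = len(set(words)) / word_count

    # Hypothetical rubric: Task 2 essays should exceed 250 words,
    # and each feature contributes up to half of a 9-band scale.
    length_score = min(word_count / 250, 1.0) * 4.5
    diversity_score = type_token_ratio * 4.5
    band = length_score + diversity_score
    return round(band * 2) / 2  # IELTS bands move in 0.5 steps
```

A transformer-based model replaces hand-picked features like these with learned representations of the full essay, which is consistent with the accuracy gains the paper reports for that transition.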