For our CS224D final project, Lucio and I took on Kaggle’s Automated Essay Scoring competition. We tried to build a model that grades essays automatically: you input an essay and voila, it outputs a score. The dataset we have covers essays written by students in grades 7 to 10, but the approach generalizes. It could be used to grade SAT/ACT practice essays, or essays of any kind, as long as we have enough training data.
The competition was held in 2012, with 154 participating teams. The evaluation metric was quadratic weighted kappa (QWK), which measures agreement between the human scores and the automated scores. The winning team achieved a QWK score of 0.81407.
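For readers unfamiliar with the metric, here is a minimal sketch of quadratic weighted kappa. The function name and the integer-rating interface are our own; the competition's exact implementation may differ in details such as rating-range handling.

```python
import numpy as np

def quadratic_weighted_kappa(human, machine, min_rating, max_rating):
    """Quadratic weighted kappa between two lists of integer ratings."""
    human = np.asarray(human, dtype=int)
    machine = np.asarray(machine, dtype=int)
    n = max_rating - min_rating + 1
    # Observed agreement matrix O: O[i, j] counts essays a human rated i
    # (offset by min_rating) and the machine rated j.
    O = np.zeros((n, n))
    for h, m in zip(human, machine):
        O[h - min_rating, m - min_rating] += 1
    # Quadratic penalty: disagreements are weighted by squared distance.
    W = np.array([[(i - j) ** 2 / (n - 1) ** 2 for j in range(n)]
                  for i in range(n)])
    # Expected matrix E: outer product of the two marginal histograms,
    # i.e. the agreement expected by chance.
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# Perfect agreement gives kappa = 1.
print(quadratic_weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4], 1, 4))  # → 1.0
```

A QWK of 1 means perfect agreement with the human graders, 0 means chance-level agreement, and negative values mean systematic disagreement.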
We got a score of 0.9447875 using a simple two-layer neural network with 300-dimensional word vectors.
Our two-layer neural network’s performance with different word vector sizes.
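To give a sense of how small such a model is: a two-layer network over an essay representation can be written in a few lines of NumPy. The hidden size, the tanh nonlinearity, and the mean-of-word-vectors essay representation below are our illustrative assumptions, not necessarily the exact architecture we used; the paper has the details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 300-d word vectors, 50 hidden units, one score output.
D, H = 300, 50
W1, b1 = rng.normal(0, 0.01, (H, D)), np.zeros(H)
W2, b2 = rng.normal(0, 0.01, (1, H)), np.zeros(1)

def score_essay(word_vectors):
    """Forward pass: average the essay's word vectors, then apply a
    two-layer network with a tanh hidden layer and a scalar output."""
    x = word_vectors.mean(axis=0)      # essay representation: mean word vector
    h = np.tanh(W1 @ x + b1)           # hidden layer
    return (W2 @ h + b2).item()        # scalar score (regression output)

# Toy usage: an "essay" of 20 random 300-d word vectors.
essay = rng.normal(size=(20, D))
print(score_essay(essay))
```

The weights here are random; in practice they would be trained by backpropagation against the human-assigned scores.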
Does that mean that if we had joined the competition, we would be in the money? I don’t think so, for several reasons:
- The competition was held four years ago, and the machine learning/NLP scene was much different back then. We couldn’t find any paper by the winning team, so we don’t know what models they used, but we doubt they used neural networks. They probably used hand-engineered features. We also trained our own word vectors, initializing them with pre-trained GloVe vectors, which weren’t available four years ago.
- We didn’t have access to the real test set those teams were evaluated on. We got the training dataset and held out part of it as our test set. Our models might not do as well on the real test set.
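Initializing an embedding matrix from pre-trained GloVe vectors, as mentioned above, is straightforward: words found in the GloVe file get their pre-trained vectors, and out-of-vocabulary words get small random vectors. This is a sketch under our own assumptions (function name, random-init scale); GloVe files are plain text with one word followed by its vector components per line.

```python
import numpy as np

def load_glove(path, vocab, dim=300):
    """Build a (len(vocab), dim) embedding matrix, filling rows from a
    GloVe text file where available and leaving random init otherwise."""
    rng = np.random.default_rng(0)
    emb = rng.normal(0, 0.1, (len(vocab), dim))   # random init for OOV words
    index = {word: i for i, word in enumerate(vocab)}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            if word in index and len(values) == dim:
                emb[index[word]] = np.asarray(values, dtype=np.float32)
    return emb
```

The resulting matrix can then be fine-tuned along with the rest of the network, which is what we did rather than keeping the GloVe vectors frozen.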
Nonetheless, we were surprised at how well a simple neural network did. None of our models used more than 1000 lines of code.
We are thinking of trying to get an SAT/ACT essay dataset so we can train our models to help students practice for the SAT/ACT. If you have any leads on where to obtain such datasets, please let me know. We also wrote a paper detailing what we did; if you are interested in reading it, shoot me an email.