Readability Annotation: Replacing the Expert by the Crowd
- van Oosten, P., & Hoste, V.
- Proceedings of the Sixth Workshop on Innovative Use of NLP for Building Educational Applications
- Association for Computational Linguistics (Portland, Oregon)
This paper investigates two strategies for collecting readability assessments: an Expert Readers application, intended to collect fine-grained readability assessments from language experts, and a Sort by Readability application, designed to be intuitive and open to anyone with internet access.
We show that the data sets resulting from the two annotation strategies are very similar, and conclude that crowdsourcing is a viable alternative to the opinions of language experts for readability prediction.