Abstract
This paper analyzes the quality of scores assigned to open-ended exam questions using the Many-Facet Rasch (MFR) model. The data comprise scores assigned to essays written by candidates in the 2015 college entrance examination of Londrina University. The MFR model supports analyses at both the group and the individual level, making it possible to identify raters with biased behavior, a known source of significant error in the scoring of written tasks. Group-level analyses showed that the evaluation was efficient and that the data are, in general, suitable for Rasch measurement, while individual-level analyses identified raters whose scores departed from the average scores of the other raters. The MFR model proved to be an appropriate and effective tool for monitoring the quality of scores assigned to writing tasks.
Keywords: Many-Facet Rasch Model; open questions; essay tests; large-scale evaluation; rater tendency
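For orientation, a minimal sketch of the standard many-facet Rasch formulation for rated responses, following Linacre's facets model; the notation below is the conventional one and is an assumption of this sketch, not notation taken from the paper:

\[
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = \theta_n - \delta_i - \alpha_j - \tau_k
\]

where \(\theta_n\) is the ability of examinee \(n\), \(\delta_i\) the difficulty of task \(i\), \(\alpha_j\) the severity of rater \(j\), and \(\tau_k\) the threshold for moving from score category \(k-1\) to \(k\). The rater facet \(\alpha_j\) is what makes the individual-level analysis described in the abstract possible: raters whose estimated severity, or whose fit statistics, deviate markedly from the group can be flagged as potentially biased.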