Sunday, February 6, 2011

Testing, Scoring and Trusting the Data

A fascinating case study: the NY Regents exams, how they're scored, and the unintended (or purposeful) consequences of one line in the instructions to the scorers.

First, here's the graph of the number of kids getting each score (from WSJ).  The issue can be seen quite clearly. The graph overall is a typical left-skewed distribution, as you'd expect from this type of test. The trouble comes when you explore that jump in the middle at the passing mark.

Of course, the nattering class is all up in arms over this, claiming fraud and misconduct.  You can almost hear jeers of "Union Bastards trying to save their jobs by lying on the tests."  Unfortunately for those people, the reason comes down to one sentence:
"[State officials] note that the state actually requires teachers to regrade certain Regents tests where the student barely fails in order to check for grading errors."
When you single out tests for special consideration, and the stated focus is to look for "grading errors" on "barely failing" tests, the only way for the scores to change is up.  Since it's easiest to find one or two extra points for a 63 or 64, it's logical that those scores would be the most affected.


Looking at the graphs, you can see that some teachers (probably a school at a time) set their cutoff for regrading at 50, while the majority set it at 55.

Why the negative slope in that region? When you're looking for ambiguous answers that could be scored higher, you have to rescore problem after problem until you've found enough points, and it's a lot easier to find one extra point than ten.

Similarly, since there was no reason to recheck any more answers than it took to get the kid over the line, the spike lands right at 65, though it seems that many teachers weren't keeping close track of the extra points and brought the kid up to a 66 or 67.
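To see how that one instruction produces the shape in the graph, here's a minimal simulation sketch. Everything in it is assumed for illustration: the shape of the raw score distribution, the five-point "barely failing" window, and the chance that rechecking turns up an extra point. None of these numbers come from the actual Regents data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical left-skewed raw score distribution (most kids score in the
# upper range, with a long tail toward low scores).
raw = np.clip(np.round(100 * rng.beta(5, 2, size=50_000)), 0, 100).astype(int)

PASS_CUTOFF = 65        # the Regents passing mark
REGRADE_WINDOW = 5      # assumed: papers this far below the cutoff get rechecked
POINT_FOUND_PROB = 0.6  # assumed: chance each recheck turns up one more point

def regrade(score):
    """Recheck a barely-failing paper, stopping as soon as it passes
    (or as soon as no further ambiguous answer is found)."""
    while score < PASS_CUTOFF and rng.random() < POINT_FOUND_PROB:
        score += 1
    return score

adjusted = np.array([
    regrade(s) if PASS_CUTOFF - REGRADE_WINDOW <= s < PASS_CUTOFF else s
    for s in raw
])

before = np.bincount(raw, minlength=101)
after = np.bincount(adjusted, minlength=101)
for s in range(58, 70):
    print(f"{s:3d}  raw={before[s]:6d}  after regrade={after[s]:6d}")
```

Because a 64 needs only one found point while a 60 needs five, the counts just below the cutoff get hollowed out with exactly that negative slope, and everything that was found piles up at 65. Nothing in the rule lets a score move down.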

Conversely, if the teachers had been instructed to rescore all students within ten points of the cutoff instead of just those below it, then some students would have been adjusted downward, balancing out much of the upward movement.  If there were still an uptick at the 65 mark under that protocol, then and only then could you claim that teachers were deliberately mis-scoring to jigger their value-added (VA) measures. (And even then, I'd put the reason as teachers wanting to help students rather than being so coldly self-interested.)
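For contrast, here's a sketch of that hypothetical symmetric rule, using the same invented distribution and, again, made-up numbers: every paper within ten points of the cutoff, above or below, gets rechecked, and a correction is as likely to take a point away as to add one.

```python
import numpy as np

rng = np.random.default_rng(0)
raw = np.clip(np.round(100 * rng.beta(5, 2, size=50_000)), 0, 100).astype(int)

PASS_CUTOFF = 65
WINDOW = 10             # assumed: recheck every paper within ten points, either side
ERROR_FOUND_PROB = 0.5  # assumed: chance a recheck finds any grading error at all

def symmetric_recheck(score):
    """Recheck a paper near the cutoff; a correction can go either way."""
    if rng.random() < ERROR_FOUND_PROB:
        score += rng.choice([-1, 1])
    return int(np.clip(score, 0, 100))

adjusted = np.array([
    symmetric_recheck(s) if abs(s - PASS_CUTOFF) <= WINDOW else s
    for s in raw
])

before = np.bincount(raw, minlength=101)
after = np.bincount(adjusted, minlength=101)
for s in range(60, 71):
    print(f"{s:3d}  raw={before[s]:6d}  after recheck={after[s]:6d}")
```

Under this rule the counts near 65 stay roughly flat, so a spike at the passing mark would point to something beyond the regrading instruction.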

The place where I found the link to this article had this comment: "Teachers don't want to flunk kids that just barely miss the passing score. Until all responsibility for creating and scoring state exams is given to an independent body with no interest in the results of the tests, the results reported should be viewed skeptically."

Ummm, no. Scoring tests is really complicated.  Pearson, the biggest company in the business, uses part-time, barely-out-of-college, minimum-wage people to do the scoring. Getting the "right" score is more a matter of whether the scorer speaks English and actually knows the material.

Frankly, given the mess the testing industry is in when it comes to scoring, I have a feeling that the teachers are doing the more conscientious job. If you want a nasty introduction to the follies of testing-company scoring sessions, check out Todd Farley's "Making the Grades." It's well written but damn depressing if you're counting on accurate test scores because you're stuck in the hell of value-added and merit pay.

1 comment:

  1. Your post points to one of the reasons why SaveOurSchools has formed and we will be taking action to get the word out that America's public schools need to be supported. We have an upcoming Blog Campaign that we would "love" for you to participate in. It takes place this Monday, Valentine's Day, and it is called: "I Love Public Education Blog Day."

    Everyone who cares about young people cares about our schools. Our best schools nurture our children, make them feel safe, and help them take the risks they need to take in order to learn. But our schools are in danger of becoming even more narrowly focused on test preparation, while class sizes rise and teachers are blamed for the ravages poverty inflicts on their students.

    We are responding. We love our schools. We declare Valentine’s Day, 2011, to be I Love Public Education Blog Day. On this day we will write our hearts out, about why it is that public education is so important to us, our children, and our democratic society. If you or your readers will join us and tell why you love public education too, send your comments and posts to saveourschoolsmarch@gmail.com.

    Writing will be displayed at the www.SaveOurSchoolsmarch.org website, and will be tweeted with the hashtag #LovePublicEd. We offer the march and events of July 28 to 31st in Washington, DC, as a focal point for this movement, and we ask participants to link to this event, so we can build momentum for our efforts. If your readers wish to repeat this post on their own blog, we would love it!

    Thank you!
    Kelli Reyes
