Wednesday, November 4, 2015

Incorrect Data Isn't Useful

The other day I went to the Health Center for a follow-up checkup. I had been in previously and had gotten some antibiotics for an insect bite that got infected. Simple, right? As part of the visit, the nurses are instructed to take routine weight and blood pressure measurements.

I know my blood pressure, so I was surprised that her diastolic reading was 20 points lower than it normally is. I remarked on that. Her reply was "Lower score is good, right?" in a tone of voice that conveyed clearly that I shouldn't be questioning her.

I'm thinking, "Sure ... unless it's a bad measurement." I get that BP measurement is inexact, but it's a bit silly to refuse to re-measure when the patient points out a discrepancy. Twenty points can make all the difference in a doctor's diagnosis of my overall health.

I decided that I would request the printout from the front desk as I left, the one with all of the day's numbers and decisions from the visit. I read the scale ... same weight as two weeks ago. On the printout, though, it was different ... she had obviously transposed digits when entering the data. In two weeks, I had "gained 23 pounds" and then lost it again in the 30 minutes it took to drive home. My blood pressure changed by 20 points, and that's a lot.
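This is exactly the kind of error a trivial plausibility check at data entry would catch. Here's a minimal sketch in Python; the readings, the threshold, and the function names are all hypothetical, not anything the Health Center actually runs:

```python
# Minimal sanity check for manual data entry. All readings, thresholds,
# and names here are hypothetical.

def plausible_change(previous_lbs: float, entered_lbs: float,
                     max_change_lbs: float = 10.0) -> bool:
    """Flag an entry whose change since the last visit is implausibly large."""
    return abs(entered_lbs - previous_lbs) <= max_change_lbs

def looks_transposed(a: int, b: int) -> bool:
    """True if b's digits are a rearrangement of a's digits (a common typo)."""
    return a != b and sorted(str(a)) == sorted(str(b))

previous, entered = 158, 185   # hypothetical weights, two weeks apart
if not plausible_change(previous, entered):
    print(f"{entered} lbs differs from the last visit ({previous} lbs) "
          f"by {abs(entered - previous)} lbs; please re-enter.")
    if looks_transposed(previous, entered):
        print("The digits may have been transposed.")
```

A nurse could always override the warning, but at least the number would get a second look before it landed in my chart.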

Bad data makes for inappropriate diagnosis.

Bad data makes for bad education policy, too.

The state of Vermont is "suffering" through the release of the first round of SBAC scores, despite our scores being better than most other states' (we're usually top 5). "Results are much lower," and already my principal is bitching about it, despite having declared at the time, "We don't care what the scores are; we just want to get the process right." (I'm paraphrasing, but that was the intent.)

I'm all for improvement, but I hate basing change on bad data. Our diagnosis is flawed because our data is flawed, and the prescription runs counter to other policies the State has imposed.

First, the SBAC has measurement errors, just like my nurse did. Many students took the test knowing that the scores would not be held against them, that there was absolutely no chance anyone would see the scores in fewer than six months or act on them to set courses for this year or college applications. Additionally, the test is drastically different in format: it's completed entirely online, on a Chromebook.

There are no multiple-choice questions, and kids can have scratch paper, but they're not used to doing math that way. There's a lot of "drag the factors to the answer box" and "write three paragraphs explaining why you know this is a straight line" ... and few students can stretch an explanation that far.

Second, and just as important, the SBAC "passing scores" were decided upon after the fact, to make the percent-passing numbers match what the state had decided they should be ...

That's right. Before the kids even took the test, they told us there would be a statewide passing rate of 33% on the HS math test. Then they set the cut score to match.
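To see how mechanical that is, here's a rough sketch in Python of cut-score setting done in that order. The score distribution is invented; the point is that once the target rate is fixed, the cut score is just a quantile of whatever scores come in:

```python
# A sketch of setting the cut score *after* the results are in, so that a
# predetermined passing rate comes true. The score data is invented.
import random

random.seed(1)
scores = sorted(random.gauss(2500, 100) for _ in range(10_000))  # fake scale scores

target_passing_rate = 0.33            # announced before anyone took the test
cut_index = int(len(scores) * (1 - target_passing_rate))
cut_score = scores[cut_index]         # the quantile that makes 33% "pass"

passing = sum(s >= cut_score for s in scores) / len(scores)
print(f"cut score: {cut_score:.0f}, passing rate: {passing:.1%}")
# ~33% pass by construction, no matter what the students actually knew.
```

With that procedure, the passing rate tells you about the decision, not about the students.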

Third, add in the fact that we are a small school and we pride ourselves on being able to provide a more personalized education than your average public school, including having personalized learning plans that had quite a few students taking Algebra 2 as seniors. I'm sure you can see where this is going: many of our kids were taking a test heavily based on mathematics they hadn't seen yet.

This runs directly counter to another major initiative in the State of Vermont, the Personalized Learning Plan. Sometimes called the "Personal Pathway to Graduation," the initiative requires schools to design a different course pathway to graduation for each student as appropriate. This includes allowing schools to schedule certain kids into a faster progression for math and others into a more moderately paced path that might not include Algebra 2 at all. It means that "pre-algebra, Algebra 1, Geometry, Algebra 2" might be the most appropriate sequence for a student, which puts Algebra 2 in senior year, after the junior-year test.

Taking that approach and then complaining that students don't know Algebra 2 by March of their junior year is silly.

It's the rhetorical equivalent of reading a graph showing that Pre-calculus students do better on the NAEP and then concluding that we must make sure every student takes Pre-calculus by the time state tests are given in junior year ...

which a previous principal actually said.
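The flaw is selection bias: the students who reach Pre-calculus by junior year were already the strongest math students, so of course they score higher. A toy simulation in Python (all numbers invented) shows the gap appearing even when the course itself adds nothing:

```python
# Toy model of the selection-bias fallacy: ability drives both course
# placement and test scores; the course itself has zero effect here.
import random

random.seed(42)
abilities = [random.gauss(0, 1) for _ in range(100_000)]

# Stronger students are the ones placed in Pre-calculus.
precalc = [a for a in abilities if a > 0.5]
others = [a for a in abilities if a <= 0.5]

def test_score(ability: float) -> float:
    """Score depends only on ability plus noise; no course effect at all."""
    return 150 + 10 * ability + random.gauss(0, 5)

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

print(f"Pre-calc group mean:  {mean(test_score(a) for a in precalc):.1f}")
print(f"Everyone else mean:   {mean(test_score(a) for a in others):.1f}")
# The Pre-calc group scores higher even though the course changed nothing,
# so mandating Pre-calc for everyone wouldn't move the scores.
```

Reading the gap as a course effect, and then mandating the course, gets the causation exactly backwards.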

So when the bright bulb in the room points out that we teachers should prepare the kids for college and careers, that we should have prepared them better for this test, and that "if you hold the kids to a higher standard, they'll rise to meet that standard," I will calmly channel Donald Rumsfeld and say that "you go to war with the students you have, not the students you wish you had."

Finally, the teachers are not allowed to know what's on the test. I don't want to teach to the test, but I'd like to know what is included. I'll give the same assessments I would already have planned, but I might change some questions to a similar format, for example.

Also, I'm not willing to just take their word for it that the test is appropriate. We can't check for bad questions that might have tripped up our students, and we can't verify whether the answers they gave were scored correctly. We have to take Pearson's word that the scorers actually knew what they were doing. After reading Todd Farley's Making the Grades: My Misadventures in the Standardized Testing Industry, and other books with similar tales of the realities of corporate test-making and scoring, I'm not particularly willing to do that.

The NY Regents is an example of a relatively open and transparent test-making system, and it still contains plenty of errors; because the exams are public, NY teachers can catch those problems and get them fixed. Given Mr. Honner's long-running series reviewing the NY State Regents exams in mathematics, why should we expect the SBAC tests to be perfect when there is no chance for that kind of oversight?

The SBAC is a closed system with no accountability. It scores the tests in strange ways, fails to take into account the realities of the students, will not allow anyone to analyze or even examine any of the questions (unlike the SAT, which I can see in its entirety within a few weeks), spits out pre-determined results that do not reflect student abilities, and makes everyone wait an unconscionably long time for those results ... much too long for the school to do anything with them.

I can't use the scores because they aren't detailed enough, timely enough or accurate enough.

I guess I'll just teach math and ignore all that bluster.
