Friday, August 21, 2009

Data and Improving Teaching

Over on Right on the Left Coast, Darren spoke about his school's new program to track student scores and data:

"Today we were introduced to the new software that's going to track every student's performance, on any number of tests (not just state standardized tests), for the 13 years they're in school. I can see how my last year's students did, or how my current students did on last spring's tests."

A commenter asked:
"If there is some way to figure out how the students who are one grade above what you currently teach did on some of the math SUBtests (such as multiplying fractions, or dividing decimals or whatever), you may find a potential weak spot in your teaching of *this* year's students and head that off at the pass. Is the information that detailed?"

Information is rarely that detailed. The report will say "number operations," but you really don't know exactly what the question was. You're always left wondering how accurate the data is:

- Was the kid tired, lazy, trying? Did he care? (Scores don't count for the student up here - they don't have to pass or even take the test. They can just bubble in answers at random or fiddle.)

- Was the question worded strangely, or differently from the way we do it in class?

- What was the general trend and difficulty of the test? Up here in the frozen north, they don't release the whole test the way the NY Regents do, and we often don't see the results until five or six months later (test in November, scores by April). It makes it difficult to tailor next year's teaching when you can't get a sense of the target.
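For what it's worth, if an export ever were that granular, the subtest breakdown the commenter describes would be simple to script. Here's a minimal sketch in Python, assuming a hypothetical CSV of last year's results with one row per student per skill - the file name, the column names, and the cutoff are all made up, since no state system I know of hands data out in this form:

```python
# Hypothetical skill-by-skill breakdown of last year's subtest results.
# Everything here is assumed: the file name, the columns, the 60% cutoff.
import csv
from collections import defaultdict

def weak_skills(path, threshold=0.6):
    """Average percent-correct per skill tag and flag the low ones."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed columns: "skill" (e.g. "multiplying fractions") and
            # "pct_correct" (0.0 to 1.0, one student on that subtest).
            scores[row["skill"]].append(float(row["pct_correct"]))
    averages = {skill: sum(v) / len(v) for skill, v in scores.items()}
    return {skill: avg for skill, avg in averages.items() if avg < threshold}

if __name__ == "__main__":
    for skill, avg in sorted(weak_skills("last_years_subtests.csv").items(),
                             key=lambda item: item[1]):
        print(f"{skill}: class averaged {avg:.0%} - a candidate weak spot")
```

Even then, every "weak spot" it flags carries all the caveats above.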

If RoTLC can get more data from his program, it'd be neat. Ultimately, though, you can lose yourself in the aggregation and disaggregation and forget that kids change and that the data collection was suspect to begin with. The summer transition often makes those new 10th graders unrecognizable. Getting a job often turns 11th grade slackers into 12th grade students. Girlfriend issues cause far more changes than Education Commissioners do.

Data is wonderful, but students aren't data. All the data in the world might help you make some remedial work available, but four weeks later the kid 'gets it' and you don't need any of it.

RoTLC is going to hear a LOT about "Tracking" and "Don't let his past performance dictate his future." I suspect he will also be up to his MilSpec eyeballs in outside concerns that skin color, not ability, is affecting scores and placement. (With all the yellers forgetting that class is a much better indicator than skin color, but I digress.)

Finally, I'd say to everyone: There are good teachers and bad teachers, teachers who connected with Johnny and those who didn't. Don't assume anything from his old scores until you know Johnny, and really not even then. The only really useful assessments are the ones you give out yourself.

It's kind of funny that we assume past teaching wasn't good enough, and then try to improve our own teaching by poring over the results of that very teaching. We dwell on those past scores as if they were generated by the best teachers in the world using the best methods known to man. Seems like a disconnect to me.

I'm going to stick to teaching algebra, helping the weaker ones remediate, and pushing the group to do their best. I'll test them for my purposes, and when I'm done, I'll send them to the next teacher with a good background. Any difficulties the kids had last year - well, that's what these in-service days are for, in my opinion - talking about students and what they need.

I know that my course is listed as "Algebra I," not "Topics as they occurred to me." My kids need to move on. They will need certain knowledge and skills. If they don't have either, then I can't let them move into "Algebra II."
