This is why I have such a visceral reaction to most education research. Principals, school boards and teachers (especially teachers) read preliminary research results and jump to wrong-headed conclusions. Then the school gets turned upside down to institute some "new" idea: let's build "open classrooms" so the children can learn cooperatively with the teachers "doing interdisciplinary work."
Education research relies on classroom anecdotes, faulty numbers
PETER BERGER, Vermont middle school teacher / Schenectady (NY) Sunday Gazette
Sept. 14, 2008
Education experts always claim their latest breakthrough is based on "what the research tells us."
Consider these specimens of actual education research:
* A study of juvenile crime determined that most of it happens after school, rather than during the day, when most juveniles are in school, or after 11 at night, when most juveniles are in bed.
* According to a sexuality specialist, teenagers who drink are more likely to have sex. This confirmed what teenagers themselves discovered at drive-in movies.
* Investigators probing adolescent behavior calculated that a 20-cent tax on six-packs of beer would lower gonorrhea rates for 15- to 19-year-olds by 9 percent.
Their findings rest on the assumption that teenagers who are thinking of having sex will decide not to if it costs them each an extra dime.
* A bestselling pediatrician-turned-education-expert deduced there's no such thing as a lazy student.
His "science" tells him that children never "decide not to make an effort."
* British and American researchers concurred that overweight kids are more likely to be picked on.
* A Georgia team discovered that eighth-graders who study algebra tend to do better in "higher level" ninth-grade math classes. Also there appears to be a correlation between success in ninth-grade English and reading lots of books in eighth grade.
* ACT analysts determined that students who can read "complex" material are more likely to be ready for college than students who can't.
* Students rejected by their classmates are more likely to withdraw from school activities. Preschoolers whose parents drink and smoke are more likely to choose alcohol and cigarette accessories for their Barbie dolls.
* Students who rank in the bottom fifth of basic skills have a low probability of completing college. Kids with "academically oriented friends" tend to do better academically, while kids whose friends are "delinquent types" are more likely to wind up in trouble.
In 1993, research conclusively told us that girls were achieving less than boys and were victims of a gender gap. By 1994, these conclusions were under attack, and by 1999, the data were telling us that boys, not girls, were achieving less and were victims of a gender gap.
The research also validated single-sex schooling until a March 1998 report cast doubt on the value of single-sex schooling, a charge that was irrefutable until an April 1998 report confirmed the benefit in single-sex classes.
In 2001, the research demanded that schools "give single-sex classes a chance," except when it concluded with equal certainty that single-sex programs were a failure.
When most people think of research, they picture facts, figures and experimental results. Unfortunately, education's numbers come from standardized testing, which has proven so unreliable that its reputation for producing meaningful data lies in well-deserved ruin. When RAND concludes that today's standardized tests identify not "good" and "bad" schools, but "lucky" and "unlucky" schools, you've definitely got a data problem.
When education researchers aren't citing faulty numbers, they're basing their conclusions on feelings.
For example, those conflicting gender studies rested on notoriously unreliable student surveys and on evidence as weightless as the claim that "boys call out in class eight times more often than girls." That is why scholars and critics complained about flawed research claims, a small body of research and questionable findings.
Similarly, a 2004 evaluation of Maine's statewide laptop distribution announced that laptops made a "significant and positive impact" on the quality of work and student achievement. Except those rosy conclusions were based on the perceptions of teachers, parents, and students, on their "opinions, but not actual hard data." In other words, the evidence consisted of what students and teachers believed had happened, not on any documented improvement in student performance.
The American Educational Research Association even endorses a research tool they call "data poems." Employing this method, educators can "focus, interpret, clarify, and communicate qualitative research" by writing and reciting a poem.
Don't look for this species of research at a physicists' convention.
Not a science
Education research rarely satisfies real scientific standards. That's partly because education isn't a science. It's an art and a craft.
That doesn't mean that teachers don't need knowledge of their subjects, or that I can't improve my technique in the classroom.
But education research is fundamentally anecdotal, so what I observe for free in my classroom isn't necessarily any less valid or informative than an expensive study of someone else's classroom, especially when most of those studies are conducted by experts who have rarely, if ever, worked in a classroom.
The education establishment has lavished a fortune on research that's yielded mostly meaningless data and sentiment dressed up as evidence. Schools have squandered scant resources and time hopping on research-based bandwagons.
Even worse, decades of students have been the unwitting guinea pigs of a pseudo-science that more often suits education experts' philosophical preferences than it serves either students or the truth.
The nation, its schools and our students would be better served by common sense.
If the research tells us anything, that's it.