For a statistician, Andrew Gelman of Columbia University has an apparent discomfort with data. What else can explain his recent piece in Vox, co-written with former math teacher Mark Palko, arguing against high-performing charters and test-based accountability based entirely on anecdote?
The piece opens by recycling the well-known negative narrative about New York City’s Success Academy charter schools stemming from a video showing a teacher losing her cool with a first-grader and an entirely unrelated anecdote from an unhappy parent.
They then hang their thesis on this remarkably thin branch:
The data seems to show that Success Academy thrives by a combination of kicking out poorly performing students and training the remainder to perform well on tests that kids at other schools don’t really care about—or don’t care as much about.
Note the phrase “seems to show,” which does not strike me as statistically rigorous. Moreover, it’s not clear what “data” they are referring to since they offer none. And I have no idea how they know what kids do or don’t “care as much about.”
They follow with a vague reference to “other critics” and a link to an opinion column alleging that Success Academy pushes out special needs kids, again based on one example.
Having established to their own satisfaction, if not the reader’s, that Success Academy is gaming the system, they go on to say that its remarkable test-score data is compromised.
They continue with their attack based on “a look at the data,” but they offer no data nor any links to data. So what exactly did they look at? It’s anyone’s guess.
They point out the stunningly obvious fact that any test purporting to measure “college and career readiness” cannot fully prove what it claims unless you wait 20 years and study whether kids actually complete college and thrive in their careers.
Point taken, gentlemen. I guess that also means we can’t prove global warming until we are underwater, though we could collect some anecdotes from hungry polar bears about shrinking ice caps and the shortage of fresh seals.
Gelman and Palko then declare that the “no-excuses” charter school model is “not normal” and therefore the results are not valid. They assert, without any evidence, that at Success Academy, “the focus on standardized tests is relentless.”
How do they know this? Did they visit the school? Interview the principal and teachers? Enroll their own kids at Success?
They add that Success Academy students “don’t do so well on tests that matter to the students themselves.” Which tests “matter” to kids? How do they know? Did they survey them? No links, no data. Not even any anecdotes.
They follow with a bad metaphor—“tampering with the speedometer won’t make the car go faster”—and broadly conclude that the Success Academy model “can lead to more data corruption.”
I suppose anything “can” lead to data corruption, but did it? No one knows. Who cares? Their message is clear: Success Academy test scores don’t count.
Ironically, the only actual data they link to in their opinion piece is an op-ed from Success Academy founder Eva Moskowitz showing that her charter schools collectively rank in the top 0.3 percent of schools statewide in math and the top 1.5 percent in English.
At last—actual data—and wow, is it impressive. Poor, inner-city students outperforming wealthy suburban kids who have more enrichment opportunities, better-paid teachers and better-resourced schools. The only possible conclusion for Gelman and Palko: Success Academy must be gaming the system.
Entirely missing from the story is any consideration of how many traditional schools all across America do the very same things they accuse Success Academy of doing: excessive test prep, drill-and-kill teaching, no-excuses discipline and pushing out weaker students into alternative settings.
And yet, these traditional public schools have not closed achievement gaps the way Success Academy has.
It would “seem” that the authors have no interest in the 94 percent of schools educating the vast majority of American schoolchildren. It would seem that their real mission is to discredit charter schools like Success Academy and undermine test-based accountability.
If, on the other hand, they really want to offer a balanced perspective on charter schools, accountability and American education, perhaps they can do something noticeably lacking in their article: spend a little less time stringing together anecdotes and a little more time crunching data.