card is anything but artificial grade deflation.
Grade deflation happens all the time. For example, I know that at one time, the Harvard Business School would fail students in the bottom 25% of their class, even if those students scored over 90% or something on their class assessments.
That's called grade deflation. It's every bit as pernicious as grade inflation.
It doesn't prove you have high standards. If you're a teacher, it proves you're a jerk. If you're a school, it proves you have some sort of tough-guy complex. If you're a state, it proves you're trying to further the narrative that Ohio public schools are failing and need to be reformed in dramatic ways.
Is it a coincidence that giving more Ds and Fs to districts will potentially allow hundreds more districts to become available for brick-and-mortar charter schools to open (it's currently at 39 districts) after the 2017-2018 school year?
Well, dear reader, I leave that analysis to you.
As I've said many times, if I created a test in my University of Akron course that I knew most kids would fail, I wouldn't be allowed to give the test, because it would be an unfair assessment of my students' understanding of the course material.
Yet we allow Ohio leaders to get away with saying "we're raising standards" to explain away the historically bad performance of Ohio's school districts -- a performance worse than anything Ohio school districts have produced in nearly 20 years of high-stakes standardized testing.
If you think it's really about higher standards (mind you, we've had these standards in place for two years now and passed them when I was in the legislature), then explain this: Not a single school district in this state received a higher performance index score this school year than they did in the 2013-2014 school year.
Is that really possible? That no school district in the state is doing a better job today at preparing kids for these tests than they were three years ago?
How do you know that the score drops don't mean a lot? Because school districts generally rank about where they always have -- wealthier ones on top, poorer ones on the bottom.
This leads to all kinds of statistical nonsense. For example, Cleveland's performance index score dropped by more than 27%, yet its state rank was exactly the same as in the 2013-2014 school year. Meanwhile, Firelands Local in suburban Lorain County had a slightly smaller, 26% score drop. Yet Firelands had the state's largest single rank drop -- from 161st to 528th (out of 609 districts).
And there are districts like Athens, which had a greater than 10% drop in their performance index score, but actually improved their state rank by more than 125 spots.
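The rank paradox above comes down to how densely packed the surrounding scores are, not the size of the drop. A quick sketch (using made-up scores, not real Ohio data) shows how a large percentage drop can leave a rank untouched while a tiny one sends a district tumbling:

```python
# Hypothetical performance-index scores (illustrative only, not real Ohio data)
# showing why equal-looking drops move ranks very differently.

def rank(scores, name):
    """1-based rank of `name` when districts are sorted by score, highest first."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

# A district far below the pack stays last no matter how far it falls ...
sparse = {"A": 110.0, "B": 108.0, "C": 75.0}
print(rank(sparse, "C"))   # 3 (last of 3)
sparse["C"] *= 0.74        # a 26% drop
print(rank(sparse, "C"))   # still 3

# ... while a district inside a tight cluster tumbles past many peers.
dense = {"D": 100.0, "E": 99.9, "F": 99.8, "G": 99.7, "H": 99.6}
print(rank(dense, "D"))    # 1 (first of 5)
dense["D"] *= 0.99         # a mere 1% drop
print(rank(dense, "D"))    # falls to 5
```

Because Ohio's districts cluster tightly through the middle of the performance index scale, modest score drops there can produce rank swings of hundreds of places, while districts at the extremes barely move.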
Overall, the biggest percentage score drops happened in the districts with the lowest scores already, with the bottom 10% of performance index scores in the 2015-2016 year seeing their scores drop nearly 25% since 2013-2014. Meanwhile, the state's highest performing districts this year only saw an average 6.2% score drop -- one-quarter the dip of the poorer districts. The median score dropped 13%, and the number of scores over 100 (out of a 120 maximum) dropped from 288 districts in the 2013-2014 year to 48 this last year.
Another issue: each successive standardized test has dropped scores further. Again, not surprisingly. The last year of the OAAs was 2013-2014, and districts performed as they traditionally have -- very well. In the 2014-2015 year, we had the PARCC exams, and scores dropped some then. Last year we had the AIRs (after complaints about the PARCC), and now we have the worst results ever. However, it's important to remember that the majority of school district grades remain A, B or C.
What we have here is artificial grade deflation posing as "tough standards." The standards have nothing to do with it. It's testing regimes kids aren't used to taking, coupled with tests that most kids were expected to do poorly on in the first place.
If teachers were giving these tests, they'd be fired. Deservedly so.
But when we attach real consequences to these results (a district's performance index score rank determines whether charter schools can open in that district, for example), our leaders accept them as a "real" indication of how districts are doing, because it fits in with their now 30-year-old narrative that the nation's public schools are failing.
And while in a few, isolated cases that narrative is true, in the vast majority of cases, nothing could be further from the truth.