[Image courtesy of commons.wikimedia.org]
Arizona's AzMERIT scores have just been released, and the Department of Education has created such a detailed spreadsheet, it makes this old English teacher's head spin. Everything is broken down about as far as it can be broken down—by district, school and grade, then within those categories by gender, ethnicity, economic status, disability, English proficiency. Did I miss anything? Maybe; I haven't dug into the material in detail—and to give you an idea of what "in detail" means, just one page of the spreadsheet has more than 65,000 rows, with each row broken down into 13 columns. I'll be spending more time plowing through the breakdowns to see what I can discover. In the meantime, if you want to look for yourself, you can download the spreadsheet here.
So, taking a first, very general look, what do we learn from the AzMERIT scores?
Mainly, we learn that the rich stay rich and the poor, and their schools, are going to get blamed for their low test scores—again. Districts with lots of kids from high income families have higher passing rates than districts with lots of kids from low income families. In other words, there's nothing new under the AzMERIT sun. The same was true with the AIMS test scores, and the same is true around the world. No matter where you go, socioeconomic status is the most reliable predictor of student scores on standardized tests. The only difference in the U.S. is, the gap between the high income/high scorers and low income/low scorers is greater than in most other industrialized countries. (We're also the only country that spends less on its low income students than on its high income students.) But here in Arizona, conservatives pooh-pooh all those pesky facts. They tell us we shouldn't play the socioeconomic card when looking at test scores. "No excuses!" To them, low scores from poor kids mean: (1) those lazy kids aren't trying hard enough; (2) their schools are doing a terrible job teaching them and should be punished; and (3) their parents are lousy parents who don't care about their children.
We also learn that the AzMERIT test is tougher than the old AIMS tests and the cut scores are higher. This year, something like a third of Arizona students passed AzMERIT. On the 2014 AIMS test, 61 percent passed the math section and 79 percent passed the reading section. Unless someone put stupid juice in the kids' drinks last year, the overall difference in the passing rates is a function of the tests, not the students' skills and abilities.
Not surprisingly, around the Tucson area, passing rates for the more affluent districts beat the state average. The above-state-average districts, starting with the highest passing rates and working down, are Catalina Foothills, Tanque Verde, Vail, Amphi, Marana and Sahuarita. The below-state-average districts, starting with the highest passing rates, are Flowing Wells, TUSD and Sunnyside.
Now here's one interesting change I have to spend more time looking into. Flowing Wells and Nogales districts are always held up as the shining achievement stars whenever people question the correlation between socioeconomic status and test scores. "Yeah, but what about Flowing Wells and Nogales? If they can beat the odds and score higher than similar districts, why can't everyone else?" This year, Flowing Wells scored just a few points higher than TUSD, and Nogales scored a few points lower. It looks like both are off their AIMS games when it comes to AzMERIT. Why?
With Nogales, the answer is simpler. The district was caught cheating on the AIMS tests, meaning its previous scores were bogus. The current scores are probably more valid than the old AIMS scores were. It's an object lesson we've seen in low income, high scoring districts around the nation. All too often, the high stakes pressure surrounding high stakes tests leads teachers and administrators to try to raise scores by doing more than just teaching to the test to the exclusion of pretty much everything else. It moves them to cheat. And too often, schools and districts that are being honest are told they're failing districts because, "Look at Nogales, or Atlanta, or Washington D.C. Why can't you do as well with your kids as they're doing with similar kids?" The answer is, those other districts were cooking the books.
With Flowing Wells, I honestly don't know the reason the scores are down relative to other districts. Maybe it was just an off year for the district and next year it'll return to its stellar performance. Or maybe the district had become expert at preparing students for the AIMS test, but it hasn't developed similar strategies for coping with the new test. In other words, maybe Flowing Wells' previous high scores reflected something other than the actual academic achievement of its students, and these scores are a more accurate reflection of how their students compare with students in other districts.
Enough for now. Everything I've written is tentative and incomplete, subject to revision and/or elaboration in further posts. There's much more to be dug out of the data. Stay tuned.