Imagine I'm a public school math teacher, and I just gave my students a test worth 100 points. These classes of mine don't have strong math skills. With the students needing 50 points to pass, only 14 percent of them make it, and most of the passing students squeaked by just above the cut line, with between 51 and 59 points.
Next year I give the same test to my classes, which are basically identical in math skills to last year's batch. But this time I decide, damn it, 50 points shouldn't be enough to pass. I'm going to set the passing cut score at 60. That moves most of the students who would have passed with the previous cut score into the ranks of the failing. Only 2 percent pass.
The question is, should I be more concerned about the math skills of my classes with 2 percent passing rates than the previous classes where 14 percent passed? Obviously not. Their skill levels are basically identical. Only the score it took to pass changed. But while a 14 percent passing rate is met with shaking of heads and clucking of tongues, a 2 percent passing rate makes people crazy. Contact the media! Call out the public school haters! It's time to scream, "Oh my God, look at these scores! What happened? Shame on those kids! Shame on their schools! Shame on their parents!"
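The arithmetic above can be sketched in a few lines of Python. The score distribution below is invented purely to match the hypothetical figures: most passers cluster just above the old cut line, so a ten-point hike in the cut score wipes out nearly all of them.

```python
# Toy illustration of the cut-score effect. The distribution is
# invented: 86 students score 45, 12 score 55 (just above the old
# cut), and 2 score 65 -- 100 hypothetical students in all.
scores = [45] * 86 + [55] * 12 + [65] * 2

def pass_rate(scores, cut):
    """Percent of students scoring at or above the cut score."""
    return 100 * sum(s >= cut for s in scores) / len(scores)

print(pass_rate(scores, 50))  # old cut score: 14.0 percent pass
print(pass_rate(scores, 60))  # new cut score: 2.0 percent pass
```

Nothing about the students changed between the two lines; only the cut score moved.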
That, in a nutshell, is what's happening with the AzMERIT scores and the soon-to-come tsunami of shaming.
The process has already begun, though, believe me, it's only in its infancy. An article in Friday's AZ Republic, AzMERIT: Poor, rural districts feel burden of new test, is an example. It's a sympathetic, hand-wringing forerunner to the upcoming onslaught of shame. The article looks mainly at the AzMERIT scores on the San Carlos Reservation and in the Baboquivari Unified School District. In the San Carlos schools, 6 percent of the students passed the English section and 2 percent passed the math section. In Baboquivari Unified, 7 percent passed the English and 8 percent passed the math. The article talked about how difficult it is for the districts to deal with these results, how hard both school systems have worked to boost their students' achievement. And yet, look at these disappointing passing rates, so much lower than the previous year.
But if you look back on last year's AIMS scores, you'll see that these new low scores are perfectly predictable. They're not a sign the students' achievement levels are lower this year, and they don't mean the students did worse on the AzMERIT test than on AIMS. What they mean is, the passing level — the cut score — was raised for the new test. If the students had been given one of the old AIMS tests with a raised cut score, the way I raised the cut score on my imaginary math test at the beginning of the post, pretty much the same thing would have happened. It's not about the students and it's not about the new test. It's all about raising the tests' cut scores.
In San Carlos, 14 percent of the students passed the 2014 AIMS math test, compared to 2 percent passing the AzMERIT. But if you look at the breakdown of the AIMS passing scores, you find that 13 percent simply met the math standard and only 1 percent exceeded it. So when the bar moved up on the new test, what happened, basically, is that 12 of the 13 percent who just met the standard fell below the new line, and only 1 percent joined the 1 percent who had exceeded it. If you go through the AIMS math and reading scores for the two districts, you find the same situation: between 10 and 47 percent meeting the standards and between 1 and 3 percent exceeding.
Ready for some more numbers? Here are some cross-district comparisons of the number of students who met and exceeded the AIMS standards. In the San Carlos and Baboquivari districts, the ratio of students meeting the AIMS math standards to those exceeding them is about 10-to-1: for every ten students who just met the standard, only one exceeded it. In TUSD, the ratio is 3-to-1, meaning one passing student in four not only met the cut score but exceeded it. In Catalina Foothills, more students exceeded than simply met the standards. For the AIMS English standards, the meets-to-exceeds ratio in San Carlos and Baboquivari was about 40-to-1. In TUSD it was 9-to-1. In Catalina Foothills, it was 2-to-1.
So if you raise the cut scores, it's obvious the passing rates will fall the most in San Carlos and Baboquivari because those districts had the most students who just barely passed the AIMS test, while significantly fewer would fall in TUSD and far fewer in Catalina Foothills. That's exactly what happened. You can find the same relationship in most districts around the state.
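The relationship can be put numerically. The San Carlos figures below are the rough AIMS math percentages quoted above; the second district's numbers are invented to stand in for a Foothills-like distribution, and the simplifying assumption, labeled in the code, is that the higher cut score fails everyone who merely met the old standard while passing everyone who exceeded it.

```python
# Toy model of why a high meets-to-exceeds ratio magnifies the drop.
# Simplifying assumption: the new, higher cut score fails everyone
# who merely "met" the old standard and passes everyone who
# "exceeded" it.

def passers_lost(meets_pct, exceeds_pct):
    """Fraction of former passers who fail once the cut score rises."""
    return meets_pct / (meets_pct + exceeds_pct)

san_carlos = (13, 1)       # AIMS math: 13% met, 1% exceeded (from the post)
foothills_like = (10, 15)  # invented: more students exceed than merely meet

print(f"{passers_lost(*san_carlos):.0%} of San Carlos passers fall below the new line")
print(f"{passers_lost(*foothills_like):.0%} of the Foothills-like passers do")
```

Under that assumption the district whose passers cluster just above the old cut line loses more than nine passers in ten, while the district where most passers exceeded the standard loses well under half.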
In other words, very little has changed in Arizona student achievement as measured on the state's high stakes tests except for a significant hike in their cut scores. Student achievement hasn't slipped in the highest or the lowest scoring schools. In fact—you may have caught this about a month ago—in the most recent scores on the national NAEP test, the most respected standardized test in the country, most states' scores either remained the same or slipped a bit. Unexpectedly, Arizona was one of the few states where the scores actually went up. Why? Good question. No one really knows. But if we want to choose a test to measure student achievement across the state, we would do better to focus on the NAEP, which has been carefully normed since the 1970s and which students can't be prepped for, rather than the new AzMERIT, an untested test with newly raised expectations. That would mean that people should be proud of Arizona's recent boost in achievement as measured by the NAEP and categorize the AzMERIT scores as an interesting set of numbers but not terribly important.
Back when Bush's No Child Left Behind began, many of us said that the real reason for the test was to have lots of public school students fail so the conservatives could use the results to say how lousy our schools are. I still believe that was true then, and it looks even more true now with the new scores tied to the Common Core standards. The results on this year's tests across the country are similar to what we've seen here. Schools look worse this year than last, and the lowest scoring schools look positively dreadful. We've not only raised the bar on the tests. Simultaneously, we've raised the level of student/school/parent shaming.