We’ve made it to April!
In the spirit of this season of rebirth and renewal, in this week’s TL;DR blog post I ponder and prophesy on the current state and future of our field.
assessment, accountability, and other important stuff
I wrap up my March series, NAEP by the Numbers, with the number .05 and a discussion of significance and differences. The significance of NAEP lies far beyond score differences within and across years that are statistically significant at the .05 level. Much of what makes NAEP significant is that it is different. Different from state tests. Different from tests administered by schools and districts. It serves a different purpose. A purpose for which it is well-designed. Simply put, NAEP is NAEP.
“All good things must come to an end.” In the third post of my NAEP by the Numbers series, I ponder historically, philosophically, and a little bit technically on whether that time has come for what many believe are very good things – the NAEP trend lines.
In the second post of my NAEP by the Numbers series, I reflect on the NAEP 0-500 scales: both the Long Term Trend scale that stretches back to the 1970s and the new scale developed when NAEP began reporting state results some 35 years ago. At times, impressive. Other times frustrating. Love it or hate it, there’s nothing in our field quite like the NAEP scale.
For March, I’m planning a series of posts looking at NAEP by the numbers. The first two numbers are 20 and 10, as in the 20 students performing at the 10th percentile in reading and mathematics in a typical NAEP state sample. We’re all concerned that the bottom has been falling out of NAEP results, but my question is just how well we understand who those 10th percentile students are.
The 10th percentile, sitting out there 1.28 standard deviations below the mean, is kind of an abstract concept, but a classroom-size sample of 20 kids is something we should be able to wrap our heads around.
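The 1.28 figure falls straight out of the normal curve, and a few lines of standard-library Python make it concrete. (The mean of 250 and SD of 40 below are illustrative assumptions for a 0–500-style scale, not actual NAEP parameters.)

```python
# A quick sketch of where the 10th percentile sits under a normal
# distribution. Uses only the Python standard library.
from statistics import NormalDist

# z-score of the 10th percentile of a standard normal distribution
z = NormalDist().inv_cdf(0.10)
print(round(z, 2))  # -> -1.28, i.e. about 1.28 SD below the mean

# On a 0-500-style scale with an ASSUMED mean of 250 and SD of 40
# (illustrative values, not real NAEP statistics), that percentile
# would land near:
score = 250 + z * 40
print(round(score))  # -> 199
```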
Over the weekend, I set out to make a list of the test-related things that I was thankful for this Thanksgiving. That daunting task proved more difficult than I anticipated. The constant attacks on testing have become more subtle and a little more muted, but they persist. Everything about testing seems up in the air, with changes in technology happening faster than our ability to process them. Then I remembered that the past 25 years were focused on compliance. The chaos itself is something to be thankful for. But let’s dig a little deeper than that.
Are the words test and assessment interchangeable, comparable, synonyms? Seems like a rather innocuous question. My take on it last week, however, hit that blogging sweet spot between striking a chord and striking a nerve. When that happens, there’s only one thing for a self-respecting blogger to do: double down, add a bit of meat to the bone, and tackle the topic again.
I have little doubt that the future of standards-based assessment is going to be much more complex, multi-dimensional, and, well, messy than the testing situations with which we are familiar and comfortable. That likely means we are going to have to lean more heavily on those other forms of linking educational assessments that don’t fit under the category of equating. As we are revising the Standards, now might be a good time to consider what linking might look like as we shift our attention from testing to student assessment.
This morning, I had the privilege of speaking (via Zoom) at the 8th International Association for Innovations In Educational Assessment conference in Nigeria. The theme of this conference was Assessment in the Era of Artificial Intelligence. While many hear AI and picture a brave new world of an assessment future heretofore unimagined, I find myself dreaming of the envisioned assessment past that was never fully realized and wondering, with the support of AI, why not. Why not now? Why not us? Why not state-supported, school-based assessment as the norm?
In this week’s edition of How Charlie’s Mind Works we see how attending a conference on balanced assessment systems, discussing a blog post reflecting on said conference, listening to an episode of Freakonomics on air traffic control, attending a TAC meeting, preparing presentations for two upcoming conferences, falling asleep while watching a campy vintage horror film, and listening to the acoustic version of The Life Of A Showgirl on repeat coalesced into a blog post on balanced assessment and the future of state-supported student assessment.