How do you innovate in large-scale state testing while continuing to meet federal assessment and accountability requirements?
That, essentially, was the question addressed by the Center for Assessment team and state leaders in the final session of the Center’s virtual Reidy Interactive Lecture Series (RILS). Specifically, the session featured interviews with leaders in states currently designing, developing, and implementing pilot state testing programs under the Innovative Assessment Demonstration Authority (IADA) provision of the Every Student Succeeds Act (ESSA). While listening to the state leaders describe their struggles to serve two masters (i.e., the Feds and their desire for innovation), my mind wandered to my own experiences and a lesson learned.
That lesson, as presented in the passage above, excerpted from Matthew 22, is to “repay to Caesar what belongs to Caesar and to God what belongs to God.” The same lesson, as I saw it applied deftly by Rich Hill on multiple occasions over the course of my career, is to figure out a way to give the Feds what they require. Nothing more, nothing less. When you have done that, then you gotta go where you wanna go, do what you wanna do, with whoever you wanna do it with.
Accomplishing that task often requires more innovative thinking than the desired innovation to the assessment program itself.
Give them what they want, what they really, really want
Even a cursory reading of the IADA description and requirements makes it clear that the Feds are interested in a very specific type of innovation; that is, innovation in which the state figures out a different way to accomplish the same task performed by the current assessment. In this case, the task is to determine student proficiency on the state standards in English language arts and mathematics. The idea behind IADA is that a state may come up with an innovative way to do that more efficiently and/or effectively – faster, cheaper, better. [Aside: Yes, doing it more effectively or better could result in comparability issues, but not the kind of comparability issues that normal people or the Feds worry about.]
The IADA is clearly not about innovation designed to accomplish a different task.
Not that there’s anything wrong with that. Some of the best innovations are products or processes whose primary function was to enable us to accomplish a task more effectively or efficiently.
But what if a state wants their innovative assessment to do something new or different from their current assessment – measure 21st century skills, say, or assess an interdisciplinary content area that combines English language arts and social studies? How do they satisfy the Feds while building their innovative system?
Open During Construction
We have all seen some version of the sign and are familiar with the concept of “Open During Construction” and all that it implies:
- Delays and alternating lane closures for three years as they shore up the bridge on I-95 between New Hampshire and Maine.
- A limited menu at your favorite fine dining establishment (or perhaps only their drive-through window is open).
- The exit that has become the entrance and temporary checkout lines at the grocery store. Where are they hiding the potato chips this week?
- At home, handling food prep and storage during a two- to three-month kitchen renovation.
The same principles apply to maintaining your current assessment system (or at least enough of it to please the Feds) while designing, developing, and implementing a new, innovative assessment system.
Pardon our dust. We’re innovating to improve your assessment experience!
What Will It Look Like When the Project Is Complete?
A big part of deciding what the project will look like during construction is how things will look when the project is complete. Above, I provided the example in which the renovated bridge will occupy the same space as the current bridge.
On just my local commute, however, I can think of three cases in the past ten years where a new bridge was built adjacent to the existing bridge. In those cases, the existing road entering and exiting the bridge had some new and interesting bends added to connect it to the new bridge but was unaffected for the bulk of the construction process.
The process of building an innovative assessment program adjacent to your current program – keeping the current program largely unaffected until you are almost ready to make the switch – might be similar to what districts and schools in states not named Massachusetts experienced during the transition to Smarter Balanced and PARCC. It got a little messy during field testing in 2013-2014, but the existing and innovative assessment programs were largely independent during most of the process. [Note that the experience of the state DOE staff and educators directly involved in the consortia building those innovative programs would have been different from the experience of the districts, schools, and students.]
The example of the restaurant offering a limited menu during construction might be the more appropriate comp for a state trying to satisfy the Feds while developing an innovative assessment program. Offer a shortened version of the current assessment or select those 3-5 breakfast and lunch items (power content standards) needed to maintain the customer experience and fulfill your needs (determining student proficiency).
Or perhaps you will be administering the microwave version of your assessment or buying takeout for a short time while your innovative assessment is being installed.
Innovation, Like Construction, Is Messy (and a pain in the butt)
I am not trying to be glib or gloss over the challenges, difficulties, and bumps in the road that will be encountered in trying to maintain an existing assessment program while building and implementing its replacement.
States, understandably, want to avoid asking (or requiring) schools to double test students, but the reality is that some double testing is inevitable. The good news, however, is that it’s highly likely that most innovative assessment programs being considered by states are part of the trend of moving state assessment from external on-demand tests to more curriculum-embedded, classroom assessment approaches. It should not be as difficult as one might fear to find districts, schools, and even teachers willing to participate in those efforts.
Then there is the task of convincing the Feds that you are still meeting all of the requirements of the law and the Peer Review standards. The good news there is that there are some people who live for making those arguments and presenting those types of cases to the Feds. The keys in that process are a) giving the Feds a reason to say yes, and b) not putting them in a position where they are forced to say no. The former requires presenting clear evidence to support your argument. The latter requires not setting out to explicitly violate both the letter and spirit of the law (e.g., not testing at certain grade levels, not reporting student proficiency in English language arts).
The final key is something that I learned during my first trip to Baton Rouge to work with the LA DOE on their assessment and accountability program.
Don’t accept a “No” answer from someone not authorized to say “Yes”!
That advice came from an interview with a travel advisor I heard on the radio at my hotel, but it applies just as well to dealing with the Feds about innovative assessment. The other piece of advice I recall from that interview – remove the rarely laundered comforter or quilt from the bed as soon as you enter the room. Not quite as relevant here, but still good advice.