assessment, accountability, and other important stuff

Archive for July, 2019

Is this person college-and-career ready?

 

yes no

 

Charlie DePascale

Now that the administration has dropped efforts to include a citizenship question on the 2020 Census, perhaps there is space on the form for the proficiency question, “Is this person college-and-career ready?”  For persons 18 and under, the question would be, “Is this person on track to college-and-career readiness?”

Think about it. We ask the question in April 2020, and by December 31st we have a national count of the number of college-and-career ready residents in the United States. By March 31, 2021, we have state-level counts disaggregated by race, ethnicity, and other key demographic factors. In about the same amount of time that it took to produce the 2017 NAEP Reading and Mathematics results, we would have proficiency information for the entire U.S. resident population instead of the small portion of the population captured by that ill-defined social construct, grade level.

The federal government could then make decisions about how much money to allocate to programs designed to improve college-and-career readiness and how best to distribute that funding across the states (just as they do with other information collected through the Census). States could begin to redesign their early childhood, K-12, postsecondary, and adult education programs to better meet the needs of their residents (just as they do with other information collected through the Census).

I bet you are thinking, "But Charlie, just how accurate could that self-reported information possibly be?" Well, you see, accuracy is a funny concept; it's one of those eye of the beholder, depends on what the meaning of "is" is type of things. Would a U.S. Census count of college-and-career readiness be any more or less accurate than the differences we have seen in proficiency estimates among the 50 states, or between states and NAEP? Would the actions triggered by a U.S. Census count of college-and-career readiness be any more or less appropriate than the actions resulting from the wide variations across states in the percentage of schools identified for support and improvement under ESSA accountability systems?

Or perhaps you are thinking, "But Dr. DePascale – psychometrician – a single Census question on college-and-career readiness is not measurement. Where are the 'big 5' sources of validity evidence? Where are the external alignment studies? Where is the USED Peer Review?"

All valid points, but here's the thing: federal and state assessment policy has never been about measurement. As I have argued in previous posts, determining the percentage of students in a state who have met minimal competency standards, attained proficiency, are on track to college-and-career readiness, or have made progress from fall to spring is now, always has been, and always will be, at its core, a data collection problem, not a measurement problem.

At one time, the most efficient and accurate way to solve that data collection problem was with a large-scale state assessment; that is, with a short, on-demand, machine-scored test administered to students in the general education program at selected grade levels. But that time was a long time ago. Policies and laws on inclusion changed. The student population became more diverse. Content and performance standards became more rigorous and complex.

Thought experiment: Imagine you have placed a group of experts in a room (or even a group of testing company psychometricians) and tasked them with coming up with the most efficient and effective way to determine the number or percentage of students in a state who are on track to college-and-career readiness, or the number and percentage of high school graduates who are college-and-career ready. If you don't like a group of experts, you can crowd-source the task or use artificial intelligence to solve it.

Whatever approach you take, it is highly unlikely that the solution generated will be a single, on-demand, end-of-year state assessment. If you expand the task to determining the number for the country rather than a single state, I guarantee that the solution will not be 40-50 unique state assessments.

The solution may include a limited amount of state and federal assessment (e.g., something like NAEP), but it is virtually certain that the solution will be centered more on quality data collection than on high-quality assessment. And if we are looking for a data collection solution, what better place to begin than the U.S. Census Bureau? Their self-described mission is "to serve as the nation's leading provider of quality data about its people" with the goal "to provide the best mix of timeliness, relevancy, quality and cost for the data we collect and services we provide." Does any state department of education assessment program or any testing company claim the same mission and goal? Would we want them to?

Where would we begin?

So, with the proficiency question on the 2020 Census, where would we begin to ensure the most accurate count possible? The first step would probably be to develop a common definition of college-and-career readiness that we want people to use when answering the question. The next step might be a public education campaign to get the public on board with the importance of collecting the information. That campaign undoubtedly would include clear descriptions and real-life examples of college-and-career readiness, or of being on track to college-and-career readiness – descriptions that people can easily grasp and apply to themselves and the people in their home.

Now you may be asking yourself, aren't those the same things we should do when introducing a new set of content standards or a new assessment program? The answer, of course, is yes; but often those steps are forgotten or given insufficient attention and resources when the focus is on building a better assessment or accountability system rather than on collecting better data.

There have been efforts at such public relations campaigns in the past, and they have been somewhat successful.  When the MCAS tests and new performance standards were introduced in Massachusetts in the late 1990s, “What Does Proficient Look Like” workshops were held in communities across the state and “Test Yourself” brochures were distributed at toll booths, grocery stores, and public libraries.  When the Common Core State Standards were introduced, it was impossible to watch a professional golf tournament on network television without seeing a “Support the Common Core” commercial sponsored by EXXON or some other major corporation (yes, that sentence was intentionally Bidenesque).

Massachusetts no longer has toll booths, people buy groceries online, and public libraries are being repurposed to meet the changing needs of communities. Women's soccer matches may be a better option than professional golf tournaments for spending advertising dollars (at least every four years). Yes, the medium will change, but the message and the need for the message remain the same.

We can develop the best large-scale assessment ever imagined; but at the end of the day and at the end of the school year, if every teacher, parent, and student cannot give an accurate answer to the question "Is this person on track to college-and-career readiness?" without looking at a score on a state assessment, what have we really accomplished?