With the third pandemic-affected school year just ended, it is tempting to want things to return to normal in the fall. Many teachers, administrators, and families are spent from the ever-changing circumstances of their lives and classrooms since March 2020. They’ve witnessed firsthand the many impacts the pandemic has had on students’ well-being and academic performance.
Normalcy, understandably, sounds great. But returning to “normal” will not be enough if we hope for academic recovery from the pandemic. Fortunately, districts already have access to unprecedented federal ESSER funding, which provides an opportunity to invest in academic interventions and initiatives that get students the extra support they need.
To gauge how much support students need to catch up, we need to know just how far behind they are relative to a typical year. We, along with colleagues from Harvard University and Dartmouth College, recently released two reports that take a closer look at student achievement by exploring changes in student test scores before and during the pandemic. Both rely on data from more than 2 million students across nearly every state and more than 10,000 school districts. The result is arguably the most comprehensive picture of the state of student learning last fall.
Of course, test scores are only one measure of the impact of the pandemic on students, and the pandemic had myriad impacts on students’ lives. But ignoring the effects on test-based measures of learning would be a mistake. Even with their limitations, test scores are our best bet for understanding the big-picture effect of the pandemic on student achievement across the country. There’s also abundant evidence that standardized-test scores are strongly related to knowledge, skills, and later opportunities in life and work. Test scores may not tell us everything, but they are highly predictive of things that matter.
Our findings confirm much of what was already known: Student-test-score levels and academic growth suffered during the pandemic, especially in math, and especially for students of color and those attending high-poverty schools. But we offer two new important considerations as we move forward with recovery efforts.
First, we translate the amount of academic recovery students need into readily interpretable and actionable weeks of instruction. Across the country, students are an average of 12 weeks—or a third of the typical 36-week school year—behind where they would typically be in math. For context, this average catch-up needed is larger than even the learning disruption documented for students in areas affected by Hurricane Katrina. A loss of this magnitude suggests most students are going to need substantial additional and accelerated learning opportunities over the next several years.
Second, we show that the pandemic’s impact on test scores was not equal. It varied across students, contexts, and whether students were learning in person or remotely. Students whose 2020-21 school year was primarily remote, students in high-poverty districts, and students of color suffered the most. In high-poverty districts that operated remotely for more than half of 2020-21, the catch-up needed in math is equivalent to 22 weeks of instruction, roughly half a school year.
But factors like poverty and remote status don’t tell the whole story—some districts that were similar on these dimensions had very different outcomes. Nearly 90 percent of districts experienced lower-than-expected achievement, but not all districts did.
Districts serving lower-achieving students, who would already be expected to have lower achievement in fall 2021, tended to be further behind. But in some cases, districts with similar pre-pandemic achievement, enrollments, student demographics, income levels, and amounts of remote instruction in 2020-21 saw different results.
Take two districts we identified that are similar on all those factors. Students in one were about two weeks behind in math—what would have been expected in a pre-pandemic year. Students in the other district were about 14 weeks behind.
We also found that while the national narrative about students being further behind in math than reading is generally correct, it is not universal. There are districts where reading scores took a harder hit than math. One big takeaway: Districts need to carefully assess the status of student learning across the board and not assume that the national trends reflect local student needs.
As school and district leaders finalize their investments in interventions for next year, we urge them to keep this variation in mind by using their local data to identify the students and subjects most affected. This will allow a more accurate estimate of the scale of recovery learning opportunities students will need to catch up. In many districts, interventions confined to school or the school day almost certainly will not be commensurate with students’ needs.
Key to recovery efforts will be district leaders’ clear communication with their school communities about the urgency and scale of the challenges students are facing. Leaders will in many cases need to rally political support for recovery initiatives. And, in turn, communities will need to offer school and district leaders the flexibility and grace to monitor, adjust, and learn as the recovery process unfolds.
If students are on a sharp enough upward trajectory, that’s great. If not, education leaders should reassess their recovery plans, learn from bright spots and other districts, and try solutions that have a better chance of success.
Over the next three years that federal resources are available for recovery efforts, it is essential to monitor students’ progress relative to the needed recovery and report it transparently at the end of each school year. As a nation, we will need to learn from each other’s successes to give our children the best opportunities to catch up.