Opinion

Data Are Critical for High-Mobility Students

By Jennifer Bell & Nadja Young — August 10, 2016 5 min read

The Every Student Succeeds Act requires states, for the first time, to measure and report on the academic performance of homeless and foster children, as well as those from military families.

Providing student-growth measures for these vulnerable subgroups will give states and districts a clearer picture of how—or whether—the needs of these students are being met. As states and districts plan how to incorporate these data into their accountability systems, they must also understand how to mitigate the unique challenges of measuring the academic growth of these students.

Homeless, foster, and military-connected subgroups have higher rates of student mobility, more missing test scores, and smaller sample sizes than many other subgroups, all of which can hinder efforts to measure their academic growth.

Students connected to the active-duty military, for instance, move three times more frequently than their civilian counterparts, according to the Military Child Education Coalition. In addition, high percentages of homeless and foster students experience frequent school changes, often moving from one district to another.

These disruptive transitions can lead to lost testing data. Many states, including Arkansas, Delaware, and Kentucky, have expanded their statewide student-information systems over the past decade and now have the ability to share data on students who move across district lines. (Privacy laws, however, still stymie efforts to track student data across state lines.) While sharing data between districts should mitigate the loss of existing testing data, students in these subgroups are also more likely to miss tests in the first place.

Many state student-growth models can’t incorporate students who are missing recent test scores, because those models focus on a change in student achievement, in a single subject, only from one year to the next. States and districts attempting to use these simplistic growth models will struggle to generate information on highly mobile subgroups. How do we make sure data shine a light on how these potentially at-risk students are being served?

Sophisticated growth models, such as those used in Tennessee and Pennsylvania, can include more of these students, even those missing test scores from the previous year. Both states have used Education Value-Added Assessment System (EVAAS) models for many years and have a rich history of using data for both reflecting on instructional practices and improving student outcomes.

By including additional prior testing data—across different subjects, grades, and assessments—advanced growth models provide a more accurate understanding of students’ knowledge and skills when they enter the classroom. This approach gives teachers better information on how to work with those students and provides a clearer baseline from which to measure growth in the current year.

Another challenge in collecting good data is that homeless, foster, and military-connected student subgroups represent a small percentage of the overall school population. For instance, 15 states have fewer than 5,000 homeless students. With a smaller subgroup of students, it is more difficult to produce meaningful growth measurements, given the inherent statistical limitations of small samples.

The American Statistical Association recommends that estimates from student-growth models be presented alongside information on the precision and limitations of the model used. This is an especially important reminder when faced with small subgroups, as smaller samples have more built-in error. Adopting a model that includes the standard error around a group’s growth measure can mitigate that problem, by essentially telling users how confident they should be in the measure.
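To illustrate why small subgroups carry more built-in error, here is a minimal sketch, assuming (hypothetically) that a subgroup's growth measure is the simple mean of its students' score gains. The standard error of that mean shrinks with the square root of the subgroup's size, so the same average gain is far less certain for five students than for a hundred:

```python
import math
import statistics

def growth_with_error(gains):
    """Return a subgroup's mean growth and the standard error of that mean.

    The standard error equals the sample standard deviation divided by the
    square root of the sample size, so small subgroups carry far more
    uncertainty around the same average gain.
    """
    mean_gain = statistics.mean(gains)
    std_error = statistics.stdev(gains) / math.sqrt(len(gains))
    return mean_gain, std_error

# Same average gain and similar spread, very different certainty:
small_group = [2.0, 5.0, -1.0, 4.0, 0.0]   # 5 students
large_group = small_group * 20             # 100 students

print(growth_with_error(small_group))
print(growth_with_error(large_group))
```

Both groups show an average gain of 2.0 points, but the five-student group's standard error is several times larger, which is exactly the context the American Statistical Association asks growth-model reports to carry.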

In its notice of proposed rulemaking under ESSA, the U.S. Department of Education allows states to set their own minimum subgroup sizes, but requires federal approval for any minimum greater than 30 students, to make sure the performance of small groups is still captured. As states consider different growth measures for their accountability systems and school report cards, they must also account for the limitations of small-group measurement. Incorporating standard error adds critical context and protects schools against misclassification.

Some states use student-growth measures to classify schools into different categories, such as letter grades, star ratings, and schools “in need of improvement.” The standard error indicates how confident we can be in concluding whether the growth measure meets, exceeds, or falls short of the growth expectation. Only when there is enough evidence is a growth measure categorized into something other than “meeting expectations.”
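That evidence rule can be sketched in a few lines. This is an illustration with hypothetical names and a conventional two-standard-error threshold, not the actual EVAAS or state rules, whose cutoffs may differ:

```python
def classify_growth(measure, expectation, std_error, n_se=2.0):
    """Classify a school's growth measure against its growth expectation.

    The school is moved out of "meeting expectations" only when the
    measure differs from the expectation by more than n_se standard
    errors, i.e., only when the evidence outweighs the measurement error.
    """
    if measure - expectation > n_se * std_error:
        return "exceeds expectations"
    if expectation - measure > n_se * std_error:
        return "falls short of expectations"
    return "meeting expectations"

# A small school with a large standard error stays at "meeting
# expectations" even though its raw measure looks low:
print(classify_growth(measure=-3.0, expectation=0.0, std_error=2.0))

# A large school with the same measure but a smaller standard error
# provides enough evidence to be classified as falling short:
print(classify_growth(measure=-3.0, expectation=0.0, std_error=1.0))
```

The point of the sketch: the raw measure alone would label both schools the same way; the standard error is what keeps the small school from being misclassified.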

The data challenges of small, mobile subgroups are not insurmountable; if we conquer them, we can do more than simply meet new ESSA requirements. ESSA prompts states to design accountability systems that look back on how they served students in the previous year. More advanced models also look forward, toward how to better serve these often-overlooked subgroups in the years ahead. They incorporate predictive analytics, which project student performance on future state assessments, as well as on Advanced Placement and other college-readiness exams.

With projections and early-warning indicators, teachers and schools can see a student’s trajectory and more proactively implement remediation, intervention, and enrichment strategies that foster academic improvement. Better still, they can accomplish this with the same underlying standardized-test data required by ESSA.

As states and districts redesign school accountability systems, student-growth measures remain a valuable indicator of school quality. But let’s use all the data we have to meet the distinct needs of homeless, foster, and military-connected students. Where possible, let’s examine these vulnerable groups individually. And let’s not remove a child from an analysis because he or she is missing a test score. All kids count, so let’s count all kids.
