Special Report
School & District Management

How to Find Evidence-Based Fixes for Schools That Fall Behind

By Sarah D. Sparks — September 27, 2016 | Corrected: September 28, 2016 6 min read

Corrected: A previous version of this article incorrectly spelled the name of Vivian Tseng, the vice president of the William T. Grant Foundation. In addition, the article has been changed to clarify that the Chiefs for Change group has created an ESSA working group of 15 experimentally minded state education leaders.
An earlier version of this article misidentified the organization that created the “State Guide to Evidence Use.” That resource was developed by the Florida Center for Reading Research at Florida State University.

The Every Student Succeeds Act gives states and districts significant flexibility in how they turn around struggling schools, as long as the local approaches are backed by evidence. But without support, that flexibility runs the risk of putting smaller or more rural districts at a disadvantage.

“This is a sea change from the highly prescriptive approach to school improvement [under the No Child Left Behind Act] to what can seem like a bit of a Wild West structure under ESSA,” said Mike Magee, the chief executive officer of Chiefs for Change, which has created an ESSA working group of 15 experimentally minded state education leaders. “We have potentially unprecedented flexibility in how states address school improvement—but that’s just another factor in how high the stakes are.”

As states work out how to apply ESSA’s new standards of evidence, their quest highlights the need for more research on interventions at schools with a wider array of contexts. The pool of high-quality research on education programs remains relatively small, sporadic, and focused on shorter-term gains for students.

Tying Things Together

While the federal What Works Clearinghouse has reviewed more than 10,000 studies on various interventions, a forthcoming meta-analysis based on the clearinghouse’s reviews found only 29 different interventions showed significant effects—and the average effect was small, particularly when the interventions were in messy real-school contexts instead of highly controlled laboratory settings.

“If you look from 10,000 feet at education interventions, you can almost count on your hand the number of interventions that have truly scaled and established” themselves, said Jerome D’Agostino, a professor of educational studies at Ohio State University, who led the study presented at the 2016 American Educational Research Association in Washington this April.

“It’s not just the sheer volume of programs; there hasn’t been a wider effort to tie these [intervention evaluations] together in any way,” D’Agostino said. “It’s not like some grand designer said, ‘Do we have enough interventions in reading, in math, in different grade levels?’ It’s field-generated. ... People have been focusing on their parts of the elephant, and I’m not sure there would be a whole elephant if you brought them all together.”

ESSA lays out three levels of evidence that states can choose to apply to prove an intervention works:

• “Strong evidence” includes at least one well-designed and -implemented experimental study, meaning a randomized controlled trial.

• “Moderate evidence” includes at least one well-designed and -implemented quasi-experimental study. For example, a program evaluation could use a regression-discontinuity analysis, in which researchers might look at differences in outcomes for students who scored a point above and below the entrance cutoff score for a particular program or intervention.

• “Promising evidence” includes at least one well-designed and -implemented correlational study that controls for selection bias, the potential differences between the types of students who choose to participate in a particular program and those who don’t.
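The regression-discontinuity approach described for the moderate-evidence tier can be illustrated with a short sketch. The data below are simulated for illustration only (an entrance-exam cutoff of 60 and a program that raises outcomes by about 5 points are assumptions, not figures from the article); the estimate compares predicted outcomes at the cutoff for students just above versus just below it.

```python
import numpy as np

def rd_estimate(scores, outcomes, cutoff, bandwidth):
    """Local-linear regression-discontinuity estimate: fit a line on
    each side of the cutoff (within the bandwidth) and compare the
    predicted outcomes exactly at the cutoff."""
    scores = np.asarray(scores, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    near = np.abs(scores - cutoff) <= bandwidth
    x, y = scores[near] - cutoff, outcomes[near]
    above, below = x >= 0, x < 0  # admitted vs. just missed the cutoff
    fit_above = np.polyfit(x[above], y[above], 1)
    fit_below = np.polyfit(x[below], y[below], 1)
    return np.polyval(fit_above, 0.0) - np.polyval(fit_below, 0.0)

# Simulated students: those scoring at or above 60 enter the program,
# which adds roughly 5 points to their end-of-year outcome.
rng = np.random.default_rng(0)
entry = rng.uniform(40, 80, size=2000)
treated = entry >= 60
outcome = 0.5 * entry + 5.0 * treated + rng.normal(0, 2, size=2000)

effect = rd_estimate(entry, outcome, cutoff=60, bandwidth=5)
print(round(effect, 1))  # estimate close to the true effect of 5
```

Because students a point above and a point below the cutoff are assumed to be otherwise similar, the jump at the cutoff is attributed to the program rather than to pre-existing differences between participants and non-participants.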

In separate guidance, the Education Department explained that districts and states should work to use the most rigorous evidence available, drawing on intervention studies that are not only methodologically sound, but also reflect students and school types similar to those where the intervention would be used.

“The logic model before was, pick a good intervention, implement it, and you’ll get results,” said Vivian Tseng, the vice president of the William T. Grant Foundation, which has been studying research use in education. “Now, whatever you implement, there’s this idea of ongoing evaluation to see where it worked, where did it not work, and for whom. I would say the ongoing cycle of learning is needed for programs at any of these [evidence] tiers.”

Proof Requirements

Resources

As states and districts grapple with how to develop and use evidence for school improvement, several new resources are rolling out to help them:

Enhanced Find What Works

Who made it? U.S. Department of Education

When is it available? Now

What is it? The What Works Clearinghouse has evaluated more than 10,000 studies on different educational programs and interventions, but in the past, some educators have found the database difficult to use. The revamped database includes a search tool to allow educators to search for studies not just based on the topic and grade level, but also on the demographic characteristics of the students who used the intervention and whether the schools studied were urban or rural, among other things.

“One of the things we’ve heard from people is they really want information on the population and context where things were tested,” said Ruth C. Neild, the acting director of the Institute of Education Sciences, which runs the clearinghouse. IES has always collected contextual data from its study reviews, she said, but the new tool “frees a lot of the data we had but didn’t really have a way of displaying without overwhelming people.”

National Study on Research Use Among School and District Leaders

Who made it? National Center for Research in Policy and Practice

When is it available? Now

What is it? The center is producing a series of reports based on a nationally representative survey of 733 school and district leaders from 45 states and 485 districts. The group is reporting on how and when district and school leaders use evidence to make decisions and how states can provide better resources and supports to help them use research more effectively.

RCT-Yes

Who made it? U.S. Department of Education

When is it available? Now

What is it? The highest tier of evidence under the Every Student Succeeds Act includes randomized controlled trials, or RCTs, in which researchers randomly assign participants to use an intervention. In practice, RCTs can be expensive and lengthy to perform in educational settings. This free software helps districts perform small-scale experimental and quasi-experimental studies in the regular course of implementing a program.

For example, a superintendent may pilot a new math program at five of nine elementary schools and find higher math scores for students in the participating schools at the end of the year. The software could be used to help the superintendent understand whether the new math program or something else led to the student gains.
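The kind of design-based check such software performs can be sketched with a permutation test, using hypothetical school-level scores (the nine schools and their scores below are illustrative assumptions, not data from the article). If the pilot schools had been chosen at random, any five of the nine schools were equally likely to be the pilot group, so we can ask how often a random split would produce a gap as large as the one observed.

```python
import itertools
import statistics

# Hypothetical end-of-year mean math scores for nine elementary
# schools; the first five piloted the new math program.
pilot = [74.0, 71.5, 76.2, 73.8, 75.1]
others = [70.2, 69.8, 71.0, 70.5]

observed = statistics.mean(pilot) - statistics.mean(others)

# Permutation test: under the null hypothesis that the program had no
# effect, count how many of the 126 possible 5-school pilot groups
# would show a gap at least as large as the observed one.
scores = pilot + others
count = total = 0
for combo in itertools.combinations(range(len(scores)), 5):
    t = [scores[i] for i in combo]
    c = [scores[i] for i in range(len(scores)) if i not in combo]
    total += 1
    count += (statistics.mean(t) - statistics.mean(c)) >= observed
p_value = count / total
print(round(observed, 2), round(p_value, 3))
```

A small p-value suggests the gap is unlikely to be an accident of which schools happened to be in the pilot, though with only nine schools such a test cannot rule out other school-level differences that randomization across many more units would.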

State Guide to Evidence Use

Who made it? Florida Center for Reading Research at Florida State University

When is it available? Late 2016

What is it? The guide is a self-study walk-through for states to plan their own evidence standards and school improvement strategies, applying the ESSA levels of evidence. The lab plans to devise a similar guide for districts on how to apply their state evidence levels to district school improvement decisions.

“We want to give them a structure to do their planning; we’re not telling them which strategies to pick,” said John Hughes, an associate director of the Florida Center for Reading Research and the deputy director of the regional lab.

Results First Clearinghouse Database

Who made it? Pew-MacArthur Results First Initiative

When is it available? Now

What is it? This search tool aggregates results from several evidence databases, including those for child-welfare, juvenile-justice, mental-health, and social-services interventions. For administrators looking for nonacademic or community-related interventions, this tool can provide a broader array of interventions.
—S.D.S.

Source: Education Week

Those levels were developed in part from the proof required under the Obama administration’s Investing in Innovation competitive grants. It’s telling that only 43 projects met the grant’s moderate-evidence bar—and fewer than 10 have so far had the strong evidence required to win i3’s top-tier grant.

The few interventions that have established strong bases of evidence and use over time, such as Success for All and Reading Recovery, which D’Agostino evaluated for the federal Investing in Innovation program, created comprehensive infrastructure to implement programs in a wide variety of schools; trained and retrained staff that turned over; and sustained ongoing improvement and evaluation of the programs.

“When I meet with other [intervention programs], they are so far from even thinking and conceptualizing that need for infrastructure, I’ve come to the conclusion a lot of them will never get there,” he said.

It can take years of effort to build a strong evidence base for a program. One of those i3 grantees, the New Teacher Center, has been conducting multiple randomized controlled trials and quasi-experimental evaluations of its mentoring model since 2004, according to Ali Picucci, the center’s vice president of impact and improvement.

While the studies have produced promising results, “we know [randomized controlled trials] occur in controlled environments and are not ideal for addressing the social complexities that we find in classrooms,” Picucci said.

“Just as one size doesn’t fit all when it comes to clothes or educational initiatives, one study doesn’t fit all district and school contexts. It’s important for all of us to remember that interventions—even those backed by high-quality evidence—are beginnings and not ends,” said Ash Vasudeva, the vice president for strategic initiatives at the Carnegie Foundation for the Advancement of Teaching. “Simply selecting an evidence-based program does not ensure that similar results can be achieved in different settings and systems.”

Local Contexts Critical

In Cleveland, district officials are trying to integrate thinking about evidence into day-to-day decision making in schools.

The district has set up a website with a summary of every program available in the district, and is working to provide reviews on the effectiveness of each on key outcomes like reading and math achievement. The intervention “report cards” do include high-quality external studies if they are available, but also include the district’s own research on an intervention’s effects for schools that used it either a lot or just a little, and feedback from principals on how easy it was to use and how well it worked for them.

“We’re evaluating programs regardless of their external body of evidence, because context matters,” said Matthew A. Linick, executive director of research and evaluation for Cleveland public schools. “As a researcher, you know the more rigorous the study is, the less generalizable it becomes. While things that work in urban districts could be helpful, Cleveland’s urban context could be very different. Just because something works in Cincinnati doesn’t mean it will work in Cleveland.”

Both the U.S. Department of Education and nationwide groups like the Council of Chief State School Officers and Chiefs for Change have set up support networks for state officials to work together to identify evidence for what interventions will work in different school contexts. And the Institute of Education Sciences, the department’s research arm, is working to allow researchers and educators to search research for core elements of different programs and the effects of an intervention on specific populations.

“One of the things we’ve heard from people is they really want to see the context” of the school where an intervention was done, said Ruth C. Neild, the IES acting director. “Almost regardless of whether impacts are different for different [student] groups, ... a lot of times people need to see the intervention was done in their particular context in order to believe it.”

A version of this article appeared in the September 28, 2016 edition of Education Week as Finding Evidence-Based Fixes for Schools
