School & District Management Opinion

We Must Raise the Bar for Evidence in Education

By Carly Robinson & Todd Rogers — October 30, 2019 5 min read

Those looking for what works in education will find no shortage of advice. Educators, hoping to improve student outcomes, eagerly embrace recommendations telling them to “Cater to each child’s learning style!” and “Give students awards for positive behaviors!” But many such intuitive, popular “best practices” may not, in fact, be what is best for students, even though their proponents stamp them “evidence based.”

Educators who prioritize evidence-based practices are fighting an uphill battle; the standards of proof for what constitutes “evidence” in schools—and education more widely—are often exceedingly low. For instance, the popular notions that we should be teaching to students’ learning styles or providing students with attendance awards were both rooted in observational evidence. Both practices have now been debunked, but over 75 percent of educators still endorse learning styles, and many schools say they use awards to recognize excellent student attendance.

We should not be surprised when people hold incorrect notions about what research says works: education research is littered with published papers on a range of practical topics that either fail to replicate previous findings or report massively inflated effects.

Educational policymakers and practitioners need to understand how study designs and research practices influence the reproducibility and credibility of a study’s findings. This is easier said than done, but there are a couple of initial indicators that suggest a research finding is “real” and worth implementing.

First, to disentangle whether a practice causes improvement or is merely associated with it, we need research methods that can reliably identify causal relationships. The best way to determine whether a practice causes an outcome is to conduct a randomized controlled trial (or "RCT"), in which participants are randomly assigned either to be exposed to the practice under study or not to be exposed to it.
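To make the mechanics concrete, here is a minimal sketch (not drawn from any of the studies discussed here) of how simple random assignment splits a roster into treatment and control groups; the student names and seed are invented for illustration:

```python
import random

def randomize(participants, seed=0):
    """Randomly split participants into treatment and control arms."""
    rng = random.Random(seed)      # fixed seed so the assignment is reproducible
    shuffled = participants[:]     # copy so the original roster is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # (treatment, control)

students = [f"student_{i}" for i in range(10)]
treatment, control = randomize(students)
```

Because chance alone decides who lands in each group, any systematic difference in outcomes afterward can be attributed to the practice being tested rather than to preexisting differences between the groups.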


Second, policymakers and practitioners evaluating research studies should have more confidence in studies where the same findings have been observed multiple times in different settings with large samples. Many educational practices are based on single research studies with small sample sizes. We can learn from small, one-off studies. But when it comes to adopting practices, we recommend those that have been evaluated in studies with large sample sizes and reproducible results.

Finally, we can have much more faith in a study’s findings when they are preregistered. That is, researchers publicly post exactly what their hypotheses are and exactly how they will evaluate each one before they have examined their data. This helps limit flexible research practices, making it less likely that researchers will find statistically significant results by chance.

Some will lament that large RCTs are too expensive, slow, or difficult to implement, and that preregistering studies is not feasible because educational research is messy and unpredictable. Yet several studies from the past few years show that large-scale, reproducible, preregistered RCTs of educational practices are feasible and can inform work on the ground.

In our own work, we have spent the last six years studying how to reduce student absenteeism. Through two large-scale, preregistered RCTs (one with more than 28,000 K-12 students and another with almost 11,000 K-5 students), our research team found that sending mailings to parents several times over the course of the school year with personalized attendance information that dynamically targets key parental misbeliefs consistently reduces chronic absenteeism 10 percent to 15 percent. This research led to the creation of InClassToday, a program that partners with districts around the country to help them reduce student absenteeism by implementing this research-backed intervention.

Another practice that educators can—and, strong evidence suggests, should—take up comes from a study conducted by Peter Bergman and Eric Chan. They randomly assigned parents of more than 1,000 middle and high school students in 22 schools to receive automated, frequent information via text message about their child’s missed assignments and grades. The study, which has now been replicated multiple times, found that this strategy led to a 28 percent reduction in course failures, a 12 percent increase in specific class attendance, and increased student retention by 1.5 percentage points. Providing parents with information that helps them monitor their child’s academic progress can have meaningful impacts on student success.

Finally, a study of learning mindsets led by David Yeager explored for whom growth mindset interventions are most effective. In a nationally representative sample of more than 12,000 students, the study found that adolescents assigned to complete a "growth mindset" intervention—which taught that intellectual abilities can be developed—earned higher GPAs (a modest but real 0.05 grade points) in core classes at the end of 9th grade. The authors preregistered their prediction that the intervention would most help low-achieving students. Consistent with this, low-achieving students showed larger effects, with higher GPAs (0.10 grade points) in core classes at the end of 9th grade, and were 11 percentage points less likely to get a D or F average in one of these classes. This large-scale, preregistered study replicating prior findings provides evidence that mindset interventions can improve outcomes for struggling students.

Each of these relatively low-cost, easy-to-implement interventions has a modest but real impact on student outcomes. Holding educational research to greater standards of evidence will very likely mean the effect sizes that are reported will be smaller. But they will reflect reality.

Expectations about how much impact interventions tend to have need to be massively recalibrated since more-rigorous research with larger sample sizes tends to find smaller effect sizes. If an educational intervention’s outcomes seem too good to be true, they probably are. (A recent working paper by Matt Kraft of Brown University provides a helpful overview of how we might think about effect sizes in education.)
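For readers who want a concrete handle on "effect size," one common metric is Cohen's d, the standardized mean difference between treatment and control groups. The sketch below is purely illustrative; the GPA figures are invented, not from any study cited here:

```python
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled standard deviation."""
    mt, mc = statistics.mean(treatment), statistics.mean(control)
    vt, vc = statistics.variance(treatment), statistics.variance(control)
    nt, nc = len(treatment), len(control)
    # Pooled SD weights each group's sample variance by its degrees of freedom.
    pooled_sd = (((nt - 1) * vt + (nc - 1) * vc) / (nt + nc - 2)) ** 0.5
    return (mt - mc) / pooled_sd

# Hypothetical end-of-year GPAs for two small groups.
d = cohens_d([3.0, 3.2, 3.4], [2.9, 3.1, 3.3])
```

In well-powered education RCTs, values of d in the 0.05 to 0.20 range are common, which is part of why rigorous studies tend to report smaller effects than the eye-popping numbers from weaker designs.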

There do not appear to be single “silver bullets” that will easily equalize and accelerate educational outcomes. The reality is that educational gains will come from a combination of many well-supported, evidence-based practices and communities of caring adults helping kids.

We hope educational policymakers and practitioners will start proactively looking for practices that meet these standards of evidence and work to widely implement them. By doing so, we can move steadily toward greater educational success for all students.

A version of this article appeared in the October 30, 2019 edition of Education Week as Raising the Bar for Evidence in Education
