In an effort to make its services more timely and relevant, the federally supported What Works Clearinghouse has launched a series of “quick reviews” to assess the methodological soundness of studies that have made national news.
“There are all kinds of studies coming out daily that are cutting-edge,” said Mark Dynarski, who directs the clearinghouse, which is based at Mathematica Policy Research Inc. in Princeton, N.J. “The idea is whether the clearinghouse can help the public in determining whether studies are well crafted, if they use sound methods and make good inferences.”
Created in 2002 by the U.S. Department of Education’s Institute of Education Sciences, the online clearinghouse vets the research base for programs, practices, or policies that educators might try out in their own jurisdictions. But the first reviews were slow in coming, and few studies met the clearinghouse’s strict evidence standards, leading some critics to dub the project the “nothing works” clearinghouse.
The clearinghouse plans to post the new quick reviews, in contrast, within a month of a study’s public release. The turnaround time is faster because reviewers are evaluating a single study, rather than a body of research.
The first five reviews, posted online starting late last month, give mixed assessments of recent studies on after-school programs, a drug-testing program for student-athletes, a cash-incentive program for teachers and students in Advanced Placement courses, a computer-based program for teaching algebra, and whether 6th graders are better off in middle schools or elementary schools.
Some of the reports are already drawing sharp criticism from study authors.
“This is all politics. I’m telling you from the bottom of my heart,” said Dr. Linn Goldberg, a professor of medicine at Oregon Health and Science University, in Portland. Clearinghouse reviewers determined that the study Dr. Goldberg led on school-based drug testing for student-athletes was “not consistent with WWC evidence standards,” the lowest of three possible ratings.
The other rating choices are “consistent with WWC evidence standards” and “consistent with WWC evidence standards with reservations.”
Tough Standards
In the study, published in November in the peer-reviewed Journal of Adolescent Health, Dr. Goldberg and his colleagues evaluated the effects of a drug-testing program in which high school student-athletes were randomly selected for unannounced tests for alcohol and drug use.
After two years, the researchers found, students taking part in the testing were no less likely than their control-group counterparts to be using prohibited substances. Dr. Goldberg said the findings, which were widely reported in the press in February, run counter to the Bush administration’s stance favoring such programs.
In their review, What Works Clearinghouse analysts said the findings were inconclusive, in part because of high attrition rates among the schools and students involved. Seven of the 18 schools that signed up for the study dropped out, according to the clearinghouse’s report. In addition, 29 percent of students in the drug-testing program and 19 percent of the control-group students failed to return final surveys.
However, Dr. Goldberg said that the schools withdrew, in response to a threatened lawsuit, before drug testing began in any of them. Final student data are missing, he added, because students either graduated or quit their teams the following year.
“It’s natural attrition,” he said, “and it’s much lower than we expected.”
For his part, Mr. Dynarski of the What Works Clearinghouse denied that the study was selected to serve as a political target. He said reviewers chose it because it was published within the time frame the clearinghouse had set for its first round of reviews. Although federal education officials can refer studies to the clearinghouse under the new “quick review” guidelines, none of the studies in the first round came to the clearinghouse by that route, he added.
“Attrition is a legitimate concern,” Mr. Dynarski said. “Clearinghouse standards pay a lot of attention to attrition, and here there is a 10-percentage-point difference between the control and experimental groups.”
The clearinghouse also gave a “not consistent” rating to an eight-state study of 35 high-quality after-school programs that was first reported in Education Week last November. (“High-Quality After-School Programs Tied to Test-Score Gains,” Nov. 28, 2007.) Deborah Lowe Vandell, the study’s lead author and the chairwoman of the education department at the University of California, Irvine, said the reviewers’ determination was not surprising, given the narrow range of study designs that pass the clearinghouse’s evidence screens.
“A clinical trial [one of the designs that pass muster] is an important standard, but it represents only one type of research evidence,” she said.
Clearinghouse reviewers gave a “consistent” rating to a randomized study that found positive academic effects for students using SimCalc MathWorlds, a software program for teaching algebra concepts. Two other studies—one on a Texas effort to give cash bonuses to students and teachers for high scores on AP tests and the other on the effects of teaching 6th graders in middle schools—were rated “consistent with WWC evidence standards with reservations.”