A new review by an independent panel of experts concludes that the U.S. Department of Education’s much-criticized What Works Clearinghouse is doing a “reasonable job” of reviewing and rating the research evidence on the effectiveness of programs and practices in education.
Created in 2002 by the Institute of Education Sciences, the department’s primary research arm, the clearinghouse has come under fire from policymakers, researchers, and practitioners, who question its usefulness and methods. Some have dubbed it the “nothing works” clearinghouse because of the limited number of programs and studies that meet its strict screening standards.
But in its report, which was posted online Nov. 19, the six-member panel contends that the clearinghouse’s review standards are “well documented” and “reasonable.” The study further characterizes the reports that the clearinghouse produces as “succinct and meaningful.”
While noting that their review focused narrowly on whether the clearinghouse makes scientifically valid judgments, the panelists also call for a fuller look at the enterprise's entire mission.
Both Grover J. “Russ” Whitehurst, the outgoing director of the IES, and Robert C. Granger, who chairs the national board that advises the research agency, called the panel’s findings reassuring.
“In a marketplace that is unsophisticated with regard to research quality, there has to be an entity that uses rigorous standards to vet research on education program effectiveness for practitioners and policymakers,” Mr. Whitehurst wrote in a letter accompanying the report.
Mixed Reactions
Members of the research community offered mixed reactions.
“For me, the key question is how usable is the information that the clearinghouse produces,” said James W. Kohlmoos, the president of the Knowledge Alliance, a Washington group that represents research organizations. “We haven’t answered that yet.”
Robert E. Slavin, a researcher and co-founder of the Baltimore-based Success for All Foundation, concurred. “The panel acknowledged that it was given too little time and too narrow a mandate to adequately evaluate the WWC,” Mr. Slavin, who has been an outspoken critic of the clearinghouse, said in an e-mail.
Begun in late July, the review was commissioned by the National Board for Education Sciences, the presidentially appointed panel that advises the IES.
Mr. Granger, the board’s chairman, said the House Appropriations panel that oversees education programs called for a more comprehensive investigation by the U.S. Government Accountability Office, the watchdog agency for Congress. On the Senate side, though, appropriators requested a more focused look at the scientific validity of the clearinghouse’s review procedures.
“There was just not enough time with the current board to take on a broader review,” Mr. Granger said, noting that the terms of five members of the board, his own included, are due to end later this month. While replacement members have been named, their nominations are not likely to be approved by the Senate before the new administration takes control of the White House in January. Currently, only 11 of the 15 slots on the board are filled.
“The focus is narrow by design,” added Mr. Granger, who is the president of the William T. Grant Foundation in New York City. “It is the crux of the matter. If you can’t trust the information that’s on the What Works site, then you can’t trust anything else it does.”
Changes Recommended
Mr. Granger, working with staff members at the IES, helped select the review-panel members, most of whom are experts at synthesizing research findings in fields other than education.
The panelists are: C. Hendricks Brown, a biostatistician at the University of South Florida in Tampa; David Card, an economist at the University of California, Berkeley; Kay Dickersin, an epidemiologist at Johns Hopkins University in Baltimore; Joel B. Greenhouse, a biostatistician at Carnegie Mellon University in Pittsburgh; Jeffrey R. Kling, an economist at the Brookings Institution, a think tank in Washington; and Julia H. Littell, a professor of social work and social research at Bryn Mawr College in Bryn Mawr, Pa.
Besides calling for a more comprehensive study of the clearinghouse, the panel urged the research agency to set up a process for regular reviews of the clearinghouse's standards, an action that Mr. Whitehurst said the IES would undertake.
Among its other recommendations, the panel called on the clearinghouse to: review its standards regarding the attrition rates of subjects who take part in experiments; establish a formal process for tracking potential conflicts of interest in the studies it reviews, especially when program developers pay for studies of their own programs; and take another look at the standards it uses to account for cases when subjects fail to comply with the intervention being studied or when intervention practices cross over from the experimental to the control group.