Nearly a year ago, a group of education scholars decided to form a professional society focused on advancing scientifically rigorous studies that could yield definitive answers about what works in education.
Since then, the fledgling group, known as the Society for Research on Educational Effectiveness, has attracted 250 dues-paying members from a range of academic disciplines and research traditions. Last week, as the society kicked off its inaugural conference here, one of the central questions its members faced was how big to make its tent.
“After listening to two days of talks, I fear the education research community has the same problem that’s been attributed to math education—that it’s a ‘mile wide and an inch deep,’ ” said Judith D. Singer, a statistician from Harvard University’s graduate school of education and a member of the society’s advisory board.
“We all have chosen a particular piece of the landscape,” she continued, “but I find myself saying, ‘Where should we place our beliefs?’ ”
Supported by a three-year, $760,000 grant from the U.S. Department of Education’s Institute of Education Sciences, the society is largely an offshoot of the 18,000-member American Educational Research Association, based in Washington. (“Review Process for U.S. Education Research Approved,” Feb. 1, 2006.)
The society’s mission is to provide a common forum for scholars interested in research that probes “cause-and-effect relationships” in education. The gold standard for assessing cause and effect is widely considered to be the randomized controlled trial—a study in which students are assigned by chance to either a control group or an experimental group.
Though widely used in medicine, such studies are far rarer, and often controversial, in education research—and typically are a tiny fraction of the studies presented at annual AERA gatherings.
Those attending last week’s conference, though, heard presentations on a variety of research methodologies, including both randomized and nonrandomized research designs. The standing-room-only event drew 135 participants, and organizers turned away dozens of others.
From the public-health field, researchers presented findings from randomized studies on programs that have succeeded in promoting children’s social and character development and improving classroom management.
And a cognitive scientist discussed a study on science education that involved small numbers of children working with springs and ramps in laboratory settings—a contrast to the kind of randomized studies that encompass large numbers of classrooms, schools, or districts.
Depth vs. Breadth
Mark A. Constas, a co-chairman of the group, said the discussion of alternative research methodologies had been largely unplanned. The experts who were invited to speak decided on their own to focus on topics such as “regression discontinuity analyses” and other nonexperimental research approaches.
Yet the discussion on various research approaches was a “notable achievement” for the society “because the criticism out there about us is that we are very much confined methodologically,” said Mr. Constas, an associate professor of education at Cornell University in Ithaca, N.Y., where the society is based.
The input from other disciplines such as public health, meanwhile, had been planned in advance, he said.
The organization invited the public-health researchers in part so that members could learn from that field’s longer track record in school-based experiments, said Mr. Constas, who co-founded the group with Larry V. Hedges, a professor of statistics and policy research at Northwestern University in Evanston, Ill.
Mr. Constas said the society also plans to build bridges to education experts who specialize in translating research into practice.
“Always, the end goal is getting toward the kind of research that supports cause-and-effect inferences,” he added. “The most successful scientific societies are noted for their depth rather than for their breadth.”
The society’s upcoming projects include a research journal slated to start in January 2008; an online publication, due out in the fall of next year, that is aimed at policymakers and practitioners; and a handbook of research on educational effectiveness. The society also plans to launch efforts to train budding education researchers in the skills they need to conduct experimental work.
In an address at the conference, Grover J. “Russ” Whitehurst, the director of the Institute of Education Sciences, which has led the Education Department’s ongoing campaign to transform education into an evidence-based field, urged the society to take on yet another mission: advocacy.
“I suggest as you become a little more mature as an organization that you think about adding to your mission public-policy advocacy for the type of research that you are committed to doing as individuals,” he said. “Frankly, it’s lonely out there.”