Two national education groups are launching a first-time effort to assess—and possibly even rank—the hundreds of doctoral programs that prepare education researchers.
The field of education produces an estimated 1,800 doctorates a year, according to the American Educational Research Association, which is overseeing the three-year study with the National Academy of Education, known as NAED. Such programs, however, have come under scrutiny in recent years, with scholars and policymakers complaining that they suffer from uneven quality and “mission muddle.”
Some of those critics said last week that the new assessment could be a step toward improving the caliber of the programs.
“I think this is really very important,” said Arthur E. Levine, who published a highly critical report on the state of research doctoral programs last year. (“U.S. Faulted on Training of Scholars,” May 7, 2007.)
“The AERA has clout with people who do research in education and NAED is thought of as having some of the best people in the field,” continued Mr. Levine, the president of the Woodrow Wilson Foundation, based in Princeton, N.J., and a former president of Teachers College, Columbia University. “It’s very important for peer review to assert high standards in education research preparation in a way that hasn’t been done before.”
Felice J. Levine, the AERA’s executive director, who is no relation to Mr. Levine, agreed. “We know this will really add to and enhance our field in education and training for generations to come,” she said.
Scheduled to take place over three years at a cost of more than $2 million, the evaluation is intended to parallel a much larger study of research doctoral programs that the National Research Council conducts every 10 years or so. The council—the research arm of the National Academies, the respected Washington-based group that advises the federal government on scientific matters—surveys, ranks, and gathers data on thousands of doctoral research programs across the country. Its latest study, due out next year, includes 5,000 such programs at 212 universities.
Controversy and Credibility
While the NRC’s rankings have sometimes generated controversy, the assessment includes a wealth of data on doctoral students, their professors, and program coursework that institutions can draw on to improve their programs.
Among other program characteristics, the assessment looks at how long it takes doctoral candidates to earn their degrees and where they end up after graduation, and at the numbers of academic citations, publications, grants, and awards that program faculty members generate.
“One of the things that makes the NRC rankings valuable is that they’ve been really credible,” said Chris M. Golde, the associate vice provost for graduate education at Stanford University and the former research director for an extensive study on the education research doctorate conducted by the Carnegie Foundation for the Advancement of Teaching.
The NRC declined, though, to make education research a part of that study. One reason: Some education schools confer the same Ph.D. degree on all their program graduates, regardless of whether those students intend to be superintendents or scholars—a practice that Mr. Levine, in his own critique, labeled “mission muddle.”
“We decided the study was complex enough already,” said Charlotte Kuh, the NRC’s study director.
Study advisers also raised questions about the difficulty of developing a classification system for tracking the dozens of research programs the field offers, which include traditional mainstays such as educational psychology and newer fields of study such as neuroscience and education policy.
In comparison, several experts said, the AERA and NAED may be in a better position to undertake such analyses. The groups’ preliminary study of 1,300 doctoral-level education programs, for example, found more than 900 programs with a primary aim of nurturing new scholars and more than 300 practitioner-oriented programs.
The issue of mission muddle, nonetheless, goes to the heart of concerns about program quality, said Lorrie Shepard, the president of the Washington-based NAED, an invitation-only group composed of the field’s most distinguished scholars.
“By blending both programs, you serve neither purpose well,” said Ms. Shepard, the dean of the education school at the University of Colorado at Boulder. “By the mere act of participating in this study as an institution, you get integrated into the larger conversation going on in the field around this issue.”
The project is being underwritten, for the most part, by the National Science Foundation. Mathematica Policy Research Inc., of Princeton, N.J., the same group that conducts the NRC assessment, is expected to carry out the research, Ms. Levine said.