The best way to get a handle on slippery dropout and graduation rates across the nation, a federal task force has concluded, is for states and federal statisticians to work together in devising data-collection systems that can track individual students throughout their high school years.
The report, “National Institute of Statistical Sciences/Education Statistics Services Institute Task Force on Graduation, Completion, and Dropout Indicators,” is available online from the National Center for Education Statistics.
That call is the central recommendation of a long-awaited report released last week by the Task Force on Graduation, Completion, and Dropout Indicators. Made up of 10 academics and government statisticians, the panel was formed in the fall of 2003 to advise the U.S. Department of Education’s chief statistical agency on ways to improve its reporting on schools’ progress in helping students earn high school diplomas.
But federal officials and independent experts said last week that the task force is placing a tall order on states and the National Center for Education Statistics. Though student-level longitudinal tracking systems may be considered a gold standard for measuring education progress, fewer than a dozen states have such systems in place, experts said. Establishing more of them, they added, could take years and millions of dollars.
“This is a very important report,” said Christopher B. Swanson, a researcher with the Urban Institute, a Washington-based think tank. “But I’m concerned that the task force is putting all its water in one bucket, and there may be some holes in that bucket.”
Incentives to Cheat
The need for accurate graduation and dropout rates has become more urgent with the passage of the No Child Left Behind Act, which holds schools and districts accountable for improving those rates. The 3-year-old law does not, however, require states to report those figures in any particular way.
As a result, states use a variety of methods, some of which the task force says are misleading and others that inadvertently provide what it calls “perverse incentives” for schools to cheat.
• The National Center for Education Statistics should take the lead in encouraging states to set up student-tracking systems to get a better handle on dropout and graduation rates.
• A federal student-tracking system would not be economically or politically feasible.
• Simpler, less reliable methods exist for counting graduates and dropouts, but the NCES should not direct states to use them in the short term while better approaches are being devised.
• No indicator of high school graduation or dropout rates should be considered perfect.
• Improving school-completion counts is just one of many reasons for states to work with the statistics agency to build longitudinal student-tracking systems.
SOURCE: Task Force on Graduation, Completion, and Dropout Indicators
One example: When school officials are not required to track whether students who transfer ever graduate, the officials might encourage students who are likely to drop out to move to other schools, in an effort to make their own schools look better on paper.
That point was underscored last year in Houston when school officials were accused of misrepresenting their dropout data to state officials. (“Houston Case Offers Lesson on Dropouts,” Sept. 24, 2003.)
The task force says another impetus for its work was the growing number of ways students can complete high school, such as online learning and home schooling, and the difficulty of tracking how many students take those nontraditional routes.
Also, the National Center for Education Statistics, or the NCES, relies on state-reported statistics and federal census data in its own reporting on dropout, completion, and graduation rates.
According to Grover J. “Russ” Whitehurst, the director of the Education Department’s Institute of Education Sciences, which oversees the NCES, some of those rates are now calculated by comparing school enrollment figures for the fall and the spring of every year of high school and then extrapolating a dropout rate from the accumulated differences.
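One plausible reading of that enrollment-comparison method can be sketched in code. The figures and the exact accumulation rule here are hypothetical; the article does not give the NCES formula itself:

```python
# Hypothetical sketch of the enrollment-comparison method: treat each
# grade's fall-to-spring enrollment drop as attrition, multiply the
# per-grade retention fractions together, and report the remainder as
# an extrapolated dropout rate. All figures are invented.

fall_enrollment   = {9: 1000, 10: 940, 11: 900, 12: 870}
spring_enrollment = {9: 960,  10: 915, 11: 880, 12: 855}

retained_fraction = 1.0
for grade in sorted(fall_enrollment):
    retained_fraction *= spring_enrollment[grade] / fall_enrollment[grade]

dropout_rate = 1.0 - retained_fraction
print(f"Extrapolated dropout rate: {dropout_rate:.1%}")
```

The weakness the task force identifies is visible in the sketch itself: every enrollment drop is counted as attrition, so the method cannot tell a dropout from a student who transferred elsewhere.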
A better way to do it, the report says, would be to use a formula that, in its simplest form, divides the number of students who graduate in a given year by the number who entered high school three or four years earlier. It should also, however, take into account students who transfer in and out, as well as those who might be excluded from the graduation count for reasons other than dropping out.
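That adjusted-cohort formula might be written out as follows; the counts and parameter names are hypothetical, since the report is described here only in outline:

```python
# Sketch of a cohort graduation rate adjusted for student mobility.
# All figures and field names are invented for illustration.

def cohort_graduation_rate(entering_cohort, transfers_in, transfers_out,
                           other_exclusions, graduates):
    """Graduates divided by the adjusted cohort: students who entered
    grade 9, plus those who transferred in, minus those who transferred
    out or left for reasons other than dropping out."""
    adjusted_cohort = (entering_cohort + transfers_in
                       - transfers_out - other_exclusions)
    return graduates / adjusted_cohort

rate = cohort_graduation_rate(entering_cohort=1000, transfers_in=50,
                              transfers_out=80, other_exclusions=10,
                              graduates=820)
print(f"Adjusted cohort graduation rate: {rate:.1%}")
```

The adjustments are the point: without verified transfer and exclusion counts, a school has room to shuffle likely dropouts off its books, which is the “perverse incentive” the task force warns about.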
To formulate those kinds of figures, states would have to put systems in place that assign identifying numbers to students and track them throughout their high school careers—even when they transfer to other schools in the same district or across the state. Tallying up the numbers that way, the report says, could provide more definitive data on whether students are dropping out, graduating, or completing their schooling some other way.
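A minimal illustration, with an invented data model, of why statewide identifiers change the picture: a transfer shows up as a new enrollment record under the same ID, rather than as a student lost to the first school.

```python
# Hypothetical longitudinal records keyed by a statewide student ID.
# Event codes and school names are invented for illustration.

records = [
    # (student_id, school, event)
    ("S001", "North HS", "enroll"),
    ("S001", "North HS", "graduate"),
    ("S002", "North HS", "enroll"),
    ("S002", "East HS",  "enroll"),    # in-state transfer, same ID
    ("S002", "East HS",  "graduate"),
    ("S003", "North HS", "enroll"),
    ("S003", "North HS", "dropout"),
]

# The last terminal event per ID gives each student's statewide outcome.
final_status = {}
for student_id, school, event in records:
    if event in ("graduate", "dropout"):
        final_status[student_id] = event

# Student S002 counts as a graduate statewide, not as a loss at North HS.
```

As Mr. Haney notes later in the article, even this approach has a blind spot: a student who moves to a school in another state simply stops appearing in the records.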
Short-Term Quandary
While the department cannot legally force states to put those kinds of systems in place, the report concludes, it can take the lead in prodding states to establish them and in assisting in those endeavors.
Mr. Whitehurst said the institute was “eager” to respond to the task force’s call. He noted that Congress last month authorized $25 million in grants to help states design better data-collection systems aimed at tracking individual students’ achievement growth as well as their progress through school systems.
Even so, Mr. Whitehurst noted, the report’s estimate that most states will have such systems in place within four years may be unrealistic.
“There is a remaining question of what to do in the short term to help states comply with the [No Child Left Behind] law and to help the public have the best available information even though it’s not perfect,” he said.
The report does suggest other, less reliable calculation methods states can use in the meantime, but it also urges the federal statistics agency not to order states to adopt them, warning that doing so could be “confusing and wasteful to states.”
In the final analysis, though, said Walter M. Haney, a Boston College education professor, even statewide, student-level longitudinal tracking systems are not completely accurate. They cannot, for instance, account for students who move to schools in other states. His own studies suggest that a state’s school population can fluctuate by as much as 15 percent over five years.
For his part, task-force member Russell W. Rumberger said the panel recognized early on that its call for statewide longitudinal tracking systems was ambitious but not impossible.
“But we can definitely encourage it by saying it’s the best way because existing systems are not accurate,” added Mr. Rumberger, an education professor at the University of California, Santa Barbara. “My argument is that school districts already know, for every kid who walks in the door, when he walks in and when he walks out, and what grade level he’s in.”