A report from a progressive think tank measuring the “educational productivity” of more than 9,000 school districts around the country says that districts getting the most for their money tend to spend more on teachers and less on administration, partner with their communities to save money, and have school boards willing to make potentially unpopular decisions, like closing underenrolled schools.
The study, from the Washington-based Center for American Progress, attempts to measure district productivity nationwide. Almost every K-12 school district in the country with more than 250 students was included, and the information has been compiled into a website that allows users to compare districts within states.
The attempt to drill down on productivity—what districts are getting in terms of student achievement in math and reading for their education dollar—is particularly appropriate now, as relief to districts from the federal economic-stimulus program is petering out, and an economic upswing is not on the horizon, said Ulrich Boser, a senior fellow at the center and the report’s author. The report is part of a series from the center examining government accountability and efficiency.
The analysis is intended to encourage a more sophisticated discussion rather than just suggesting district funding should be cut in the name of encouraging efficiency, Mr. Boser said.
“Do we pretend that this problem [of inefficiency] doesn’t exist, so we don’t enter into this conversation? I think the answer is no,” he said. “In education, we think about achievement on one side, and spending on the other, and we need to marry that.”
One district the center singled out for productivity was the Harlan Independent School District, which has around 950 students and is located in a coal-mining community in southeastern Kentucky. Based on the center’s measures, the district is doing a good job getting strong academic results with its predominantly low-income student population. Most of the other districts deemed efficient were larger and wealthier, making Harlan Independent an outlier.
“States ought to pay very close attention to this,” said Superintendent David Johnson, who noted that 75 percent to 80 percent of his students go on to college in a community where 40 percent of the population doesn’t have a high school diploma. “What impact are we getting from the investment we’re making in education? I think that’s a great question to ask,” he said.
[Interactive graphic: A district-by-district evaluation of educational productivity. SOURCE: Center for American Progress]
Mr. Johnson said that his district had not made specific changes for the sake of productivity, other than maintaining “an expectation of academic performance.”
But the report did single out practices of other districts, like the Franklin district in Franklin, Mass., a 6,200-student system that merged its technology department with the town’s to save money. The 600-student Poyen district in Poyen, Ark., partners with a community college to offer its students free college courses.
But some researchers caution that this type of analysis is so complicated that policymakers must be careful about drawing specific conclusions. “There’s lots of people who have tried to do analyses like this,” said Robert Bifulco Jr., an associate professor of public administration at Syracuse University in New York.
“I would start with a skeptical mind when it comes to policy results,” said Mr. Bifulco, who had not seen the center’s report.
The report itself offers a range of caveats, noting that researchers cannot account for all the variables outside the control of a district, or for flaws in the databases they used.
“Despite these caveats, we believe our evaluations are useful, and the best available, given existing traditions and knowledge,” the report states.
However, it does draw some policy suggestions from the data—some of which are an “absurd stretch of logic,” according to Bruce D. Baker, an associate professor of educational policy at New Jersey’s Rutgers University who provided technical advice to the center. For example, the report does not adequately account for the extra resources it takes to educate disadvantaged students, he said, and it could give more ammunition to people who want to cut school funding.
“It was a given that poor urban districts would be found inefficient,” Mr. Baker said. “The ones that don’t look inefficient are the ones that are dreadfully underfunded.”
Three Perspectives
The center’s analysis offers three ways of looking at district productivity, each of which offers slightly different results.
The report uses 2007-08 spending data and state reading and math test results for the 2007-08 school year. Because state assessments vary across state lines, district efficiency can only be compared within any one state. Also, the District of Columbia, Hawaii, Alaska, Montana and Vermont were not included in the analysis. The District of Columbia and Hawaii are single-district jurisdictions; Montana and Vermont did not have enough comparable districts, and Alaska was excluded because the authors could not sufficiently adjust for cost-of-living differences within the state.
The “basic return on investment” measure rates school districts on how much academic achievement they get for each dollar spent, relative to other districts in the state. Adjustments are made for students deemed more expensive to educate than their peers in general education: special education students, students who are eligible for free or reduced-price lunches, and English-language learners.
The “adjusted return on investment” is similar to the basic measure, but it uses a different analysis to be more sensitive to spending differences within states.
Finally, a “predicted” efficiency rating attempts to gauge how much more or less achievement a district produced, compared with what would be expected of a district with the same amount of spending and student demographics.
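The article’s description of the “basic return on investment” measure — achievement per dollar, after adjusting for students who cost more to educate — can be sketched in rough pseudocode-style Python. This is an illustrative interpretation only: the report’s actual formulas, weights, and achievement index are more involved, and the specific weight values and district figures below are hypothetical placeholders, not taken from the study.

```python
# Illustrative sketch of a "basic return on investment" style measure.
# All weights and figures here are hypothetical assumptions, not the
# Center for American Progress report's actual methodology.

def weighted_cost_index(enrollment, special_ed, frl, ell,
                        w_sped=1.9, w_frl=0.4, w_ell=0.4):
    """Adjust enrollment upward for students deemed more expensive to
    educate: special education students, free/reduced-price-lunch
    students, and English-language learners. The extra weights (e.g.,
    0.9 additional per special-ed student) are placeholder values."""
    return (enrollment
            + (w_sped - 1.0) * special_ed  # extra cost beyond a general-ed seat
            + w_frl * frl
            + w_ell * ell)

def basic_roi(achievement_index, spending, cost_index):
    """Achievement per need-adjusted dollar. Because state tests differ,
    this number is only comparable among districts in the same state."""
    spending_per_weighted_pupil = spending / cost_index
    return achievement_index / spending_per_weighted_pupil

# Hypothetical district: 1,000 students, a $10 million budget, and an
# achievement index of 75 on its state's reading and math exams.
ci = weighted_cost_index(1000, special_ed=120, frl=500, ell=60)
roi = basic_roi(75, 10_000_000, ci)
```

Ranking districts within a state by a figure like `roi` captures the basic measure’s logic; the “adjusted” and “predicted” variants the article describes would layer different statistical treatments of spending and demographics on top of the same idea.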
The results calculated in this study suggest that high-spending districts are often inefficient. The report notes that only 17 percent of the Florida districts in the top third in spending were also in the top third in achievement. Also, students from disadvantaged backgrounds nationally were more likely to be enrolled in highly inefficient districts, even taking into account that such students tend to cost more to educate.
The interactive website that accompanies the report allows some interesting comparisons, both in using the different measures in one district, and in comparing different districts with one another.
For example, the nation’s largest school district, New York, earns a fairly low ranking on both basic and adjusted return on investment, suggesting that it spends a lot of money but doesn’t get very good test scores in return.
However, in “predicted efficiency”—how well New York would compare with other districts with similar demographics—the district earns high marks.
In comparing districts with each other, the report gave the example of Eau Claire and Oshkosh in Wisconsin. The districts are about the same size—Eau Claire has around 10,800 students and Oshkosh around 10,200 students. They serve similar student populations and get largely similar results on state exams. However, Eau Claire’s total expenditures are about $8 million more per year than those of Oshkosh, which spends about $110 million a year to run its district.