School Improvement Grant Program Gets Mixed Grades in Ed. Dept. Analysis

By Alyson Klein — November 21, 2013

UPDATED

The U.S. Department of Education’s second annual snapshot of the controversial School Improvement Grant program paints a mixed picture, leaving open the question of whether an eye-popping infusion of federal cash—$3 billion in stimulus funding alone—and some serious federal strings had a dramatic impact on the nation’s lowest-performing schools.

While more than two-thirds of the schools in the first cohort (which started in 2010-11) saw gains in reading and math after two years in the program, nearly a third actually declined, despite the major federal investment, which included grants of up to $500,000 a year, the department said Thursday. And schools that entered the program in its second year (2011-12) didn’t post gains in math and reading as impressive as those the first cohort saw in its first year.

With two years of data now in, it’s worth noting that rural and small-town SIG schools posted bigger gains in math than their city and suburban counterparts—and nearly matched them in reading. When the SIG program was first unveiled, rural advocates worried that the prescribed remedies wouldn’t work for their schools.

Schools that attempted more-dramatic interventions generally saw greater gains than schools that took more flexible approaches. (More on that below.)

U.S. Secretary of Education Arne Duncan said the data show that SIG has pointed long-troubled schools in the right direction.

“The progress, while incremental, indicates that local leaders and educators are leading the way to raising standards and achievement and driving innovation over the next few years,” he said in a statement.

But some researchers had a different take.

The fact that so many schools actually slid backward despite the big federal bucks is “a little bit alarming,” said Robin Lake, the director of the Center on Reinventing Public Education at the University of Washington, which has studied the impact of the SIG program in the Evergreen State.

“Given the amount of money that was put in here, the return on investment looks negligible at this point,” she said. “I don’t know how you can interpret it any other way.”

Among the highlights from the department’s analysis, which you can read in full below:

• Sixty-nine percent of schools that entered the three-year program during its first year saw an increase in math after two years of participation, while 30 percent saw declines and 2 percent demonstrated no change.

• The results in reading for Cohort 1 were similar: 66 percent demonstrated gains, while 31 percent saw declines and 3 percent showed no change.

• Fifty-five percent of schools in the second cohort, which had been in the program for only one year when the analysis was released, showed gains in math, while 38 percent saw declines and 7 percent demonstrated no change. Reading results were similar: 61 percent showed gains, 34 percent saw declines, and 6 percent demonstrated no change.

Overall, schools in the first cohort gained an average of 8 percentage points in math proficiency over the two years, and 5 percentage points in reading. Cohort 2 schools, which had been in the program for a shorter period, went up 2 points in math and just 1 point in reading.

Things unanswered: Big questions still loom, including how these gains are distributed. Last year, the Education Department identified schools whose performance had jumped by double digits. But the way much of this data is presented, it’s impossible to tell whether a few particularly high-performing schools are pulling up the average for schools that didn’t do nearly as well.
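
To see that averaging concern concretely, here’s a minimal sketch with invented numbers (these are not the department’s figures): a few schools with outsized gains can make the mean look healthy even when the typical school barely moves.

```python
# Hypothetical illustration only: invented proficiency-rate changes,
# in percentage points, for ten SIG schools. Not the department's data.
from statistics import mean, median

gains = [0, -1, 1, 0, 2, -2, 1, 0, 35, 40]

print(f"mean gain:   {mean(gains):.1f} points")    # 7.6: looks like broad progress
print(f"median gain: {median(gains):.1f} points")  # 0.5: the typical school is flat
```

Without school-level results, there’s no way to run that kind of check on the department’s numbers.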

It’s important to note that this data compares schools across different states, each of which sets its own bar for what it means to be proficient. So, as the Education Department explains, the “averages” will be heavily influenced by states that set either relatively high or relatively low proficiency standards. And without more granular data, it’s impossible to draw more sophisticated conclusions about where these test-score gains are coming from.

And the analysis doesn’t contain any information about a major chunk of schools that were in the first cohort. For instance, the summary data showing changes in math scores covers just 534 schools. But there were more than 730 schools in Cohort 1. According to the notes provided by the department, the missing schools had changes in state tests or other factors that excluded them from the analysis.

There’s also no school-level data available for the second year of the program. The department released individual school data for the first year, but you have to be the ultimate Excel geek to get it.

“This is just the tip of the iceberg on the information we really need,” said Diane Stark Rentner, the deputy director of the Center on Education Policy, which has done extensive research on the School Improvement Grant program. It will be difficult to draw hard and fast conclusions about the three-year program’s effectiveness until there’s a third year of data, she added.

Plus, it’s unclear whether schools’ gains can be traced to the program itself, or to homegrown turnaround efforts already in progress, Rentner said.

Some background: The School Improvement Grant program required schools to choose one of four turnaround models, all of which involved replacing the principal if he or she had been on the job for more than two years.

Most schools chose the “transformation” model, considered the most flexible, which called for schools to extend learning time and gauge teacher effectiveness by gains in test scores. Another group opted for “turnaround,” which required replacing 50 percent of a school’s staff. Others opted to “restart,” meaning they turned themselves over to a charter-management organization. And a very small handful of schools closed altogether.

There were 835 schools tapped in the first cohort of the program, and another 503 in the second cohort. Schools received grants of up to $500,000 a year for three years.

So which models were most successful? That’s been a major question in SIG implementation. Preliminary results show that, at least for Cohort 1 (the first schools into the program), the more dramatic interventions seemed to yield the biggest gains.

For example, Cohort 1 schools that used “transformation” improved about 6 points in math over two years, and 3 points in reading. Schools in the same cohort using the “turnaround” model jumped 11 percentage points in math and 6 in reading over the same period. And schools using the “restart” model gained 9 points in math and about 7 in reading.

Of course, it’s notable that the more dramatic models were also the least frequently used. Roughly 74 percent of schools in the first cohort used “transformation,” while just 16 percent used “turnaround” and 6 percent used “restart.”

Political background: The SIG program is almost certain to undergo major changes if and when Congress reauthorizes the Elementary and Secondary Education Act. Congressional Republicans have moved to slash all funding for the program, and it wasn’t included in a bill to reauthorize the ESEA that passed the House with only GOP support earlier this year. And a bill supported only by Democrats that was approved by the Senate education committee in June would give states new turnaround options, including allowing them to submit turnaround ideas to the U.S. Secretary of Education for approval.

Rep. John Kline, R-Minn., the chairman of the House education committee and the author of the House GOP legislation, said in a statement that the data support the House’s approach to take turnarounds out of federal hands.

“These tepid results underscore the limits of top-down mandates and the need for a new approach to education reform—one that allows state and local leaders to determine the best way to raise the bar in our schools,” he said.

A version of this news article first appeared in the Politics K-12 blog.