The number of schools that have failed to make “adequate yearly progress,” or AYP, under the federal No Child Left Behind Act is proving to be a moving target, as states clean up erroneous data and grant appeals to schools that challenge the unwanted label.
In Texas, the share of campuses that met their annual improvement targets rose from 74 percent to 81 percent following numerous appeals. The proportion of Mississippi schools that made adequate yearly progress jumped from about half to 75 percent after the state changed one of its criteria for rating schools. In Illinois, more than 300 schools may have been wrongly identified as not making adequate progress because of coding errors.
Many state officials describe the fluctuating numbers as a temporary problem that will abate as educators gain more experience with the legislation that President Bush signed into law two years ago this week. This school year marks the first time that schools have been identified as making or not making adequate yearly progress based on the complex formula in the federal law, a reauthorization of the Elementary and Secondary Education Act.
Schools receiving Title I money that consistently fail to meet their targets face increasingly severe sanctions, such as a requirement that their students be allowed to transfer. States also must design rewards and penalties for non-Title I schools.

“I think a large part of the problem is that this is a huge data-collection effort, and it’s a new process for states. It’s very complex,” said James A. Watts, the vice president for state services for the Atlanta-based Southern Regional Education Board.
“As a result, in the first go-round, it’s been difficult for states to make certain this data is together enough in an appropriate and accurate fashion,” he said. “That’s why there’ve been revisions.”

States were supposed to identify the schools that failed to make adequate progress based on 2002-03 data before the start of the current school year, so that parents could avail themselves of school choice or supplemental services.

By Dec. 22, states had to provide the U.S. Department of Education with a list of Title I schools identified for improvement, corrective action, or restructuring under the law because they had failed to meet their targets for multiple years. Federal officials said late last month that the “overwhelming majority” of states had submitted those numbers on time, although they did not have a final count.
Participation Rates
Many of the appeals lodged by schools and districts with their states focus on the law’s requirement that schools test 95 percent of their students and 95 percent of those in each of several specific subgroups: racial and ethnic minorities, students from poor families, students with limited English, and those with disabilities.
Some states have given schools leeway in meeting the participation requirement as part of the appeals process, in part because the absence of just a few students can mean the difference between meeting and missing progress goals.
Texas, for example, decided to grant an automatic appeal to any school in which fewer than 10 students overall or in any subgroup failed to participate on test day—even if the school missed the 95 percent mark.
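For illustration only, the short Python sketch below shows how such a participation check might be coded, assuming a simple count of enrolled and tested students per group; the function name and data layout are hypothetical, and the 95 percent target and 10-student cutoff simply mirror the figures described above, not any state’s actual implementation.

```python
# Minimal sketch, not any state's actual calculation: check whether a school
# or student subgroup meets the 95 percent test-participation target, with a
# Texas-style automatic appeal when fewer than 10 students went untested.

def meets_participation(enrolled: int, tested: int,
                        required_rate: float = 0.95,
                        untested_cutoff: int = 10) -> bool:
    """Return True if the group clears the participation requirement or
    qualifies for the automatic-appeal exception described above."""
    if enrolled == 0:
        return True  # no students in this group to test
    if tested / enrolled >= required_rate:
        return True
    # Automatic appeal: fewer than `untested_cutoff` students missed testing,
    # even though the group fell short of the 95 percent mark.
    return (enrolled - tested) < untested_cutoff


# A subgroup of 60 students with 55 tested sits at roughly 92 percent
# participation, but only 5 students went untested, so it would win the
# automatic appeal; a group of 400 with 360 tested (40 untested) would not.
print(meets_participation(enrolled=60, tested=55))    # True
print(meets_participation(enrolled=400, tested=360))  # False
```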
Criss Cloudt, the associate commissioner for accountability and data quality in the Texas Education Agency, said the state decided to adopt a uniform rule to avoid making more subjective judgments about what qualified as a bona fide absence on a case-by-case basis.
“What about the student who’s in a coma, home awaiting a liver transplant?” she said. “We had one school where a group of five or six students got into a fight and all the students were expelled” and, therefore, didn’t take the tests.
Of the 507 appeals the state granted, Ms. Cloudt noted, 95 percent involved schools with test-participation rates of 90 percent or higher. The lowest participation rate was 82 percent. Schools were largely unaware of the participation requirement at the time of testing, she added, because federal officials had yet to approve the state’s accountability plan under the No Child Left Behind law.
Nevada officials have agreed with district decisions to grant appeals to schools that tested at least 90 percent of their students, in part because that’s what the state mandated in the past. The legislature raised the test-participation requirement to 95 percent last June, well after spring testing, to comply with the federal law.
“We had originally requested from the federal government that we allow a 90 percent participation rate this first year,” said Paul M. La Marca, the director of assessments, program accountability, and curriculum for the Nevada education department. “Although they were sensitive to our predicament, they said no. But we did indicate to them that, given the inconsistency in state and federal requirements, we expected it to be an appeals issue.”
“We fully expect that this next year, 95 percent will be the target, and we won’t be granting those appeals,” he added.
California officials have granted additional leeway on the test-participation rule to high schools and to schools with fewer than 100 students. Preliminary results showed that 65 percent of the state’s high schools had failed to make adequate progress solely because of the participation requirement, said William L. Padia, the director of the policy and evaluation division for the state education department.
The state decided to grant appeals to high schools that tested at least 90 percent of their students on the California High School Exit Examination—the test used to comply with the federal law—and at least 95 percent on the California Standards tests in English and mathematics. Schools with fewer than 100 students won their appeals if five or fewer students failed to take part in state tests.
California has yet to release final figures on the number of schools granted appeals.
Federal Monitoring
Whether the U.S. Department of Education will permit such appeals is an open question. “We’re currently in conversations with them about it,” said Ms. Cloudt of Texas. “There’s not a resolution, as far as I know.
“We thought it was pretty clear that the appeals process was in the purview of individual states,” she added.
Similarly, Mr. Padia of California said: “Our perspective is that we had the administrative authority to run the appeals process. As long as we weren’t flying in the face of the law, we were OK.”
Ron Tomalis, the acting assistant secretary for the federal office of elementary and secondary education, said: “We gave great flexibility to the states in putting an appeals process in place with the understanding, or the caveat, that it has to remain consistent with the law. You can’t rewrite the law through an appeals process.”
He said the department would talk with states about the appeals process, as needed, as part of ongoing monitoring efforts.
In Mississippi, the proportion of schools making adequate progress jumped from about half to three-quarters, after the state switched the additional academic indicator it used to rate elementary and middle schools—a move it made with the federal government’s approval.
In addition to rating schools on test-participation rates and the percentage of students who score at or above the proficient level on state tests, states must use at least one additional academic indicator: graduation rates at the high school level, and an indicator of the state’s choosing for elementary and middle schools.
Originally, the state had planned to use as its additional indicator a “growth indicator” measuring how much academic progress students made from year to year. But Susan M. Rucker, an associate superintendent in the Mississippi Department of Education, said that measurement proved too complicated in the context of the federal law, so the state switched to attendance rates.
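As a rough illustration of how those pieces fit together, here is a minimal sketch, assuming hypothetical proficiency and attendance targets and a made-up data layout rather than any state’s actual formula: a school makes adequate yearly progress only if every reported group clears both the participation and proficiency targets and the school meets an attendance-rate indicator of the kind Mississippi adopted.

```python
# Illustrative sketch only; the proficiency and attendance targets below are
# placeholders, not figures from the law or any state's accountability plan.
from dataclasses import dataclass

@dataclass
class GroupResult:
    name: str        # "all students" or a subgroup such as "limited English"
    enrolled: int
    tested: int
    proficient: int  # students scoring at or above the proficient level

def group_ok(group: GroupResult,
             participation_target: float = 0.95,
             proficiency_target: float = 0.40) -> bool:
    # A group with no students cannot hold the school back.
    if group.enrolled == 0:
        return True
    participation = group.tested / group.enrolled
    proficiency = group.proficient / group.tested if group.tested else 0.0
    return participation >= participation_target and proficiency >= proficiency_target

def school_makes_ayp(groups: list[GroupResult],
                     attendance_rate: float,
                     attendance_target: float = 0.93) -> bool:
    # The test is conjunctive: one group missing either target keeps the
    # entire school from making adequate yearly progress.
    return all(group_ok(g) for g in groups) and attendance_rate >= attendance_target
```

Because the check is conjunctive across every subgroup, a single group that misses the participation target by a handful of students can flip the whole school’s rating, which is the dynamic behind the appeals described above.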
Coding Problems
A bigger challenge in Minnesota and other states has been cleaning up coding mistakes and other data errors.
In Illinois, coding errors related to test-participation rates for student subgroups may have led to the mistaken identification of more than 300 schools as not having made adequate progress. For instance, some students who were identified as low-income on enrollment data may not have been identified as such on test forms.
The state board of education announced last month that the schools could recheck their data and have the results recalculated. The state received more than 200 phone calls and dozens of letters and e-mails from administrators questioning the AYP calculations.

Illinois schools also threw out thousands of tests taken by students who enrolled after Sept. 30, a policy designed to avoid penalizing schools with high student mobility. Some say the policy has led to inflated test scores at some schools.
While states had hoped to cross-check results in private, before the numbers were publicly released, “sunshine laws” in some states made that impossible.
Minnesota gave schools 30 days to make changes to their data after the state education department drew up its preliminary list. The state even set up an online system to help work through those changes.
“There were 14,000 individual data changes made by schools during that 30 days,” said Bill T. Walsh, the director of communications for the state agency.
As the deadline for releasing the data drew nearer, however, newspapers began asking to see the list. Mr. Walsh sought the opinion of the state attorney general’s office, which confirmed that even the preliminary data were public under state law.
“It was a little bit painful,” Mr. Walsh said. “The frustrating part, as a state agency, was a school might have been identified due to errors the school made in inputting the data. But it looked like the state put out a faulty list.”
In the end, after correcting their data, 93 Minnesota schools were removed from the list of those failing to make adequate progress. Based on that experience, the department and the governor’s office are seeking state legislation that would exempt the preliminary AYP calculations from the state’s open-records laws.
First-Year Woes
State officials in Minnesota and elsewhere hope that the problems represent understandable first-year glitches.
“It certainly hurt the credibility of the law a little bit, but I think in Minnesota, it’s a one-year problem,” Mr. Walsh said. “In the past, there was not a lot of incentive for schools to be perfectly accurate with where Johnny and Julie took their tests, and whether they were limited-English-proficient or not. The stakes weren’t high on keeping very good numbers.”
Now, that’s changing. “What’s happening is, since it’s so high-stakes for everybody, data that the schools normally wouldn’t correct, they’re going back through the test publisher and correcting—at substantial costs to them to do this,” Mr. Padia of California said.
“So, until the system settles down, it’s going to be real tough at first, which is why we’re trying to be as generous as possible in the early couple of years,” he added. “Schools are reeling from postings and appeals and all this stuff.”
Staff Writer Joetta L. Sack contributed to this report.