In December 2005, a year after President Bush signed the latest version of the Individuals with Disabilities Education Act into law, each state was required to craft a self-evaluation framework of “measurable and rigorous” targets for improving the academic progress of students with disabilities.
By this week, the states must submit a massive amount of data to the federal government explaining how well they met the standards they set for themselves—and continuing a process that has prompted major changes to the way states monitor such students.
The Feb. 1 deadline is for annual performance reports and state performance plans, both created under the 2004 reauthorization of the IDEA. The reports will include information on 20 different “indicator” areas for the part of the federal special education law that focuses on 3- to 21-year-olds, also known as Part B of the IDEA.
The indicators cover such matters as graduation rates, dropout rates, overrepresentation of minorities in special education, and postsecondary outcomes for students with disabilities.
States must also submit data and judge their performance on 14 indicators related to infants and toddlers with disabilities, who are covered by Part C of the law.
Some of the indicators are based on data states have had to collect for some time, like graduation rates, though the format for reporting the data may have changed. For those indicators, states will be submitting annual performance reports showing their progress toward a target by Feb. 1.
In other cases, states are reporting data for the first time, such as for postsecondary outcomes for students with disabilities. In those cases, states are being asked to submit their “state performance plans,” which will tell the Department of Education how they plan to collect the data, and the targets they plan to meet.
Some of the data-collection requirements have been carried forward from the 1997 version of the special education law. But new methods of analyzing the information have required a thorough overhaul of how states collect, compile, and analyze data on students with disabilities.
The 2004 mandate “is so big, it has redefined part of the core work of the state education agency,” said Jacquelyn J. Thompson, the state director of special education in Michigan.
Priority Areas
In the Michigan education department, employees have been assigned new tasks and the state has had to hire contractors to handle the work. Ms. Thompson, who is also the president of the National Association of State Directors of Special Education, said she hasn’t had time to figure out how much it has cost the state to comply with the federal requirement.
But “it’s been hundreds of thousands, if not millions of dollars” for her state, she said. “When you’re playing around with data, it’s not a casual undertaking.”
The Individuals with Disabilities Education Act requires states to submit annual performance reports to the Department of Education with information on 20 indicators regarding school-age children. Some examples:
Indicator 1: Percentage of youths with individualized education programs in the state graduating from high school with a regular diploma, compared with the percentage of all youths in the state graduating with a regular diploma.
Indicator 2: Percentage of youths with IEPs dropping out of high school, compared with the percentage of all youths in the state dropping out of high school.
Indicator 8: Percentage of parents with a child receiving special education services who report that schools facilitated parent involvement as a means of improving services and results for children with disabilities.
Indicator 9: Percentage of districts with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification.
Indicator 14: Percentage of youths who had IEPs, are no longer in secondary school, and who have been competitively employed, enrolled in postsecondary school, or both, within one year of leaving secondary school.
SOURCE: U.S. Department of Education
A full list of the indicators is available from the Education Department.
By July 1, the Education Department will evaluate the states based on how well they met their goals. The states are also required to evaluate school districts on the same indicators, but face no federally set deadline for doing so.
Those determinations will be released to the public, and over time, a state that has continuing problems meeting its standards risks having some of its federal special education aid withheld.
Alexa Posny, the director of the Education Department’s office of special education programs, said in an interview that the state performance plan process is an “evolutionary” way of assessing how well states educate students with disabilities. She said she hoped states are “really looking at it as an opportunity to look at the data and look at what they’ve been doing right to make sure student needs are being met. When you look at the data it kind of helps you tell the story.”
In addition to collecting the information, the states are each required to come up with a six-year plan for continuous improvement.
The Education Department also plans to use the state information as a way to focus its technical assistance to where it’s most needed, said Ruth Ryder, the director of the division of monitoring and state improvement.
The 2004 renewal of the IDEA established priority areas that states must pay attention to in their self-monitoring efforts. For example, one priority requires states to focus on whether racial and ethnic minorities are overrepresented, based on their proportions of student enrollment in special education categories such as mental retardation or emotional disturbance. The calculations are made both at the state and district level. States are also supposed to ensure that minority-group members who are in those disability categories aren’t placed in self-contained classrooms away from their peers in greater numbers than white students with the same disabilities.
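The article does not spell out the formula states use, but a common way to screen for the overrepresentation it describes is a “risk ratio”: the rate at which one racial or ethnic group is identified for a disability category, divided by the identification rate for all other students. The function and the district figures below are hypothetical illustrations, not a federally prescribed method or threshold:

```python
def risk_ratio(group_identified, group_enrolled,
               others_identified, others_enrolled):
    """Rate of identification for one group divided by the rate
    for all other students; values well above 1.0 may signal
    overrepresentation in that disability category."""
    group_risk = group_identified / group_enrolled
    others_risk = others_identified / others_enrolled
    return group_risk / others_risk

# Hypothetical district: 60 of 1,000 students in one group identified
# with emotional disturbance, versus 90 of 3,000 other students.
ratio = risk_ratio(60, 1000, 90, 3000)
print(round(ratio, 2))  # 0.06 / 0.03 = 2.0
```

The same arithmetic can be run at the state level or district by district, which is why, as state officials note elsewhere in the article, the reporting depends heavily on districts submitting usable enrollment and identification counts.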
The goal of the program is positive, but implementing it has been highly challenging, many state directors of special education say. Officials at the Education Department’s office of special education programs have held monthly conference calls and released several memos to state directors. The department has also funded several resource centers to provide in-depth assistance on the various indicators, such as the National Center on Special Education Accountability Monitoring, based at Louisiana State University’s New Orleans campus.
But crunching numbers to come up with an appropriate, and defensible, result has put a lot of pressure on staff members, some state officials say.
“Basically, the first problem for states was there was no lead time to work on this,” said Paul J. Raskopf, the director of the office of financial and data services at the Virginia Department of Education. “These reporting requirements do not lend themselves to existing student data systems.”
Peg Brown-Clark, the director of special education for Wyoming, said her state was fortunate to have a “pretty good infrastructure in place for our data.”
It just takes time to get individual districts up to speed, she said. For instance, she said, one indicator requires some information on school suspensions that Wyoming districts have not normally kept.
“It just takes at least a year” for all districts to start submitting usable data, Ms. Brown-Clark said.
Useful Results?
The process, meanwhile, has already yielded some positive results. States, and the resource organizations that work with them, say the scope of the work has required a level of collaboration that was not necessary in the past.
“It’s a different kind of requirement, and everyone is learning together,” said Michael Sharpe, the director of the North Central Regional Resource Center, a federally financed center based in Minneapolis that provides special education guidance to the education departments in nine states.
“Our role at the RRC is to put people together and facilitate the brainstorming so they can share these ideas,” Mr. Sharpe said. “Nobody has to be an expert in everything.”
Daniel J. Losen, a senior education law and policy associate with the Civil Rights Project at Harvard University, said he applauded the fact that the reporting process puts minority disproportionality, his particular area of interest, at the forefront of state monitoring of special education. The IDEA has required states to monitor minority overrepresentation since the 1997 version of the law, “but there was no teeth around [the requirement] at all,” he said.
Now, “we’re on the right track. Racial disparities in special education finally is a priority, and everyone is looking at it.”
One key remaining question is whether the collected data will drive instruction in a way that improves the academic progress of children with disabilities. Because the process is so new, state directors say they aren’t yet sure whether every indicator is of equal worth.
“Is this the right direction to go? I would say yes,” Ms. Thompson of the Michigan education department said. “Has it been manageable? I don’t think so. In a couple of years, we’ll be looking at these indicators again because some of them don’t have great utility.”
The ones that are most important, she suggests, are those that deal with graduation and dropout rates, student participation in state assessments, and postsecondary outcomes.
“And the only way we impact those is by working within the greater context of education,” Ms. Thompson said. “It’s not a special ed, it’s an ‘every’ ed issue.”