The Department of Education will give every state an incomplete on its reading performance when it releases “the nation’s report card” this week.
But the anticlimactic grade didn’t come about because states failed to turn in their homework--it came about because the department’s contractor erred in its first attempt to grade it.
Even though “the impact of the error appears to be minimal,” said Pascal D. Forgione Jr., the commissioner of the National Center for Education Statistics, the department will not release a state breakdown of the 1998 National Assessment of Educational Progress “Reading Report Card” on Feb. 10.
Instead, the NCES will go ahead with its news conference, as planned, to outline the results of national trends, but will withhold the state breakdowns until the contractor recalculates and ensures state numbers are accurate.
“We should release data in a timely manner,” said Mr. Forgione, whom federal law vests with the power to unveil NAEP scores. “Given this [error] doesn’t impact the national results, I felt we should go ahead with the release.”
The 1998 reading results are being eagerly awaited to inform one of the hottest education debates of the decade: whether to emphasize literature and comprehension or phonics--the sounding out of letters and words--when teaching children to read.
The NAEP reading exams, which are the only national assessment of achievement in that subject, were the first set given since 1994. In that administration, 40 percent of 4th graders scored below NAEP’s “basic” level. Thirty percent of 8th graders and 25 percent of 12th graders failed to reach basic. High school seniors’ scores dropped between 1992 and 1994, but 4th and 8th grade results remained about the same.
Since the release of those scores, political leaders throughout the country, President Clinton included, have repeatedly called for a renewed emphasis on reading instruction, especially in the early grades. Numerous states have passed legislation or adopted regulations spelling out how educators should teach the subject.
As a result, many policymakers had looked to the release of these latest state data to judge whether their efforts were on track, even though little time had passed since the changes were made.
For his part, Mr. Clinton has used the earlier scores in proposing a reading initiative and a national 4th grade reading test.
Secretary of Education Richard W. Riley plans to include the NAEP scores in a report recommending ways to teach reading that he had scheduled for release along with the test results. But that report also will be delayed, probably until the state-by-state results are published, according to Julie Green, Mr. Riley’s press secretary.
State officials had been given the incorrect scores to review last month before the error was discovered, Mr. Forgione said.
Twice Spotlighted
Though disappointed, the chairman of the National Assessment Governing Board, known as NAGB, the independent panel that oversees NAEP, agreed to release the data in two stages. “It didn’t seem right to sit on [the national information] if it’s available,” said Mark D. Musick, NAGB’s chairman and the president of the Southern Regional Education Board in Atlanta.
With two releases, Mr. Forgione said, the statistics center will be able to highlight separate storylines. This week, it will offer a glimpse of how the nation’s performance compares with results from similar exams administered in 1992 and 1994. When the state-by-state data become available later this month or in March, the department can give a detailed picture of the 34 participating states, which will have longitudinal data to compare over the four-year span.
“We are doing all we can to make sure we come out with [state results] as quickly as possible,” Mr. Forgione said.
While states study their NAEP scores and compare them with those of other states, they don’t urgently need them, according to one state official.
In Maryland, the state testing program is the main source of achievement data, noted Ronald A. Peiffer, an assistant state superintendent.
“The NAEP scores help us anchor our state results on a national level,” Mr. Peiffer said. The delay in their release “doesn’t hurt us.”
Ambiguous Roles?
Problems with scoring and calculating NAEP results are not new. In 1995, the NCES discovered computer errors in previously published results on the 1992 and 1994 exams.
For this latest test, Westat, the Rockville, Md.-based contractor in charge of scoring the assessment, used the wrong numbers to account for differences in the population of test-takers.
Because NAEP samples achievement, with each student taking only a portion of the assessment, statisticians must adjust the results using weighting formulas that level the playing field.
NCES and Westat officials discovered the mistake late last month when they noticed that too many states failed to score above the national average. Upon reviewing the data, Westat realized it had used the incorrect formula to adjust state scores based on the numbers of students with disabilities or whose English proficiency was limited.
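The adjustment at issue is, at bottom, a weighting problem: subgroup results are combined according to each subgroup’s share of the student population, so plugging in the wrong shares shifts a state’s average. The following is a minimal sketch, not NAEP’s actual methodology; every subgroup name, score, and share in it is invented for illustration.

```python
# Hypothetical illustration only: NAEP's real procedures are far more
# elaborate, and these subgroup names, means, and shares are invented.

def weighted_average(group_means, group_shares):
    """Combine subgroup mean scores using population-share weights."""
    assert abs(sum(group_shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(group_means[g] * group_shares[g] for g in group_means)

# Invented subgroup mean scores for one state's sample.
means = {"general": 220.0, "disabilities": 190.0, "limited_english": 185.0}

# Correct population shares vs. an erroneous set that overstates the
# shares of students with disabilities or limited English proficiency.
correct_shares = {"general": 0.88, "disabilities": 0.07, "limited_english": 0.05}
wrong_shares = {"general": 0.80, "disabilities": 0.12, "limited_english": 0.08}

print(weighted_average(means, correct_shares))  # ~216.15
print(weighted_average(means, wrong_shares))    # ~213.60 (understates the state)
```

As the sketch suggests, misstating a subgroup’s population share by even a few percentage points can move a state’s average enough to change how it compares with the national figure.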
Westat will pay the cost overruns created by the blunder, Mr. Forgione said.
“It reinforces to me that NAEP is a very complex assessment model,” he said. Each step of the process is intricate and, if not done exactly right, can skew results, he explained.
The events of the past two weeks are also a sign of the ambiguous roles of the Education Department and the assessment’s independent governing board, Mr. Musick said.
Congress created the board to remove NAEP policy-setting from the political appointees who run the federal department and to buffer the testing program from partisan policymakers.
But federal law designates the NCES commissioner as the one who decides how and when NAEP data are released. The governing board must approve that plan.
“For the most part, the board goes along with it,” Mr. Musick said.
In this case, Mr. Forgione consulted Mr. Musick and board staff members, and recommended that the national scores be released before the state scores.