State tests of student achievement echo state-level trends on the National Assessment of Educational Progress—considered the nation’s gold standard of academic testing—more closely than has been generally recognized, according to a new study.
In a report issued last week, the Center on Education Policy found “more agreement than is commonly acknowledged” between trends on test scores in 23 states and those on NAEP.
Sixty-seven percent of the states studied showed progress on both state tests and the national assessment in 4th grade reading between 2005 and 2009, with 76 percent showing progress in 8th grade reading, 79 percent in 4th grade mathematics, and 90 percent in 8th grade math, the report says.
The study also found, however, that while trends were positive on both types of tests, the gains were usually far larger on the state tests than on NAEP. And some experts argued that upward trends on both tests have limited significance. More significant, they said, is the fact that the rates of improvement on state tests outstrip those on “the nation’s report card.”
A new study compares trends on state tests and on NAEP from 2005 to 2009.
95%: Of the 21 states with sufficient state test data in grade 8 reading, 20 showed gains between 2005 and 2009 in the percentage of students reaching the proficient level on state tests.
81%: Seventeen of these 21 states showed gains during this period in the percentage of students reaching the basic level on NAEP.
SOURCE: Center on Education Policy
The CEP, a research group based in Washington, examined trends in the tests that states use for federal-accountability purposes under the No Child Left Behind Act in the 23 states that had sufficient comparable data. The study analyzes students’ average scores on state tests and NAEP and compares trends in the percentage of students scoring at the “proficient” level on state tests with the percentage scoring at the “basic” level on NAEP.
The comparison of states’ proficient benchmark with NAEP’s basic one was informed in part by a study of 47 states last year by the U.S. Department of Education’s National Center for Education Statistics, which found that most states’ definitions of proficiency were closer to NAEP’s basic level than to NAEP’s proficient level. That study also suggested that many states might have lowered their student-proficiency bars to meet the requirements of the NCLB law. (“Test Rigor Drops Off, Study Finds,” November 4, 2009.)
Upward Bound
In the CEP study, researchers found that in most states, a rise in state test scores coincided with rising NAEP scores. That was true when students’ average scores were examined, as well as when the percentage scoring proficient on state tests was compared with the percentage scoring basic on NAEP.
“In nearly all cases, trends went up on both assessments,” the report says. “Upward trends on both the state test and NAEP in the same state offer stronger evidence that students are mastering higher levels of knowledge and skills.”
The study noted, however, that the states with the largest increases in their own test scores were often not the same states that saw the biggest gains on NAEP. Its authors say that the larger gains in state tests could be explained by score inflation that was driven by “teaching to the test” or by the possibility that state tests are aligned more closely to state standards. Students’ motivation could also factor into the results, they say, since the national assessment is a no-stakes undertaking for students, teachers, and schools.
The authors note key differences between the state tests and NAEP, such as their content, their proficiency cutoff scores, and what they measure. State tests reach nearly all students, for instance, while NAEP assesses only a representative sample of its target group.
‘Lack of Agreement’
Andrew D. Ho, an assistant professor of education at Harvard University’s Graduate School of Education, said he did not see the agreement rates between state tests and NAEP as high. By his own calculations, the rate was 78 percent across all states, subjects, and grade levels, when agreement by sheer chance alone would be expected about 76 percent of the time. It isn’t meaningful, he said, to focus on agreement rates when the trends on both types of tests happen to be positive.
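Mr. Ho’s chance-agreement baseline follows from simple probability: when nearly all states trend upward on both tests, two unrelated indicators will “agree” most of the time by accident. The sketch below is a minimal illustration of that arithmetic, not Mr. Ho’s actual calculation; it assumes the two trend directions are independent and borrows the grade 8 reading marginals cited above (20 of 21 states up on state tests, 17 of 21 up on NAEP).

```python
# Expected agreement between two trend indicators by chance alone,
# assuming each state's trend direction on its own test is independent
# of its trend direction on NAEP (a simplifying assumption).
# Marginal rates are the grade 8 reading figures from the CEP report:
# 20 of 21 states gained on state tests; 17 of 21 gained on NAEP.

p_state_up = 20 / 21  # share of states trending up on state tests (~95%)
p_naep_up = 17 / 21   # share of states trending up on NAEP (~81%)

# The two indicators "agree" when both trend up or both trend down.
chance_agreement = p_state_up * p_naep_up + (1 - p_state_up) * (1 - p_naep_up)

print(f"Expected agreement by chance alone: {chance_agreement:.0%}")  # ~78%
```

Because almost every state trends upward on both measures, the chance baseline lands near the observed agreement rate, which is the substance of Mr. Ho’s objection: high agreement carries little evidential weight on its own.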
More meaningful, Mr. Ho suggested, is the study’s finding that the rate of progress on state tests far outstrips that on NAEP.
“The take-home point isn’t about agreement, but the surprising lack of agreement about the amount of progress being made,” he said.
The study is notable for confirming, with NAEP trend data, the progress being made on state tests, said Jack Jennings, the president of the Center on Education Policy. But it’s equally valuable for its findings on how some states’ test progress diverges from that on NAEP, he said.
“What we’ve seen overall is an affirmation by NAEP of the general trend in state test scores. That’s encouraging,” he said. “But it’s still worth raising questions about why states see different results [from the national assessment].”