Three years after students in Puerto Rico took their first crack at a specially designed version of the National Assessment of Educational Progress, their scores have yet to be made public, a delay that has frustrated some members of NAEP’s policy-setting board.
The test being given in the U.S. commonwealth also breaks new ground for NAEP. It is the first version of “the nation’s report card” written entirely in Spanish for students taught primarily in that language, a step intended eventually to allow the scores of students there to be compared with those of students in the 50 states. But crafting the test in another language has brought a host of challenges.
NAEP was first given in Puerto Rico in 2003, then again in 2005. When officials at the National Center for Education Statistics, which administers the test, conducted an internal review of the 2003 results, they found that students had answered questions correctly only 25 percent of the time, on average, according to information released by federal officials in March. Students also skipped large numbers of items: As many as 80 percent of test-takers left some questions blank. Many of those problems occurred again in 2005.
Held to the Same Standards
The No Child Left Behind Act requires that Puerto Rico take part in NAEP to remain eligible for federal Title I funding targeted at public schools with large numbers of disadvantaged children. A sampling of students throughout the United States must take the assessment every two years in math and reading.
As a U.S. commonwealth, Puerto Rico has the right to set policy for its schools and curricula, although it is also expected to comply with some federal policies, including the No Child Left Behind Act.
3.9 million residents
604,000 total public school students
1,538 public schools
28 percent of Puerto Rico’s school funding comes from federal sources, roughly three times the average for the 100 largest school districts in the United States.
$3,404 is Puerto Rico’s average per-pupil spending, compared with $6,606 in the largest U.S. school districts.
47 percent of households in Puerto Rico are considered to be living in poverty, a higher percentage than in any of the largest school districts in the 50 states.
96 percent of students in Puerto Rico attend Title I-eligible schools, compared with 51 percent of students in the largest school districts in the 50 states.
$466 million was the amount Puerto Rico received in Title I aid in fiscal 2005.
SOURCE: U.S. Department of Education
Because of the difficulty of translating a test of English-reading skills into Spanish, federal officials are requiring students in Puerto Rico to take part only in the math NAEP. In 2002, U.S. officials said the 2003 trial would be used solely to determine the feasibility of offering comparisons between students in Puerto Rico and in the 50 states.
Officials at the NCES, the primary statistics arm of the U.S. Department of Education, are confident the results from the 2003 and 2005 exams will be released by late this summer, said Peggy Carr, an associate commissioner of the NCES.
NCES officials, after reviewing the Puerto Rico test results from 2003 and 2005, have nearly resolved their doubts about how to interpret them, Ms. Carr said. “We’re not as concerned as we were before.”
Yet the delays have frustrated some members of the National Assessment Governing Board, including Luis A. Ramos, who questions whether the scores from Puerto Rico’s students are being withheld simply because they are low.
“It is incumbent upon [us] that when we conduct the assessment, we report the results,” said Mr. Ramos, a native of the island, in an interview. “NAEP’s proven itself to be a great yardstick for students in the United States. … I want Puerto Rico to be held to the accountability standards of No Child Left Behind.”
Lost in Translation
The NCES crafted the original 2003 Puerto Rico test with help from a bilingual advisory committee, some of whose members had lived in the commonwealth. After reviewing the problems on the first test, those experts also enlisted teachers working in Puerto Rican schools to help with the translation. In addition to making changes in the vocabulary and wording of some questions, officials gave students more time on the 2005 test than their counterparts in the 50 states. Yet the problems persisted.
John Stevens, another governing-board member, said he is disappointed that the scores have not been released and that the board and the NCES need to know more about how motivated students in the commonwealth are to take the exam. “It’s a different place,” he said. “To what extent have we engaged people in Puerto Rico?”
Translating math tests can be as difficult as, if not more difficult than, translating tests in other academic subjects, said Guillermo Solano-Flores, a professor of education at the University of Colorado at Boulder. (“Math: the Not-So-Universal Language,” July 13, 2005.)
Sometimes, the problem is the arcane vocabulary of math, he said. In Spanish, test designers may seek to break up questions into different sentences, or repeat certain words, to make the translation clearer. But those changes can inadvertently make the question easier or harder, he said.
“That item loses its assessment value,” said Mr. Solano-Flores, who has studied math and science translations.
José A. Rivera, the NAEP coordinator for Puerto Rico’s education department, said he believes a lack of alignment between the test and the curriculum could be a factor in students’ answering questions incorrectly or leaving them blank. Puerto Rican officials are undertaking a major effort to craft stronger curricula and standards in schools and link them with assessments. Approximately 2,800 4th and 8th graders at 105 schools in Puerto Rico took the 2005 NAEP.
Re-Evaluation Common
Officials elsewhere have struggled to motivate students to take the NAEP tests seriously, particularly at the high school level. But Mr. Rivera believes 4th and 8th graders in Puerto Rico want to do well on the exam.
“The Puerto Rico Department of Education and their students take NAEP seriously,” he wrote in an e-mail response to questions from Education Week. “This is not a factor in driving down scores.”
Two international tests, the Program for International Student Assessment, or PISA, and the Trends in International Mathematics and Science Study, or TIMSS, have translated exam questions across many languages, including Spanish and English.
Albert E. Beaton, who directed the 1995 TIMSS worldwide, an exam translated into numerous languages, said it is not uncommon for test designers to re-evaluate results if scores from one population come in far below the norm, or if students skip many questions.
The goal of large-scale tests like NAEP, Mr. Beaton said, is not simply to know that a student population is scoring above or below the norm, but how far above or below. “I would take the conservative approach,” he said about deciding when to release scores.