As state leaders and education advocates weigh evaluating U.S. students against international benchmarks, a new report argues that one prominent test, PISA, is flawed and may not be appropriate for judging American schools by global standards.
The author, Tom Loveless, a senior fellow at the Washington-based Brookings Institution, also contends that questions on the Program for International Student Assessment’s surveys of students’ beliefs and attitudes about science reflect an ideological bias that undermines the test’s credibility.
He cites an example from one PISA questionnaire, which seeks to gauge “a sense of students’ responsibility for sustainable development” by asking test-takers whether they agree with certain statements, such as supporting “having laws that protect the habitats of endangered species.”
Responding requires a “political judgment,” Mr. Loveless writes. The questions are also vague, he argues, making it difficult even for the scientifically literate to know how to answer.
“It is difficult to see how declaring support or opposition to a policy without knowing the details” is related to responsible citizenship, Mr. Loveless adds.
Andreas Schleicher, the head of education indicators for the Organization for Economic Cooperation and Development, the Paris-based group that oversees the test, called the report “disingenuous” and misleading on some points.
‘A First Reading’
He noted that the student questionnaire is not in any way connected to the main, publicly reported PISA scores for science and math, which are most commonly cited in the news media and by policymakers. It is clear, he said, that the test scores and the questionnaire give policymakers two different sets of information. Results from the questionnaire are put in separate indices in PISA reports, he noted.
“These questions explore significant science-related contemporary issues,” Mr. Schleicher said in an e-mail, and give policymakers “a first reading” of students’ attitudes about science, even if the phrases are not perfect.
Mr. Loveless also casts doubt on whether PISA’s practice of measuring skills that students pick up both in and out of school makes it useful for state policymakers who want to improve their K-12 systems. By contrast, another international exam, the Trends in International Mathematics and Science Study, or TIMSS, and the U.S.-based National Assessment of Educational Progress, or NAEP, focus primarily on skills learned in school.
In addition, he said, the OECD takes policy positions, something it should not do while also collecting and interpreting score data, because doing so creates the potential for conflicts of interest.
Mr. Schleicher said PISA emphasizes students’ ability to apply knowledge in an out-of-school context, but that doesn’t mean students necessarily learned those skills outside the classroom.
One central PISA goal is to assess students’ “capacities to extrapolate from what they know and transfer and apply their knowledge and skills to novel settings,” Mr. Schleicher said, which, he added, is a prized skill in science.
Last September, the National Governors Association, Achieve, and the Council of Chief State School Officers announced plans to create an advisory group to produce a “road map” for benchmarking U.S. school performance against that of top-performing nations.
Mr. Loveless writes that the NGA would “like states to use PISA” in that process. But Dane Linn, the director of the NGA’s education division, disputed that, saying the organizations are not committed to any particular approach but are considering a range of rigorous international exams.
“It behooves us to not exclude PISA in examining how other countries measure performance,” Mr. Linn said. Different elements of PISA, TIMSS, and other international tests are likely to appeal to state policymakers. Debates about which kind of test material is preferable, in-school “content” as opposed to the “application of knowledge,” miss the point, Mr. Linn added. “It’s both.”