Challenging popular notions about the compatibility of athletics and academics, a new study finds that sports “powerhouse” schools do not sacrifice classroom achievement.
The annual Brookings Institution report says that some schools with top-ranked athletic teams actually performed slightly better on state exams than schools with less successful sports programs.
The wide-ranging report, “How Well Are American Students Learning?,” was released last week by the Washington think tank’s Brown Center on Education Policy. In other findings, it says that charter schools’ students score significantly below those in regular public schools on achievement tests, and it faults American students’ computational skills.
Tom Loveless, the director of the center and the author of the report, identified 163 athletic “powerhouse” high schools for that portion of the report. He used national and regional rankings from the newspaper USA Today to find the top high schools in baseball and basketball since 1997-98 and in football since 1991.
Analyzing how students at those schools performed on state tests in reading and mathematics compared with students at other schools in their states with similar racial and socioeconomic backgrounds, he found that powerhouse schools performed slightly better than schools without high-profile sports programs.
“Winning at basketball can go hand in hand with winning at mathematics,” Mr. Loveless writes. “However, high schools with advantaged socioeconomic circumstances are better able than other schools to integrate excellence at sports into an ethos of achievement that pervades school culture.”
While powerhouse schools in urban and rural areas do not drop off academically because of their athletic success, the research found, public schools with both standout athletic teams and high academic achievement were more likely to be in wealthy suburban neighborhoods with predominantly white, non-Hispanic populations. In three-quarters of the states, those suburban schools scored higher on academic tests than nonpowerhouse suburban schools with similar demographics, the study found.
Charter Achievement
In its charter school findings, the report says that students in those largely independent public schools “perform significantly below regular public schools” on state achievement tests.
For the analysis, Mr. Loveless collected test-score data from 10 states and combined reading and math achievement from 1999 through 2001 into a composite score. Scores were adjusted for socioeconomic status and racial composition. The states selected had at least 30 charter schools open in 1999, tested students in grades 4, 8, and 10, and used the same achievement tests in 1999, 2000, and 2001.
By that measure, charter schools scored in the 41st percentile, meaning that 59 percent of traditional public schools performed better on the achievement exams.
Charter schools’ scores are influenced by the fact that parents select those schools for their children, the report notes. If charter school students or their families differ fundamentally from those attending traditional public schools, those differences, rather than the quality of the schools, may account for the differences in test scores.
Jeanne Allen, the president of the Center for Education Reform, a Washington-based research and advocacy group that supports charter schools, said in a statement that the report was inconclusive and ignored other research tracking students’ performance in charter schools over years.
Those other findings, she said, show that charter schools attract low-performing students, and that the schools produce achievement gains for students who have been in the charter schools for at least two years.
Math Skills
The report also criticizes the performance of 17-year-olds on arithmetic problems on the National Assessment of Educational Progress, saying those scores have dropped considerably since 1990.
That year, it says, 76 percent of 17-year-olds correctly answered basic-arithmetic problems on NAEP, compared with 71 percent in 1999, the latest year for which data were available. Since 1990, scores for 9-year-olds and 13-year-olds have remained flat in arithmetic, the report says.
Mr. Loveless argues that the problem dates back to 1989, when the National Council of Teachers of Mathematics reported that “shopkeeper arithmetic” was dominating classroom teaching. That view, Mr. Loveless says, was embraced by the National Assessment Governing Board, the federal panel that oversees the NAEP exams, and federal officials decided not to include arithmetic as a separate reporting category.
“So we arrived where we are today: a federally endorsed state of ignorance on the computation skills of American students,” Mr. Loveless contends in the report. He recommends that NAEP report arithmetic scores as a separate category for the 4th and 8th grades, and he calls for a national campaign that emphasizes arithmetic and computation skills.
But Roy Truby, the executive director of the NAEP governing board, called the report “seriously flawed.” In a statement, Mr. Truby said that Mr. Loveless had omitted questions on percentages which, if included, would have shown 17-year-olds’ arithmetic performance unchanged during the 1990s and 13-year-olds’ performance improving over the same period.
“Computation is used in NAEP,” Mr. Truby said. “There has been no effort to hide or downgrade it. To imply that NAEP is part of a conspiracy to eliminate arithmetic from mathematics is silly, if not bizarre.”
Also, late last week, officials of the National Center for Education Statistics, which oversees NAEP, said that the data Mr. Loveless had gathered from the NCES Web page lacked details about whether students had used calculators to answer the questions. Because of that, they said, Mr. Loveless assumed students were not using calculators, when in fact they were using them on about half the questions he studied.
“We took [the questions] off the Web, and we will go back and indicate which ones involved a calculator and which ones did not,” Gary Phillips, the acting commissioner of the NCES, said. Mr. Phillips said he did not know whether the Brown Center report’s findings on computation would be affected.
Mr. Loveless declined to comment on the issue.