
Districts Push Back Over Cheating Probe

By Christina A. Samuels — April 03, 2012

A newspaper investigation that turned up unusual test-score fluctuations in about 200 school districts in a nationwide sample of 14,700 has revived a debate about cheating on standardized tests—and prompted immediate pushback from some of the districts flagged by the analysis. They contend that the newspaper’s methodology was flawed.

The Atlanta Journal-Constitution article looked at test scores in about 69,000 schools around the country. The reporters requested average reading and mathematics results for state exams given in grades 3-8 from 50 states and the District of Columbia, as well as the count of students tested for each school, grade, and subject in those jurisdictions.

The newspaper did not have access to student-level data. Instead, it created “classes” of all the test-takers in a given grade in each school—for example, comparing all the 3rd grade test-takers with all the 4th grade test-takers in the same school the next year. If the 4th graders in a “class” performed markedly better or worse on state standardized tests than they had the previous year, that school was flagged. In some schools, the article said, the scores varied so widely that it was nearly impossible to attribute the variation to chance.
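The newspaper has not published the statistical code behind its analysis, but the cohort-comparison approach it describes can be illustrated with a minimal sketch: pair each grade’s average score with the same cohort’s average at the same school the following year, then flag year-over-year swings too large to plausibly attribute to chance. The table layout, column names, and the three-standard-deviation cutoff below are illustrative assumptions, not the newspaper’s actual parameters.

```python
import pandas as pd

# Hypothetical input: one row per school, grade, and year, with the average
# scale score and the number of students tested.
# Columns: school_id, grade, year, avg_score, n_tested

def flag_unusual_cohorts(df: pd.DataFrame, z_threshold: float = 3.0) -> pd.DataFrame:
    """Pair each grade's average with the same cohort's average the next year
    (e.g., 3rd graders in 2009 vs. 4th graders in 2010 at the same school)
    and flag gains or losses that are extreme outliers among their peers."""
    nxt = df.copy()
    nxt["grade"] -= 1   # align next year's 4th graders with this year's 3rd graders
    nxt["year"] -= 1
    paired = df.merge(nxt, on=["school_id", "grade", "year"],
                      suffixes=("_y1", "_y2"))
    paired["gain"] = paired["avg_score_y2"] - paired["avg_score_y1"]

    # Standardize each cohort's gain against all cohorts in the same grade and
    # year; a |z| above the threshold is "hard to attribute to chance."
    grp = paired.groupby(["grade", "year"])["gain"]
    paired["z"] = (paired["gain"] - grp.transform("mean")) / grp.transform("std")
    return paired[paired["z"].abs() > z_threshold]
```

A real screen of this kind would need further safeguards, such as the enrollment-change filter described later in the story, before any school was singled out.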

In the March 25 article, the reporters said that the fluctuations did not prove there was cheating in those schools, a point reiterated by Kevin Riley, the editor of the Journal-Constitution, in an interview with Education Week.

“What we’ve really done is something that points out suspicious scores and says, ‘This warrants further investigation,’ ” Mr. Riley said. He also noted that the “vast majority” of educators are working in districts where no suspicious variations were found.

The story did note that similar test-score fluctuations were seen in the Atlanta district, which was the center of a recent high-profile cheating scandal. The Journal-Constitution’s extensive investigation into its 50,000-student hometown district eventually prompted a state probe, which found evidence of adult-led cheating on the 2009 Georgia state test at 44 of the 56 schools examined. (“Report Details ‘Culture of Cheating’ in Atlanta Schools,” July 13, 2011.)

The responses from many districts and education groups, some of which were released preemptively a few days before the article appeared, indicated that they saw themselves as being accused of cheating based on methodology they considered severely flawed.

The 78,000-student Nashville, Tenn., district said the schools flagged there were campuses with high rates of student mobility, making it hard to measure one cohort of students against another. The district also said that scores for students in special education taking modified assessments, which are reported on a 200-to-400 scale, were averaged in with the scores of students in regular education classes, which are scored on a 600-to-900 scale. One Nashville middle school was included in the original article as having an unusual fluctuation, but the district said those statistical problems made the newspaper’s calculation 48 scale-score points lower than the school’s true score. A reference to that school was later removed from the online version of the story.

More Analysis Needed?

The newspaper analysis also flagged some schools in the Houston Independent School District. The 203,000-student district responded to the story by noting that it has had confirmed cases of cheating. But the district also said it takes a vigorous stance against testing impropriety.

Houston school officials also took exception to the Atlanta newspaper’s methods, saying that, although test-score variations can be a “useful statistical tool,” such analyses tend to flag schools with large changes in their student enrollments, or schools that serve special populations. For example, an alternative school with short-term placements was flagged, as were two “overflow” schools that serve as crowding-relief campuses.

Mr. Riley, the newspaper’s editor, said the paper is taking the districts’ objections seriously and plans follow-up stories. He also noted that the Atlanta district had similarly hammered the paper’s investigative methods during its earlier probe, and those methods were eventually proved correct. “What we need now is courageous people who will dig into this without fear,” he said.

Jaxk H. Reeves, an associate professor at the University of Georgia, in Athens, and the director of the Statistical Consulting Center there, worked with the newspaper on its analysis. In an interview, he said student mobility, the factor districts have coalesced around to discount the newspaper’s results, is not as important as they suggest: even though schools may not serve the same students from year to year, they tend to serve the same types of students, in terms of demographics and achievement. The newspaper also made some adjustments for mobility, for example, excluding “classes” where student numbers varied by more than 25 percent from one year to the next.
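The article does not spell out how that exclusion worked in practice; the sketch below shows one plausible reading of the 25 percent rule, reusing the hypothetical paired table from the earlier sketch. In an actual screen, a filter like this would run before the outlier test, so that high-turnover cohorts never enter the comparison.

```python
def apply_mobility_filter(paired: pd.DataFrame, max_change: float = 0.25) -> pd.DataFrame:
    """Drop paired cohorts whose count of tested students changed by more than
    `max_change` (here, 25 percent) from one year to the next, a rough guard
    against flagging schools simply because their populations turned over."""
    change = (paired["n_tested_y2"] - paired["n_tested_y1"]).abs() / paired["n_tested_y1"]
    return paired[change <= max_change]
```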

If mobility were the sole reason for the variation, Mr. Reeves said, then more schools in urban districts, where mobility is often high, should have been flagged.

“I do believe if a district is being flagged a lot, they should look at individual schools,” Mr. Reeves said.

Gary J. Miron, a professor of evaluation, measurement, and research at Western Michigan University, in Kalamazoo, has emerged as a critic of the newspaper’s work. A week before the report’s release, he evaluated some of the data for Ohio districts for a separate story published in the Dayton Daily News, which is owned by the same newspaper group that owns the Journal-Constitution. He said he identified weaknesses in the research, but was told that the paper would be moving forward with publication.

Mr. Miron said the reporters made an “adequate” first step at identifying irregularities, but only a first step: what is also needed, he said, are student-level data and erasure-analysis data from the testing companies.

“Throughout the reporting, they imply cheating,” Mr. Miron said, noting that a number of explanations other than cheating could account for the fluctuations. “They should finish their analysis.”

A version of this article appeared in the April 04, 2012 edition of Education Week as Test-Cheating Probe Spawns Questions Over Its Methodology
