Assessment

Districts Push Back Over Cheating Probe

By Christina A. Samuels — April 03, 2012

A newspaper investigation that turned up unusual test-score fluctuations in about 200 school districts in a nationwide sample of 14,700 has revived a debate about cheating on standardized tests—and prompted immediate pushback from some of the districts flagged by the analysis. They contend that the newspaper’s methodology was flawed.

The Atlanta Journal-Constitution article looked at test scores in about 69,000 schools around the country. The reporters requested average reading and mathematics results for state exams given in grades 3-8 from 50 states and the District of Columbia, as well as the count of students tested for each school, grade, and subject in those jurisdictions.

The newspaper did not have access to student-level data. Instead, it created “classes” of all the test-takers in a given grade in each school—for example, comparing all the 3rd grade test-takers to all the 4th grade test-takers in the same school the next year. If the 4th graders in a “class” scored markedly better or worse on state standardized tests than they had the previous year, that school was flagged. In some schools, the scores varied so widely that it was nearly impossible to attribute the variation to chance, the article said.
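
The paper did not publish its statistical code, but the cohort comparison it describes can be sketched in a few lines. In this hypothetical Python illustration, the data layout, the flag_cohorts name, and the three-standard-deviation cutoff are all assumptions, not the Journal-Constitution’s actual method:

    # Hypothetical sketch of the cohort-comparison idea the article describes;
    # the real analysis was more sophisticated, and every name and threshold
    # here is an assumption.
    def flag_cohorts(scores, threshold=3.0):
        """Flag cohorts whose year-over-year score change looks improbable.

        `scores` maps (school, grade, year) -> mean scale score. A "class"
        pairs grade g in year y with grade g + 1 in year y + 1.
        """
        changes = []
        for (school, grade, year), mean in scores.items():
            following = scores.get((school, grade + 1, year + 1))
            if following is not None:
                changes.append((school, grade, year, following - mean))
        if not changes:
            return []
        # Standardize the changes and keep only the extreme ones.
        deltas = [c[-1] for c in changes]
        mu = sum(deltas) / len(deltas)
        sd = (sum((d - mu) ** 2 for d in deltas) / len(deltas)) ** 0.5
        return [c for c in changes if sd and abs((c[-1] - mu) / sd) > threshold]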

In the March 25 article, the reporters said that the fluctuations did not prove there was cheating in those schools, a point reiterated by Kevin Riley, the editor of the Journal-Constitution, in an interview with Education Week.

“What we’ve really done is something that points out suspicious scores and says, ‘This warrants further investigation,’ ” Mr. Riley said. He also noted that the “vast majority” of educators are working in districts where no suspicious variations were found.

The story did note that similar test-score fluctuations were seen in the Atlanta district, which was the center of a recent high-profile cheating scandal. The Journal-Constitution’s extensive investigation into its 50,000-student hometown district eventually prompted a state probe, which found evidence of adult-led cheating on the 2009 Georgia state test at 44 of the 56 schools examined. (“Report Details ‘Culture of Cheating’ in Atlanta Schools,” July 13, 2011.)

The responses from many districts and education groups, some of which were released preemptively a few days before the article appeared, indicated that they saw themselves as being accused of cheating based on methodology they considered severely flawed.

The 78,000-student Nashville, Tenn., district said the schools flagged there were campuses with high rates of student mobility, making it hard to measure one cohort of students against another. The district also said that scores for students in special education taking modified assessments, which are scored on a 200-to-400 scale, were averaged in with the scores of students in regular education classes, which are scored on a 600-to-900 scale. One Nashville middle school was cited in the original article as having an unusual fluctuation, but the district said that, because of those statistical problems, the score the newspaper calculated was 48 scale-score points lower than the school’s true score. A reference to that school was later taken out of the online version of the story.
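
The distortion Nashville describes is simple arithmetic: averaging scores from a 200-to-400 scale in with scores from a 600-to-900 scale pulls the combined mean well below the regular-education mean. A minimal illustration with made-up numbers, not the district’s actual data:

    # Made-up numbers illustrating Nashville's objection; the counts and
    # scores are hypothetical, not the district's actual data.
    regular_ed = [750] * 90   # 90 students on the 600-to-900 scale
    modified = [300] * 10     # 10 students on the 200-to-400 scale

    combined_mean = sum(regular_ed + modified) / 100
    print(combined_mean)      # 705.0, which is 45 points below the 750 regular-ed mean
    # A shift in how many modified assessments get averaged in from one year
    # to the next can therefore look like a large score swing on its own.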

More Analysis Needed?

The newspaper analysis also flagged some schools in the Houston Independent School District. The 203,000-student district responded to the story by noting that it has had confirmed cases of cheating. But the district also said it takes a vigorous stance against testing impropriety.

Houston school officials also took exception to the Atlanta newspaper’s methods, saying that, although test-score variations can be a “useful statistical tool,” such analyses tend to flag schools with large changes in their student enrollments, or schools that serve special populations. For example, an alternative school with short-term placements was flagged, as were two “overflow” schools that serve as crowding-relief campuses.

Mr. Riley, the newspaper’s editor, said that the paper is taking the districts’ objections seriously and that it plans follow-up stories. However, he also noted that the Atlanta district had similarly hammered the paper’s investigative methods during its earlier probe, and those methods were eventually proved correct. “What we need now is courageous people who will dig into this without fear,” he said.

Jaxk H. Reeves, an associate professor at the University of Georgia, in Athens, and the director of its Statistical Consulting Center, worked with the newspaper on its analysis. In an interview, he said student mobility, the objection districts have coalesced around to discount the newspaper’s results, is not as important as they suggest. That’s because even though schools may not serve the same students from year to year, they tend to serve the same types of students in terms of demographics and achievement. The newspaper also made some adjustments for mobility, for example, excluding “classes” where student numbers varied by more than 25 percent from one year to the next.
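
That enrollment-stability screen is also easy to express concretely. In this sketch, the counts mapping and the stable_enough name are assumptions; only the 25 percent cutoff comes from the article:

    # Sketch of the 25 percent enrollment-change exclusion the article
    # describes; the data layout is assumed, not taken from the paper.
    def stable_enough(counts, school, grade, year, max_change=0.25):
        """Keep a cohort pair only if its tested-student counts are comparable."""
        before = counts.get((school, grade, year))
        after = counts.get((school, grade + 1, year + 1))
        if not before or not after:
            return False
        return abs(after - before) / before <= max_change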

If mobility were the sole reason for the variation, Mr. Reeves said, then more schools in urban districts, where mobility is often high, should have been flagged.

“I do believe if a district is being flagged a lot, they should look at individual schools,” Mr. Reeves said.

Gary J. Miron, a professor of evaluation, measurement, and research at Western Michigan University, in Kalamazoo, has emerged as a critic of the newspaper’s work. A week before the report’s release, he evaluated some of the data for Ohio districts for a separate story published in the Dayton Daily News, which is owned by the same newspaper group that owns the Journal-Constitution. He said he identified weaknesses in the research, but was told that the paper would be moving forward with publication.

Mr. Miron said the reporters took an “adequate” first step toward identifying irregularities, but only a first step. What is also needed, he said, is student-level data and then erasure-analysis data from the testing companies.

“Throughout the reporting, they imply cheating,” Mr. Miron said, noting that there are a number of explanations beyond cheating that could account for the fluctuations. “They should finish their analysis.”

A version of this article appeared in the April 04, 2012 edition of Education Week as “Test-Cheating Probe Spawns Questions Over Its Methodology.”
