
New Uses Explored for ‘Value Added’ Data

By Debra Viadero — May 28, 2008 6 min read

With “value added” methods of measuring student-learning gains continuing to grow in popularity, policymakers and researchers met here last week to explore possible new ways of using the sometimes controversial approaches and to debate their pluses and pitfalls.

The May 23 conference at the Urban Institute, a think tank based here in the nation’s capital, examined the policy implications of value-added statistical designs, which typically measure students’ learning gains from one year to the next. Such methods have been spreading since the early 1990s.

While value-added designs are still imperfect technically, various speakers at the gathering said, they can provide new information to help identify ineffective teaching and the impact of certain programs and practices, for example. The data they provide can help educators reflect on their own practices, give administrators grounds for denying tenure to poorly performing teachers, or be used by states to calculate whether districts are making adequate yearly progress under the federal No Child Left Behind Act.

And value-added models can answer such important research questions as what makes a good teacher and whether problems in retaining early-career teachers actually harm or help schools, speakers said.

Yet when it comes to high-stakes decisions, supporters, critics, and scholars of value-added research models seemed to agree on one point: Value-added calculations, if they’re used at all, should be one among several measures used in judging the quality of schools or teachers.

“Assessment results are one critically important measure,” said Ross Wiener, the vice president for programs and policy at the Education Trust, a Washington-based research and advocacy group that focuses on educational inequities. “There are other things that teachers do that are important.”

Last week’s Urban Institute event piggybacked on an April conference at the University of Wisconsin-Madison, where researchers aired technical cautions about value-added research methodology and shared some other research supporting its usefulness. (“Scrutiny Heightens for ‘Value Added’ Research Methods,” May 7, 2008.)

An organizer of the Wisconsin meeting said at the Washington event that the limitations of value-added designs should be kept in perspective. Both the Washington conference and the Wisconsin gathering that preceded it were sponsored jointly by the Carnegie Corporation of New York, the Joyce Foundation, and the Spencer Foundation. (All three philanthropies underwrite coverage in Education Week.)

“I ask you not to lose sight of what I think is the main message,” said Adam Gamoran, the director of the Madison-based Wisconsin Center for Education Research, “which is that value-added models are better than the alternatives.”

Measuring Change

When it comes to accountability efforts, the alternatives for most education systems are techniques that rely on snapshots of student achievement at a single time, such as percentages of students who meet state academic targets.

The theoretical appeal of value-added accountability systems, which measure learning gains from one year to the next, is that educators would get credit only for the progress students made in their classrooms and not get penalized for the learning deficiencies that students brought with them to school.
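To make the contrast concrete, here is a toy sketch of the gain-score idea: instead of grading a teacher on students' raw scores (the "snapshot"), credit only the amount by which each student's second-year score beats a simple prediction based on the first-year score. All names and numbers below are hypothetical, and real value-added models, such as those debated at the conference, use far more elaborate statistical controls.

```python
# Illustrative sketch only: a minimal residual-gain "value added" estimate.
# Real systems adjust for many more factors (demographics, peer effects,
# measurement error); this shows just the core intuition.

from statistics import mean

def value_added(students):
    """Estimate each teacher's value-added as the average amount by which
    their students' year-2 scores exceed an ordinary-least-squares
    prediction fit from year-1 scores across all students."""
    xs = [s["year1"] for s in students]
    ys = [s["year2"] for s in students]
    x_bar, y_bar = mean(xs), mean(ys)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar

    residuals = {}  # teacher -> list of (actual - predicted) gains
    for s in students:
        predicted = intercept + slope * s["year1"]
        residuals.setdefault(s["teacher"], []).append(s["year2"] - predicted)
    return {teacher: mean(r) for teacher, r in residuals.items()}

# Hypothetical roster: teacher B's students start lower but gain more,
# so B gets the higher value-added score despite lower raw scores.
roster = [
    {"teacher": "A", "year1": 80, "year2": 82},
    {"teacher": "A", "year1": 90, "year2": 91},
    {"teacher": "B", "year1": 60, "year2": 70},
    {"teacher": "B", "year1": 70, "year2": 78},
]
scores = value_added(roster)
```

On this hypothetical data, teacher B outranks teacher A even though A's students post higher raw scores, which is exactly the point of gain-based measures: a snapshot comparison would reward A for the advantages A's students brought with them.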

In practice, though, various value-added models are proving controversial. A case in point is the New York City school system’s efforts to use such techniques to rate schools and evaluate teachers’ job performance, noted Leo E. Casey, the vice president for academic high schools for the 200,000-member United Federation of Teachers, the local teachers’ union.

At the conference, Mr. Casey faulted the school system’s teacher-evaluation project for relying on scores from tests taken by students in January, for failing to take into account the fact that students are not randomly assigned to classes, and for employing statistical calculations that he said are unintelligible to nonstatisticians.

“It’s really important that teachers, students, and parents believe the system on which they are being graded is a fair system,” he told conference-goers.

Opposition from his group, which is an affiliate of the American Federation of Teachers, and other teachers’ unions led New York state lawmakers in April to legislate a two-year moratorium on any efforts by districts to link student-performance data to teacher-tenure decisions. In the meantime, a state task force will be formed to study the issue.

Teacher Characteristics

Studies that try to identify which characteristics of teachers are linked to students’ learning gains are another, less controversial use of value-added methodology. Do veteran teachers do a better job, for example, than novices?

Studies examining such questions have shown that, while experience has proved to be important in some ways, possession of other credentials, such as a master’s degree, seems to have no impact on student performance, according to Douglas N. Harris, an assistant professor of educational policy studies at the Wisconsin research center.

Given the cost of a master’s degree—about $80,000, by his calculations—value-added methods might be a less expensive way to reward good teachers and signal which ones a school system ought to hire, Mr. Harris suggested.

“But we still need a path to improvement, and existing credentials might serve that function,” he said.

Value-added research models can also provide more information than experimental studies about the long-term effectiveness of particular programs or interventions in schools, said Anthony S. Bryk, a Stanford University scholar who is the incoming president of the Carnegie Foundation for the Advancement of Teaching, based in Stanford, Calif.

Mr. Bryk is currently using the statistical technique to track the progress of a professional-development program known as the Literacy Collaborative in 750 schools. He said that, while randomized studies are considered the gold standard for research on effectiveness, they can’t provide information about the different contexts in which a particular program works, the range of effect sizes that are possible, or whether the improvements change over time.

“You can only get so far by weighing and measuring,” he said. “What I’m arguing for is the use of value-added models toward building a science of improvement.”

From Data to Decisions

Whether schools will know how to make use of data collected through value-added statistical techniques is an open question, however.

Daniel F. McCaffrey, a senior statistician in the Pittsburgh office of the Santa Monica, Calif.-based RAND Corp., studied 32 Pennsylvania school districts taking part in the first wave of a state pilot program aimed at providing districts with value-added student-achievement data in mathematics.

He and his research colleagues surveyed principals, other administrators, teachers, and parents in the districts involved in the program and compared their responses with those from other districts having similar demographic characteristics.

“We found it was really having no effect relative to the comparison districts,” Mr. McCaffrey said.

Even though educators, for instance, seemed to like the data they were getting and viewed the information as useful, few were doing anything with the results, he said. Twenty percent of the principals didn’t know they were participating in the study, Mr. McCaffrey said, noting also that the program was still young at that point in the evaluation process.

Despite such challenges, other speakers at the conference argued that the use of value-added methodology should become more widespread. Said Robert Gordon, a senior fellow at the Center for American Progress, a Washington think tank: “The way we will learn about implementation problems, I think, is to implement.”
