Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

Why Are Advanced Placement Scores Suddenly So High?

The new passing rates reflect a more accurate picture of student mastery, argues the College Board
By Rick Hess — October 08, 2024

In the past few months, a lot of attention has been paid to the spike in some Advanced Placement exam results. EdWeek’s Ileana Najarro reported in August that these increased passing rates have “led some educators and researchers to question whether AP exams have become easier” or if the College Board (which runs the AP) is boosting passing rates “to compete against dual-credit programs.” As one would expect, the College Board says that’s not the case. But I was interested to hear more about what the College Board says is going on, especially given the massive expansion of AP—from 1 million tests taken in 2003 to nearly 3 million in 2024. I reached out to Trevor Packer, senior vice president for the AP program, who has overseen it since 2003. Here’s what he had to say.

—Rick

Rick: Trevor, there’s been much concern this summer about sudden jumps in the scores on some AP exams. Can you explain what’s going on?

Trevor: Sure, and thanks for your interest, Rick. Historically, most of the 40 AP subjects have had exam pass rates of 60 to 80 percent. However, in nine subjects—primarily those in the humanities—pass rates have fluctuated between 40 to 50 percent and 70 to 80 percent over time. New technologies for rapid data collection and analysis have recently enabled researchers to derive AP pass rates from evidence of student performance rather than rely on the more subjective process used in the past. As a result, the pass rates in these nine AP subjects—world history, English literature, biology, macroeconomics, microeconomics, chemistry, U.S. government, U.S. history, and European history—are now consistent with the traditional pass rates in the other AP subjects. We brought about these changes in part because, over the years, we’ve polled hundreds of college faculty and admissions officers, and they have consistently expressed a conviction that AP students—typically the most academically strong of American high schoolers—should be able to achieve 60 to 80 percent pass rates on AP exams.

Rick: You suggest that the pass rate should be 60 to 80 percent. Could it be that most of the tests should be graded a little more rigorously? How do you approach this tension, especially given that students are taking a lot more AP exams today?

Trevor: Yes, it’s been exciting to see the expansion of access to AP. Still, despite such growth, the most common AP courses are taken by only 10 to 15 percent of high schoolers, students who have much stronger academic backgrounds than the majority of those who enter college. So, most educators do not find it surprising that the top 10 to 15 percent of high school students would be able to achieve a 60 to 80 percent pass rate on the AP exams. Regarding the rigor of AP testing: The hundreds of college faculty who reviewed this year’s AP exams consistently reported that the AP exam questions were significantly more challenging than the exam questions their own first-year students experience in their college classes.

Rick: One thing that’s causing a lot of buzz is the adjustment to AP U.S. History scores. The pass rate for that exam is now 72 percent—up from 48 percent a couple years ago. That seems like a really big jump. Can you talk me through what’s going on there?

Trevor: As I mentioned, the approximately 15 percent of high school students who take AP U.S. History are academically much stronger, on average, than the overall college-going population. To illustrate this: The average combined PSAT score of AP students who have been receiving a “failing” AP score of 2 is 1084; in comparison, the average PSAT score of college students who have been receiving an A in colleges’ own U.S. History survey courses is 1058. In other words, students receiving a “failing” AP exam score of 2 actually have higher PSAT scores than college students who have been receiving college A’s. We also know that AP U.S. History students typically have 145 hours in class to learn the material, whereas college students typically have 50 hours. This means that AP students have much more time to learn the content, review it, and develop and practice the skills measured by the exam, including generating thesis statements and summoning historical evidence across a wide range of short and long essay questions. These proficiencies have traditionally been cited as the benchmark for passing an AP History exam, and by that benchmark, 72 percent of students deserve a passing score. But it’s important to note that even though the pass rate for the AP U.S. History exam increased to match the stable pass rates in other AP subjects, it remains lower than college grades were 30 years ago, let alone in 2024, a year in which 84 percent of college students are receiving grades of C or higher.

Rick: OK, I love that you’re getting in the weeds here. But those of us who aren’t as familiar with these results may have trouble tracking all the numbers. Given that the PSAT tests reading and math rather than U.S. History, can you help me understand what we should make of that comparison?

Trevor: While I’d never claim that the PSAT measures the same content as the AP U.S. History exam, the reading and data-analysis skills on the PSAT are a well-established predictor of readiness for college coursework and have proved to be strongly correlated with AP U.S. History exam performance. By comparison, the correlation between a student’s prior grades in high school history classes and their performance on the AP U.S. History exam is much lower.

Rick: You mentioned that the College Board used to rely on subjective determinations to tweak passing rates. How do the changes you’ve made compare with those that used to get made in that fashion?

Trevor: None of this year’s changes is nearly as large as the year-to-year changes that we saw when small panels of professors and teachers used their subjective judgments and estimations to recalibrate AP pass rates approximately every five years. For example, in 2013, a panel’s subjective recalibration reduced AP Biology scores by 74 percent. By contrast, the evidence-based standard-setting process we’re now using stabilizes AP pass rates at the level they’ve been for most of the past 70 years and maintains them consistently from year to year. Even after including the increases in these nine subjects, AP’s pass rates remain lower than they have been in many prior years.

Rick: How did the more subjective process actually work? And what questions did this process raise for you?

Trevor: Until recently, the pass rate was determined by small panels of 10 to 18 educators who reviewed each exam question and estimated what percentage of students they believed should pass the exam. In the nine subjects whose scoring we changed this year, the pass rates used to swing up and down every few years when panelists convened to confirm the pass rates. This fluctuation raised real questions about whether these subjects warranted lower pass rates than the other 31 subjects. Were these regular increases and decreases due to variability within the panels? If not, what was the rationale for why AP History or English students should have lower pass rates than students taking AP Psychology, Calculus, or Music Theory?

Rick: For those of us trying to understand how the College Board’s calibration is different from that kind of subjective scoring, what are some of the specific changes?

Trevor: In 2022, the AP program began running a data-collection and -analysis project to derive pass rates from a quantitative analysis of AP students’ academic abilities, content knowledge, quantitative skills, and literacy. Within the measurement profession, this is called evidence-based standard setting, or EBSS. Our EBSS analysis found that the pass rates in these nine subjects should be similar to the long-term pass rates in most other subjects: Sixty to 80 percent of AP students were able to craft thesis statements across a range of historical and literary topics, summon evidence consistently, and demonstrate the other foundational skills of these disciplines. We showed the assembled data to the panels, who found the analysis more convincing than their own estimations; accordingly, the panelists endorsed these findings as the pass rates for these nine subjects. We applied the same evidence-based methodology to a range of subjects that already had pass rates of 60 to 80 percent. That analysis did not raise any of those subjects’ pass rates but instead indicated that the stable rates should be preserved—another confirmation that AP subjects should generally have pass rates ranging from 60 to 80 percent rather than 40 to 50 percent.

Rick: How do you decide where to set the passing score for an exam? How do you make sure it’s not too hard or too easy?

Trevor: We do not have a predetermined pass rate—rather, the EBSS process rapidly generates a ranking of student performance across the 140 points possible on the exam, which then enables researchers and subject-matter experts to identify which students have consistently performed with competency across the range of short and long essay questions. In other words, large-scale data collection and analysis enable the passing rate to be established at specific proficiency levels, rather than relying on the estimates and opinions of panelists.
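To make that mechanism a bit more concrete, here is a minimal sketch of how a cut score might be derived from performance data rather than chosen in advance. Everything in it is hypothetical: the cohort, the proficiency threshold, and the function names are illustrative assumptions, not the College Board's actual EBSS implementation.

```python
"""Illustrative sketch only -- not College Board code or data.

Idea: rank students by total exam score (0-140 points), flag which students
consistently demonstrated the targeted proficiencies across the essay tasks,
and place the cut score where demonstrated proficiency becomes the norm.
The pass rate then falls out of the cut score rather than being chosen first.
"""
from dataclasses import dataclass
import random

random.seed(1)


@dataclass
class StudentResult:
    total_score: int       # 0-140 exam points (hypothetical scale)
    proficient_tasks: int  # essay tasks on which proficiency was shown
    total_tasks: int = 4   # hypothetical number of essay tasks

    def consistently_proficient(self) -> bool:
        # "Consistent" here is an assumed threshold: most tasks handled well.
        return self.proficient_tasks >= 3


# Fabricated cohort purely for illustration: stronger scorers tend to show
# proficiency on more tasks, with some noise.
cohort = [
    StudentResult(score, min(4, max(0, (score + random.randint(-15, 15)) // 30)))
    for score in (random.randint(20, 140) for _ in range(1000))
]


def derive_cut_score(students, required_share=0.7):
    """Lowest total score at which at least `required_share` of the students
    scoring at or above it are consistently proficient."""
    for cut in range(0, 141):
        at_or_above = [s for s in students if s.total_score >= cut]
        if not at_or_above:
            break
        share = sum(s.consistently_proficient() for s in at_or_above) / len(at_or_above)
        if share >= required_share:
            return cut
    return 141  # no defensible cut exists, so nobody passes


cut = derive_cut_score(cohort)
pass_rate = sum(s.total_score >= cut for s in cohort) / len(cohort)
print(f"derived cut score: {cut}/140, resulting pass rate: {pass_rate:.0%}")
```

In a sketch like this, the pass rate is an output of where demonstrated proficiency sits in the score distribution, which is the distinction Packer is drawing between the new process and the old panel estimates.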

Rick: How do you determine if a student “performed with competency”? And has that changed?

Trevor: The standards for “performing with competency” have not changed—they remain the same as they have been for many years. In AP U.S. History, for example, the standards focus on whether a student is able to elicit historical evidence from primary sources, analyze claims made in secondary sources, and craft effective historical arguments themselves. So the difference this year is not in the standards themselves but in the process used to identify which students were able to meet those enduring standards. The old process relied upon a panel’s estimation of what percentage of students they believed should be able to meet those standards, and, as I’ve mentioned, it swung up and down over time as different panels arrived at different subjective judgments. The new, evidence-based process tags each exam question with information about which skills a student who answers it correctly has demonstrated, so that, rather than relying upon panelists’ estimations, we can identify within each student’s own performance data whether they met the standards. The data clearly indicated that 72 percent of AP U.S. History students demonstrated competency in developing thesis statements and supporting them with evidence across a range of topics in the course.
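For readers trying to picture what that kind of item-level tagging looks like in practice, here is a tiny sketch. The question IDs, skill names, and threshold below are invented for illustration; they are not the AP program's actual rubric, items, or data.

```python
"""Illustrative sketch only: a hypothetical item-to-skill tagging scheme.

Each question is tagged with the skills it measures. A student's responses
are rolled up per skill, and the student counts as meeting the standard when
every targeted skill is demonstrated on a sufficient share of its tagged items.
"""
from collections import defaultdict

# Hypothetical tagging of five items to three skills.
ITEM_SKILLS = {
    "Q1": {"thesis"},
    "Q2": {"use_of_evidence"},
    "Q3": {"thesis", "use_of_evidence"},
    "Q4": {"source_analysis"},
    "Q5": {"source_analysis", "use_of_evidence"},
}

REQUIRED_SKILLS = {"thesis", "use_of_evidence", "source_analysis"}
SKILL_THRESHOLD = 0.5  # assumed: show each skill on at least half of its tagged items


def meets_standard(responses: dict[str, bool]) -> bool:
    """`responses` maps item id -> whether the student earned credit on it."""
    earned = defaultdict(int)
    possible = defaultdict(int)
    for item, skills in ITEM_SKILLS.items():
        for skill in skills:
            possible[skill] += 1
            if responses.get(item, False):
                earned[skill] += 1
    return all(
        possible[skill] > 0 and earned[skill] / possible[skill] >= SKILL_THRESHOLD
        for skill in REQUIRED_SKILLS
    )


# A hypothetical student: strong thesis and evidence work, weak source analysis.
student = {"Q1": True, "Q2": True, "Q3": True, "Q4": False, "Q5": False}
print(meets_standard(student))  # False: source_analysis shown on 0 of 2 tagged items
```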

Rick: Some skeptics have suggested that the real impetus here was the fear of losing paying customers to the growing ranks of dual-enrollment programs. How do you respond to such concerns?

Trevor: As we have done for decades, we raise or lower the AP scores using the most current methodologies and technologies possible. When we tweak the scores, our sole motive is to determine as accurately as possible whether a student has demonstrated the proficiency necessary for receiving a passing score. We have consistently lowered the pass rate when panels have reached such determinations, in direct opposition to the general leniency with which other programs give passing scores to most or all students. But when the available evidence indicates that a student has developed and demonstrated solid proficiency in a subject, we must award that student a passing score, regardless of how our motives may be misrepresented or mischaracterized.

Rick: Another concern that’s been raised is that this is really a matter of the College Board caving to the kind of grade inflation we see in college classes and on high school transcripts. What’s your take on this critique?

Trevor: Because the process now sets the pass rate at performance levels where students are demonstrating clear proficiency across a wide variety of short and long essays, it is not subject to the variability we saw from panel to panel in the prior model. In other words, the process we now use actually guards against the influence of college grade inflation because the AP process anchors the pass rate in the performance of specific proficiencies like crafting a thesis statement and summoning historical evidence to support an argument, among other skills. As part of this change, we checked the new pass rates for each of these nine subjects to see how they compare to college grading practices and found that AP’s pass rates remain more conservative than college grading practices were 30 years ago, let alone today.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
