Assessment Governing Board Crafts Definition of ‘Prepared for College’

By Sarah D. Sparks — August 20, 2013

The governing board for the tests known as “the nation’s report card” has crafted its own definition of what makes a student academically prepared for college.

At a meeting here this month, members of the National Assessment Governing Board, which supervises the National Assessment of Educational Progress, voted 17-2 to adopt language that will define the new “college prepared” scores in reading and mathematics on the assessment.

The language will be used for reporting in the next 12th grade NAEP, whose results will be announced in spring 2014. Those results will include a nationally representative sample of seniors as well as state-level results for 13 states that volunteered to give the test to more students.

The final definition is fairly limited, with members voting to say that the percentages of students performing at or above 163 out of 300 in mathematics and 302 out of 500 in reading on the 12th grade assessments would be “plausible estimate[s] of the percentage of students who possess the knowledge, skills, and abilities [in those subjects] to make them academically prepared for college.”

The two members who voted against the measure Aug. 3 were Andrés A. Alonso and Rebecca Gagnon.

Mr. Alonso, a professor of practice at the Harvard Graduate School of Education and a former chief executive of the Baltimore public schools, argued that the research was not strong enough to set particular cutoff scores for college preparation. By contrast, Ms. Gagnon, a director of the Minneapolis board of education, argued that NAGB should take a firmer stance, saying the cutoff scores should be called “reasonable” estimates rather than merely “plausible” ones.

The new definitions are based on more than 30 studies, including several comparing the content and predictive value of the federally sponsored NAEP with those of college-placement assessments, such as the SAT, the ACT, and Accuplacer, as well as longitudinal studies in Florida of how students who performed at different levels on NAEP later fared in freshman-level college courses.

Researchers used the federal High School Transcript Study, a 2009 study linking NAEP and SAT scores, and a longitudinal study of Florida students to compare performance on NAEP in reading and math with the scores considered “college-readiness benchmarks” on the ACT and the SAT in 2005 and 2009.

‘Aspirational’ Level

In both subjects, the researchers found students who met the “proficient” achievement level on NAEP—176 out of 300 in math and 302 out of 500 in reading—also scored at or above the college-readiness benchmark scores on the SAT and the ACT. In 2009, 38 percent of 12th graders scored at or above proficient in reading; only 26 percent reached proficiency in math.

[Chart: Ready for Higher Education? NAEP achievement levels are being aligned with college-preparedness benchmarks. Seniors who score “proficient” in reading will be considered college-ready; the college-ready benchmark for math falls between “basic” and proficient. Source: National Center for Education Statistics]

Chester E. Finn Jr., who was the chairman of NAGB when the NAEP achievement levels were first approved, said at a symposium this summer in Washington that the “proficient” level was always intended to be “aspirational,” while “ ‘basic’ was supposed to show you were literate and could make your way through the subway system.”

“Now, 23 years later, when college and career readiness is on everyone’s lips, ... lo and behold, the pretty-clear conclusion reached is NAEP ‘proficient’ comes pretty darn close to college preparedness,” said Mr. Finn, the president of the Thomas B. Fordham Institute, a Washington-based research group.

To get a more nuanced look at how students of different performance levels fared in college, the researchers tracked students by using Florida’s K-20 student longitudinal system.

Based on the Florida data, students who scored at least 298 out of 500 in reading or 162 out of 300 in math—just under the “proficient” level in reading and within the “basic” range in math—also met the ACT or SAT college-placement benchmarks, earned a first-year college grade point average of at least 2.67, and placed into nonremedial math and literature courses.

In a parallel effort to set career-readiness benchmarks within NAEP, the governing board also studied ways to connect NAEP to readiness for work as an automotive master mechanic, computer-support specialist, heating and air-conditioning technician, licensed practical nurse, or pharmacy technician, but it was not able to draw conclusions about how performance on NAEP relates to readiness for those careers.

For example, among NAEP’s math-framework objectives, 64 percent to 74 percent were “not evident as prerequisite” in any of the training required for the careers studied, a finding Cornelia Orr, the board’s executive director, called “quite shocking.”

“This was a very hard task, but it was very revealing,” Ms. Orr said. “We found no evidence that someone prepared for job training is academically prepared for college. That said, someone prepared for college is certainly prepared for job training.”

Writing Questioned

The governing board plans to conduct more linking studies between NAEP and the SAT and the ACT; longitudinal studies in Florida, Illinois, Massachusetts, Michigan, and Texas; and a linking study in Kentucky and Tennessee with ACT’s Explore, an assessment for 8th graders.

However, Achieve, a Washington-based college-readiness advocacy group, wrote in a July 30 letter to NAGB that the board’s college-preparedness benchmarks don’t gauge how well students are prepared for college-level writing, and that the writing questions NAEP has released “do not come close to assessing [the] skill set” involved in writing based on multiple sources.

NAGB has not officially responded to the letter, but efforts are also underway to develop more-detailed descriptions of the skills and the NAEP questions that the “college-prepared” cutoff scores represent. “The one thing I’ve been concerned about from the very beginning of this research is its applicability to real life,” said board member W. James Popham, a professor emeritus of education and information studies at the University of California, Los Angeles. “Real examples for real people would be useful.”

A version of this article appeared in the August 21, 2013 edition of Education Week as Assessment Governing Board Crafts Definition of ‘Prepared for College’
