Students take tests constantly throughout their K-12 careers. So do they also need a separate exam to gauge their reading and writing skills for work? It's a question that is still far from having a clear answer.
By far the most popular workforce-readiness tests for schools are ACT Inc.'s WorkKeys. The Iowa City, Iowa-based organization offers a host of exams in the WorkKeys suite, but the most commonly administered ones are Workplace Documents, which measures reading in a workplace context; Graphic Literacy, which focuses on finding and interpreting information from charts, tables, and graphics; and Applied Math.
Alabama, Michigan, South Carolina, and Wisconsin have all their high school students take WorkKeys tests, and districts in eight other states have the option to administer them. Students who score 3 or higher on each exam, out of a maximum of 7, can also earn a credential called the National Career Readiness Certificate.
First developed in 1992, the WorkKeys tests never get as specific as welding techniques or coding in Java. Instead, they're based on shared skills that cut across fields.
ACT has aggressively marketed its services to businesses, as well as to school leaders. It offers job-profiling services and maintains a database of some 22,000 job profiles in which employers and others can look up estimated WorkKeys scores corresponding to the skills needed for those jobs. In all, there were 1.7 million WorkKeys test-takers in 2017, ACT said.
Most of the research on WorkKeys has been conducted to confirm that the test is valid—in other words, that it measures what it’s designed to measure. Fewer external studies show the connection between students’ scores and their future success in specific jobs.
Evidence does indicate, however, that the test content differs significantly from traditional reading and math tests.
Studies commissioned earlier this decade by the federal board that administers the National Assessment of Educational Progress concluded that WorkKeys measured a far narrower slice of content than NAEP reading or math, though a few topics overlapped.
The cognitive focus of the exams also differed. WorkKeys' documents exam, for example, emphasized locating and recalling information, whereas NAEP emphasized students' ability to critique or evaluate what they'd read. And test questions sometimes aligned better with NAEP's 8th grade frameworks, and at other times with its 12th grade frameworks.
ACT debuted updated tests in 2017 to reflect new skills it identified in its job profiles, said Tom Langenfeld, the principal assessment designer for WorkKeys. The changes include a stronger focus on digital reading (as in an email chain), on spreadsheets in the math exam, and especially on identifying useful, accurate charts on the graphic-literacy exam.
Too Restrictive?
Some testing experts worry that WorkKeys may be too limited to gauge the specific skills in demand in today's workforce as jobs grow more specialized.
“The construct of WorkKeys has been restricted to reading information, locating information, and relatively simple math, and then further restricted beyond that because there are only certain things you can do with multiple-choice tests,” said Joseph Martineau, a senior associate at the Center for Assessment, a testing-consulting group. “I think they could update their studies and they could take advantage of technology-enhanced test items where they’re having people actually do things, like type out data into a table and organize it.”
WorkKeys has stayed with paper-and-pencil tests partly at employers’ requests, Langenfeld noted.
And some employers that rely on the exams say they can spot key differences in employees' skills based on their scores.
RoyOMartin, an Alexandria, La., wood-products company that has been in business since 1923, requires new hires in production to score a 3 on each of the exams making up the career-readiness certificate. Donna Bailey, the company's vice president for human resources, said she has been exploring raising that bar to a 4, or possibly making it a threshold for advancing to different skills within the plant.
“I think a score of 4 shows that a person is more apt to read for meaning, whereas a 3 shows they just have basic reading skills,” she said. “And there is a difference with those folks [with 4s]. They are more independent; they can critically think. When they read something, they are reading to solve a problem.”
More to the point, she said, is that as manufacturing becomes less manual and more technical, employers need bigger doses of ongoing training. And while employers bear some of that responsibility, workers must have the independent reading and research abilities to refresh their skills as needed.
“Adult learners have to be self-learners,” she said. “Sometimes, they are going to have to find the information themselves.”