
Comparing Paper and Computer Testing: 7 Key Research Studies

By Benjamin Herold — February 23, 2016

Do the computer-based exams that are increasingly prevalent in K-12 education measure skills and knowledge as accurately as traditional paper-based tests?

With news that millions of students who took PARCC exams by computer tended to score worse than those who took the same exams with paper and pencil, it’s a technical question that is again getting heavy scrutiny.

Earlier this month, officials from the multistate Partnership for the Assessment of Readiness for College and Careers acknowledged to Education Week that there were discrepancies in scores across different formats of its exams.

Illinois, Rhode Island, and the Baltimore County, Md., schools are among the states and districts that have found such a pattern, with the advantage for paper-based test-takers appearing to be most pronounced in English/language arts and upper-grades math.

In Rhode Island, for example, officials found that 42.5 percent of the students who took the PARCC English/language arts exam on paper scored proficient, compared with 34 percent of those who took the test by computer. A spokesman for the state education department said the variability in scores there appears to be due in large measure to varying degrees of “student and system readiness for technology.”

Researchers and psychometricians have been wrestling with the dilemma of comparing paper- and computer-based test results for more than 20 years, said Derek Briggs, a professor of research and evaluation methodology at the University of Colorado at Boulder. He serves on the technical-advisory committees for both PARCC and the Smarter Balanced Assessment Consortium, the two main groups that have created tests aligned with the Common Core State Standards.

Briggs said computer- and paper-based versions of an exam shouldn’t necessarily be expected to measure the same abilities, or have comparable results. Part of the motivation for pouring hundreds of millions of federal dollars into the new consortia exams, after all, was to use technology to create better tests that elicit, for instance, more evidence of students’ critical-thinking skills and ability to model and solve problems.

But the reality is that in some states and districts, the technology infrastructure doesn’t exist to support administration of the computer-based exams. Not all children have the same access to technology at home and in school, nor do their teachers use technology in the classroom in the same ways, even when it is present.

And some students are much more familiar than others with basic elements of a typical computer-based exam’s digital interface—how to scroll through a window, how to use word-processing features such as copy and paste, and how to drag and drop items on a screen, for example. A mounting body of evidence suggests that some students tend to do worse on computer-based versions of an exam, for reasons that have more to do with their familiarity with technology than with their academic knowledge and skills.

To give a deeper look at the issues behind this “mode effect,” Education Week examined seven key research studies on the topic:

1. “Online Assessment and the Comparability of Score Meaning”

Educational Testing Service, 2003

“It should be a matter of indifference to the examinee whether the test is administered on computer or paper, or whether it is taken on a large-screen display or a small one,” wrote Randy Elliot Bennett more than a decade ago. Bennett is one of the leading researchers in the field of psychometrics and mode comparability, and this overview explores a range of mode-comparability issues. “Although the promise of online assessment is substantial, states are encountering significant issues, including ones of measurement and fairness,” the paper reads. “Particularly distressing is the potential for such variation [in testing conditions] to unfairly affect population groups, such as females, minority-group members, or students attending schools in poor neighborhoods.”

2. “Maintaining Score Equivalence as Tests Transition Online: Issues, Approaches, and Trends”

Pearson, 2008

The authors of this paper, originally presented at the National Council on Measurement in Education, highlight the “mixed findings” from studies about the impact of test-administration mode on student reading and mathematics scores, saying they “promote ambiguity” and make life difficult for policymakers. The answer, they say, is quasi-experimental designs carried out by testing entities such as state departments of education. The preferred technique, the paper suggests, is a matched-samples comparability analysis, through which researchers create comparable groups of test-takers in each mode of administration and then compare how they performed.
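The matched-samples idea can be illustrated with a short sketch. The Python code below is a hypothetical, simplified version of the technique, assuming a table of students with a single prior-year score to match on; the column names, the greedy nearest-neighbor matching rule, and the single matching variable are all invented for illustration, and real comparability studies match on multiple covariates and add formal psychometric checks.

```python
# Illustrative sketch only: greedy one-to-one matching of online test-takers
# to paper test-takers on a prior-year score, then comparison of mean scores.
# Column names ("mode", "prior_score", "score") are hypothetical.
import pandas as pd

def matched_mode_effect(df: pd.DataFrame) -> float:
    online = df[df["mode"] == "online"]
    paper = df[df["mode"] == "paper"].copy()

    diffs = []
    for _, student in online.iterrows():
        if paper.empty:
            break
        # Find the still-unmatched paper test-taker with the closest prior score.
        match_idx = (paper["prior_score"] - student["prior_score"]).abs().idxmin()
        diffs.append(student["score"] - paper.loc[match_idx, "score"])
        paper = paper.drop(match_idx)  # match without replacement

    # A negative value would suggest online test-takers scored lower than
    # comparable paper test-takers (a "mode effect" favoring paper).
    return sum(diffs) / len(diffs)

# Example usage with toy data:
# df = pd.DataFrame({
#     "mode": ["online", "online", "paper", "paper"],
#     "prior_score": [200, 240, 205, 238],
#     "score": [210, 250, 220, 255],
# })
# print(matched_mode_effect(df))
```

Studies such as Minnesota’s (No. 6 below) use more sophisticated matching and item-level analyses, but the logic is the same: compare scores across groups built to be equivalent in everything but the mode of administration.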

3. “Does It Matter If I Take My Mathematics Test on Computer? A Second Empirical Study of Mode Effects in NAEP”

Journal of Technology, Learning, and Assessment, 2008

“Results showed that the computer-based mathematics test was significantly harder statistically than the paper-based test,” according to Randy Elliot Bennett, who is also the lead author of this paper, which looked at results from a 2001 National Center for Education Statistics investigation of new technology for administering the National Assessment of Educational Progress in math. “In addition, computer facility predicted online mathematics test performance after controlling for performance on a paper-based mathematics test, suggesting that degree of familiarity with computers may matter when taking a computer-based mathematics test in NAEP.”
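The phrase “after controlling for” corresponds, in its simplest form, to a regression with the paper-based score included as a covariate. The sketch below is a hypothetical simplification with invented variable names, not the NAEP study’s actual model; it shows only the shape of such an analysis.

```python
# Illustrative sketch only: does a computer-facility measure predict online
# math scores after controlling for paper-based math scores?
import numpy as np

def facility_coefficient(online_score, paper_score, computer_facility):
    """OLS fit of online_score ~ intercept + paper_score + computer_facility.
    Returns the coefficient on computer_facility."""
    X = np.column_stack([
        np.ones(len(paper_score)),     # intercept
        np.asarray(paper_score),       # control: paper-based math score
        np.asarray(computer_facility), # predictor of interest
    ])
    coefs, *_ = np.linalg.lstsq(X, np.asarray(online_score), rcond=None)
    # A positive coefficient would suggest computer familiarity is associated
    # with higher online scores beyond what math ability alone predicts.
    return coefs[2]
```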

4. “The Nation’s Report Card: Writing 2011”

National Center for Education Statistics, 2014

As the NCES moved to administer its first computer-based NAEP writing assessment, this study tracked how 24,100 8th graders and 28,100 12th graders performed. Doug Levin, then the director of the State Educational Technology Directors Association, summed up the findings in a 2014 blog post: “Students who had greater access to technology in and out of school, and had teachers that required its use for school assignments, used technology in more powerful ways” and “scored significantly higher on the NAEP writing achievement test,” Levin wrote. “Such clear and direct relationships are few and far between in education—and these findings raise many implications for states and districts as they shift to online assessment.”

5. “Performance of 4th-Grade Students in the 2012 NAEP Computer-Based Writing Pilot”

NCES, 2015

This working paper found that high-performing 4th graders who took NAEP’s computer-based pilot writing exam in 2012 scored “substantively higher on the computer” than similar students who had taken the exam on paper in 2010. Low- and middle-performing students did not similarly benefit from taking the exam on computers, raising concerns that computer-based exams might widen achievement gaps. Likely key to the score differences, said Sheida White, one of the report’s authors, in an interview, is the role of “facilitative” computer skills such as keyboarding ability and word-processing skills. “When a student [who has those skills] is generating an essay, their cognitive resources are focused on their word choices, their sentence structure, and how to make their sentences more interesting and varied—not trying to find letters on a keyboard, or the technical aspects of the computer,” White said.

6. “Mathematics Minnesota Comprehensive Assessment-Series III (MCA-III) Mode Comparability Study Report”

Minnesota Department of Education and Pearson, 2012

This state-level study of mode effects on exams administered in spring and summer of 2011 used the matched-samples comparability-analysis technique described in the Pearson study. “Although the results indicated the presence of relatively small overall mode effects that favored the paper administration, these effects were observed for a minority of items common to the paper and online forms,” the study found.

7. “Comparability of Student Scores Obtained From Paper and Computer Administrations”

Oregon Department of Education, 2007

This state-level mode-comparability study looked across math, reading, and science tests administered by both computer and paper. “Results suggest that average scores and standard errors are quite similar across [computer] and paper tests. Although the differences were still quite small (less than half a scale score point), 3rd graders tended to show slightly larger differences,” the paper reads. “This study provides evidence that scores are comparable across [Oregon’s computer] and paper delivery modes.”

Library Intern Connor Smith provided research assistance.
A version of this article appeared in the February 24, 2016 edition of Education Week as Seven Studies Comparing Paper and Computer Test Scores
