Fourth graders had trouble using some of the basic functions on a computer-based writing assessment, including drop-down menus, editing icons, the highlighter tool, and text-to-speech, according to a recent study by the National Center for Education Statistics.
The results, which could be seen as a foreshadowing of the types of problems students might have with computer-based common-core tests in the spring, are part of a larger federal pilot study I wrote about last week. That study found that 4th graders can use computers to type, organize, and compose well enough to be assessed on their writing.
In a 60-student “usability study” (a precursor to the full pilot), the researchers looked at how well students were able to use particular features on the NAEP computer-based assessment platform.
It’s worth noting that of the students involved in the usability study, 100 percent said they had access to a computer at school, and 93 percent said they had access to one at home.
Students did well with adjusting the volume, pressing the “play” button, scrolling, and understanding the timer, as shown in the table below. (The tables were taken from the “technical memorandum” provided to me by an associate research scientist with the NCES, which released the study.)
However, when it came to some other basic functions, many of which are not unique to that particular platform, students struggled.
They did not know how to use the drop-down menus to see the full array of writing tools. They didn’t know the icons for “bold,” “italics,” “copy,” and “paste.” Most students skipped the directions telling them how to adjust their computer screens. They did not spend time reading the general instructions, either. One in six students could not figure out how to use the zoom.
Sixty-seven percent of students didn’t know how to use the highlighter—a potentially useful tool that will be available to all students on the Smarter Balanced and PARCC writing tests for the common core. And a quarter of students were confused by the icon for text-to-speech, another feature available on the impending common-core tests.
The researchers used these findings to redesign the writing test platform. The new computer-based assessment—which they then piloted with 13,000 students—had fewer directions per page, bigger icons, and better-labeled tools.
The platforms for the common-core tests have already been designed (and piloted and field tested), so the above findings are probably not going to influence decisions there. But they could be quite useful for teachers trying to prepare young students to take those tests in the spring. Above all, the usability study, though small in scale, underscores the need for direct instruction and practice with computer-based writing tools—even with those students who’ve had plenty of access to technology.