How much can a test really tell you if a student gives up while taking it?
Quite a lot, as it turns out—if teachers know exactly when and how a student disengages. That’s what some schools nationwide started doing earlier this summer: using real-time alerts during a computer-based adaptive test to spot students going off task and salvage meaning from assessments that students would otherwise write off.
“It’s really a shift from seeing [testing] simply as a way to improve achievement scores—which it does—to seeing it as a source of data that teachers can use to understand their students,” said Jim Soland, a research scientist at the testing firm Northwest Evaluation Association, or NWEA.
The alerts are part of an ongoing research project between NWEA and the 50,000-student Santa Ana school district, a member of the California Office to Reform Education, or CORE, district consortium.
Teachers in the mostly poor and Hispanic Santa Ana district have always spotted the signs of students shutting down on a test, but they couldn’t tell how badly it affected their performance until the test was already complete.
“They play with the keyboard, they may be fidgeting,” said Emily Wolk, the assistant director of research and evaluation for Santa Ana, who has watched over her share of tests. “Of course, we are very concerned about that; we know there is a connection between their engagement on a test and how they do in other areas as well.”
Gauging the Speed of Thought
Schools generally use two ways to measure student engagement on a test. Teachers can look at the results of the test and compare them to what they know of the student to decide how hard he worked, or they can ask a student directly if she gave the test her best effort. Both measures are subjective, and aside from noticing a student physically putting her head down, it can be difficult for teachers to realize a student has checked out during a test until it’s already been turned in. Even then, it’s hard to tell the difference between a student who is distracted because he had a fight with his mom over breakfast and one who despairs because a question is too hard.
Instead, NWEA researchers Steven Wise, Soland, and Nate Jensen measure student response times to hundreds of questions on NWEA’s MAP Growth test, an adaptive, computer-based assessment taken by about 20 percent of U.S. students. The test adjusts the difficulty of the questions it presents based on a student’s previous performance.
Wise and his colleagues found that under normal circumstances, students take 40 to 50 seconds to read and answer each question. If a student starts to disengage from a test, “they start answering items very quickly—2 to 3 seconds—more quickly than it would take to even read the question,” Wise said.
By tracking this “rapid guessing,” Wise and his colleagues can monitor how much effort students put into each question in real time and compare it with how hard students and teachers think the students worked. “People had thought that you are either engaged or disengaged on a test, and if you disengage, you stay disengaged,” Wise said, “but we found there’s no set pattern. It’s not like people shift into not trying and stay there. It’s more that they size up an item when they see it, and if it looks like more effort than I’m prepared to give, I’m just guessing.”
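For readers curious about the mechanics, the detection logic can be sketched in a few lines of code. The Python below is a minimal illustration, not NWEA’s implementation: it assumes a single fixed rapid-guess threshold (3 seconds, per the times Wise cites), whereas operational systems typically set a separate threshold for each item based on its response-time distribution, and the alert rule shown here is a hypothetical one.

```python
from dataclasses import dataclass

# Hypothetical threshold: answers faster than this are treated as rapid
# guesses. A simplification; operational systems typically derive
# item-specific thresholds from observed response-time distributions.
RAPID_GUESS_SECONDS = 3.0

@dataclass
class ItemResponse:
    item_id: str
    seconds: float  # time the student spent on the item
    correct: bool

def flag_rapid_guesses(responses: list[ItemResponse]) -> list[bool]:
    """True for each item whose response time suggests the student
    guessed rather than read the question."""
    return [r.seconds < RAPID_GUESS_SECONDS for r in responses]

def should_alert(responses: list[ItemResponse], streak: int = 3) -> bool:
    """Fire a real-time alert once several consecutive items are
    rapid-guessed, so a single fast answer doesn't trigger it."""
    run = 0
    for guessed in flag_rapid_guesses(responses):
        run = run + 1 if guessed else 0
        if run >= streak:
            return True
    return False

# Example: a student reads the first two items, then checks out.
times = [42.0, 51.0, 2.1, 2.8, 1.9]
answers = [ItemResponse(f"q{i}", t, correct=False) for i, t in enumerate(times)]
print(should_alert(answers))  # True: three rapid guesses in a row
```

The streak rule in the sketch mirrors the idea behind the classroom alerts: one fast answer may be noise, but several in a row suggest the student has stopped trying, and that finding matches Wise’s observation that engagement can switch on and off item by item.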
Who Guesses?
On average, the researchers found that after students first start rapid-guessing, they still legitimately try to answer 80 percent of the remaining questions on a test. Only 1 percent to 2 percent of those who disengage from a test rapid-guess on all or most of their questions.
Elementary students are less likely to disengage on a test than those in middle and high school—where as many as half of students rapid-guess at least once in a given test, Soland said. While there are relatively small racial differences in disengagement, boys are significantly more likely than girls to start rapid-guessing.
The subject matters, too. Though students often report math as more difficult than reading, the researchers found that students on average are nearly twice as likely to rapid-guess on a reading test as on a math test, and they are more likely to guess at individual questions on any test that requires more reading.
Soland and Jensen are now working with Santa Ana to connect frequent test disengagement to other problems in school. They found that students who are less skilled at communicating with adults and classmates were more likely to disengage, as were those with more of a fixed mindset about academic skills—the belief that such skills are innate rather than built with practice. But most of all, a student’s likelihood of disengaging on a test was associated with his or her self-management and self-regulation skills: the ability, for example, to show up for class prepared and on time. “As they disengage from tests and the course material, a whole host of other things come up ... attendance, suspensions, course failure ... that have been connected to risk of dropping out of school,” Soland said.
“What we’re really showing is lack of test engagement is a symptom around a lot of deep-rooted problems,” Soland said. “In my mind, there’s this chain from, if a kid has low motivation, a lack of self-belief in academic subjects, that can manifest itself in a lot of different ways.”
Soland and Jensen are now using the tests to build practical measures of students’ social-emotional development and connection to school under the first national Social-Emotional Assessment Design Challenge award by the Collaborative for Academic, Social, and Emotional Learning.
NWEA is also changing its test reporting to show a student’s engagement levels as well as a performance score. It would show, based on the student’s performance before she started guessing, how much better she could have performed by trying harder.
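As a rough sketch of what such a report could compute (a hypothetical illustration, not NWEA’s scoring model, which would rest on its own psychometric methods rather than raw percent correct), one could rescore the test using only the responses the earlier sketch flags as effortful, reusing `ItemResponse` and `flag_rapid_guesses` from above:

```python
import math

def effortful_percent_correct(responses: list[ItemResponse]) -> float:
    """Hypothetical illustration: percent correct over only the items
    the student appeared to try, with rapid guesses excluded.
    Comparing this with the raw score hints at how much disengagement
    may have depressed the reported result."""
    kept = [r for r, guessed in zip(responses, flag_rapid_guesses(responses))
            if not guessed]
    if not kept:
        return math.nan  # nothing effortful to score
    return 100.0 * sum(r.correct for r in kept) / len(kept)
```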
“Eventually, people are going to start viewing test data differently: not just, here is your score, but here is how engaged people were when they gave you these scores; here’s how well this reflects what students know and can do,” Wise said.
In the pilots so far, alerting the teacher to disengagement during the test “had a profound effect on [students’] engagement,” Wise said, though there are no formal evaluations of the intervention yet.
“This is a tool,” said Santa Ana’s Wolk. “A teacher knows the students, knows how to quietly go over and say, ‘Hey, how’s it going? I care about you, and I’m concerned you are moving too quickly here.’ ”