It’s a daunting job for two big groups of states to design multilayered assessment systems by 2014, and a panel of experts has made it even more daunting, compiling a long list of concerns about what it will take to make the venture a success.
On its list, the panel included high-level, long-range items such as integrating the tests into systems of instruction, and nitty-gritty, immediate worries such as making sure the tests’ computer demands don’t blow schools’ electrical circuits.
The to-do list was sketched out during a six-hour hearing convened April 15 by the U.S. Department of Education. It was the first in a series aimed at advising the two state collaboratives as they use $360 million in federal Race to the Top money to design tests for the new common standards in mathematics and English/language arts, which have been adopted by all but six states. Forty-five states are participating in the assessment consortia.
The meeting focused on the technological challenges states and districts might face in the 2014-15 school year, when the federal government expects the tests to be fully operational. Questions of technological capacity loom, since both consortia plan assessments that are largely computer-based.
Presiding over the hearing, Ann Whalen, a top adviser to U.S. Secretary of Education Arne Duncan, called the technological questions “sticky.” By day’s end, the long list of cautions led one panelist to extend his sympathies to the two consortia, each of which was represented by a small battery of technical experts.
“If I were sitting in your shoes right now, I’d be feeling a bit nervous,” said Michael Russell, who directs the Nimble Innovation lab at Measured Progress, a Dover, N.H.-based assessment company. As chuckles rippled around the room, Mr. Russell hurried to add an optimistic spin, expressing confidence that the test-design teams would adjust to the challenges ahead.
“Twenty years from now, we’ll look back, and it’ll be second nature,” he said.
The two consortia plan to work jointly to size up districts’ capacity to handle large-scale computer-based testing. They said they would soon issue a joint request for proposals from vendors to design a technology-readiness assessment, which would give the consortia a picture of that capacity and signal districts about what they might need for the transition to online testing.
States and districts must soberly assess what’s needed to make that shift, since the consortia’s “great visions of grandeur” involve not just summative tests, but systems that include interim or formative tools, banks of test items, portals through which test results can be accessed and analyzed, and more, said John Jesse, the assessment director in Utah, which belongs to the SMARTER Balanced Assessment Consortium.
The most problematic capacity issues will be at the schools themselves, Mr. Russell said. If an Internet router can’t handle 60 or 70 computers at once, for instance, problems could arise if a social studies teacher decides to stream video for her class while large groups of students are taking tests elsewhere in the building, he said.
Experts also warned the SBAC and the other consortium, the Partnership for the Assessment of Readiness for College and Careers, or PARCC, to be cautious about administering tests on multiple kinds of devices. Schools might not be able to be “device agnostic,” Mr. Russell said, if a student can’t demonstrate achievement as well on a tablet computer as on a desktop computer. It might not be possible to deliver assessments the same way on different devices “without measurement effects,” he said.
Lessons Learned
A trio of officials from Virginia, which was on the leading edge when it ventured into online assessment in 2000, was on hand to share the state’s experience. Sarah Susbury, who oversees test administration, scoring, and reporting, noted that Virginia had six years to phase in its online tests, a luxury the consortia won’t have with their deadlines only three years away. Virginia has not adopted the common standards or joined either assessment consortium.
In moving to online testing, a key lesson for Virginia was that it’s impossible to separate assessment and technology, Ms. Susbury said. Experts in both areas must leave their traditional silos and work closely together, she said.
Virginia officials emphasized the importance of having one portal that could serve as the hub for the testing system, from entering and viewing student data to training teachers and posting status updates about problems with the system. Especially during testing windows, it’s crucial to have one place for districts to check for problems, since the state education department and its contracted help desk can’t answer every phone call, said Shelley Loving-Ryder, Virginia’s assistant superintendent for assessment and school improvement.
If that experience is any guide, the consortia should be prepared to do “a lot of hand-holding” in training educators at all levels on the new systems, Ms. Loving-Ryder said. As Virginia prepares to pilot online writing assessments, 5th grade teachers are particularly worried that 10-year-olds’ keyboarding skills could affect test outcomes, she said. The education department is sharing the test interface in advance so teachers have time to practice, she said.
Security Concerns
Online testing also prompts new questions about security of test data, several panelists said. Ms. Susbury warned that “encryption and security are critical” as states endeavor to protect test data. The state learned the hard way that it had to plan for unimagined security emergencies. In the first month of its program, a backhoe doing work on school property during a test severed a key computer line, prompting fear that students’ online responses wouldn’t be preserved, she recalled.
Without adequate attention to security, mobile devices and “open source” systems pose challenges as well, said Denny Way, a senior research scientist for psychometric and research services at Pearson.
One approach to security for certain kinds of test items, he said, would be to produce a huge number of them and share them publicly. Releasing hundreds of essay prompts in English/language arts, for instance, would make it impossible for students to prepare responses to them all, making cheating very difficult.
Wes Bruce, Indiana’s chief assessment officer, warned against that approach. He said he had heard of one district in his state that released so many test items, in an apparent bid to help students prepare for the test, that it constituted what he viewed as “a crime against kids.”
Concerns expressed by the experts ranged from the big-picture to the very down-to-earth.
Richard F. Rozzelle, the president of the Center for Educational Leadership and Technology, which helps districts and states manage information technology, urged states to use the new testing systems as an opportunity to build a new “information architecture” that would integrate all pieces of the education spectrum, from curriculum design to management of assessment data.
Virginia officials urged the test-design teams to size up even the most basic forms of capacity at the school level. They recounted how one rural district decided to charge all its wheeled carts of laptop computers overnight, overloading the electrical circuits and shutting off heat in all its buildings.
Stories like that prompted Ken Wagner, the assistant commissioner for data systems and information and reporting services in New York state, to note that the “boring, mundane details” can easily sink an ambitious assessment system. “If we don’t start talking about specific details,” he said, “we’re going to regret it.”