Standards & Accountability

Will Common-Core Testing Platforms Impede Math Tasks?

By Liana Loewus — September 23, 2014

As two state consortia work to finish new assessments aligned with the Common Core State Standards, some mathematics experts say they’re worried that the computer-based testing platforms will hamper a key element of the exams: open-ended math-performance tasks that test students’ ability to apply their knowledge.

Unlike previous state assessments, those being developed by the two federally funded consortia will include complex, multipart word problems that students will answer on screen. While some of those questions will provide built-in tools that allow students to put points on a graph or draw lines on a ready-made picture, other questions will ask them to write their answers in narrative form, using a keyboard.

Some experts contend that forcing students to write a solution doesn’t match the expectations of the common-core math standards, which ask students to model mathematics using diagrams, graphs, and flowcharts, among other means.

“It’s not like, during the year in classrooms, these kids are solving these problems on the computer,” said David Foster, the executive director of the Morgan Hill, Calif.-based Silicon Valley Mathematics Initiative, which provides professional development for math teachers, creates assessments, and has worked with both consortia. “It’s such an artificial idea that now it’s test time, so you have to solve these problems on computers.”

But experts with the two state consortia—the Partnership for Assessment of Readiness for College and Careers, or PARCC, and the Smarter Balanced Assessment Consortium—say the tests that are slated to debut this school year, while imperfect, are a huge step forward in assessment methods, and have advantages over paper-and-pencil tests of the past.

The tests will also continue to improve as developers analyze the field-test results over the next few months, and in coming years as freehand drawing tools and other capabilities are incorporated, they say. For now, though, costs, security issues, and timing have prevented the use of more advanced technology.

Questions about the accuracy of the new tests are especially critical in states where teacher evaluations will be tied to the 2015 results. Many states agreed to link teacher evaluation and student achievement to improve their chances of getting federal Race to the Top grant money or No Child Left Behind Act waivers. While the U.S. Department of Education recently gave states the flexibility to hold off on tying test scores to teacher evaluations, many states did not take that option.

Classroom vs. Testing

Each performance task on the common-core tests asks students to apply a variety of skills to a novel scenario.

For instance, a sample middle school PARCC item provides a table showing the number of brass and percussion instruments in three different marching bands. Students are asked to calculate the proportion of brass to percussion players using an on-screen “equation editor,” clicking to enter numbers and symbols. They’re then asked to solve a multistep problem about the makeup of a new band and to “show or explain” all their steps.
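To make the arithmetic concrete, here is a minimal worked example of the kind of brass-to-percussion ratio the item describes; the figures are hypothetical and are not drawn from the actual PARCC item.

\[
\frac{\text{brass}}{\text{percussion}} = \frac{30}{12} = \frac{5}{2}
\]

In other words, 5 brass players for every 2 percussion players, which is the sort of expression a student would build in the on-screen equation editor.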

For Smarter Balanced, students will do one performance task; it will generally take 45 to 90 minutes, depending on the grade level. Students taking the PARCC assessment will likely encounter two sessions of about 50 to 100 minutes with up to 20 shorter performance-based tasks, though these details have yet to be finalized. For both consortia, parts of the performance tasks will be scored by hand. Students will be allowed to use scratch paper throughout both consortia’s tests, but those sheets will not be turned in or scored.

Currently, at least 26 states and the District of Columbia are planning to use the assessments developed by either PARCC or Smarter Balanced. Also, some of the states that have backed away from earlier plans to use the consortia tests may well use individual test items.

Math educators are quick to differentiate between concerns about the tests’ content and concerns about the testing platforms themselves.

“I think there are some potentially really good questions [on the tests], but when you give kids a great question but all they can use to solve this problem is the keyboard in front of them, you sort of put handcuffs on the kids,” Mr. Foster said.

Martin Gartzman, the executive director of the Center for Elementary Mathematics and Science Education at the University of Chicago, has been leading the charge among a group of Illinois math educators who are worried about the PARCC testing platform.

“The primary tools kids use to solve problems and justify their answers every day in math class and required by the common core aren’t available for kids to use during these tests,” he said. “We’re going to get tests that are just going to frustrate people and are going to really underrepresent what students know.”

Thought Processes

But being able to communicate a thought process in words is a critical skill, said Doug Sovde, PARCC’s director of content and instructional supports. “I think it’s entirely reasonable for somebody to work through something on scratch paper and be able to share their reasoning after they worked through the task,” he said. “It’s not unlike a student making an outline before they write their research paper.”

Jan Mulqueeny, an assistant superintendent in District 126 in Alsip, Ill., said she witnessed students struggling to transfer their work from scratch paper to the screen during PARCC field testing last spring.

“I saw [4th grade] students writing their answers to the problems through drawings and creating tables and a variety of ways. But when they tried to replicate that in the text box and with the equation editor, they were totally stymied,” she said. “When you picked up the student scratch paper and saw the work they did, and saw what was on the screen, there was certainly a discrepancy.”

PARCC is planning to eventually provide a drawing tool for students, said Mr. Sovde.

“We are working to develop something that works on all devices, all browsers, Java, et cetera,” he said. “That’s going to take some time to get it right, and we are pursuing that functionality.”

Shelbi Cole, the director of mathematics for the Smarter Balanced Assessment Consortium, said it’s important to understand that a lot of research has gone into item development to help ensure test-score validity. The group is analyzing field-testing results now, including comparisons of how students did on the same items using computers vs. paper and pencil.

“The stuff that ends up in the operational pool is what lives after we’ve done all this research,” Ms. Cole said.

Further, she said, computer-based administration provides opportunities that aren’t possible with paper-and-pencil tests, such as the use of spreadsheets or problems that mimic the development of an app. Those aren’t ready now, but “we have a plan in place” for such features down the road, she said.

More broadly, “I don’t know that math class should be a paper-and-pencil subject anymore,” Ms. Cole said. “There are lots of great tools out there people are using.”

Mr. Foster of the Silicon Valley Mathematics Initiative, who consulted for PARCC and wrote exemplar performance tasks for Smarter Balanced, strongly disagreed.

“I’m a mathematician, and I never solve problems by merely sitting at the keyboard. I have to take out paper and pencil and sketch and doodle and tinker around and draw charts,” he said. “Of course, I use spreadsheets all the time, but I don’t even start a spreadsheet until I know what I want to put in the cells.

“All Smarter Balanced and PARCC are going to look at is the final explanation that is written down,” he said, “and if there’s a flaw in the logic, there’s no way to award kids for the work they really did and thought about.”

Mr. Foster added: “I’ve played with the platform, and it makes me sick. And I’ve done it with problems I’ve written.”

What the Standards Say

The common-core math standards explicitly set the expectation that students should at times create visuals to represent their understanding. The Standards for Mathematical Practice, which describe the processes that proficient math students use in solving problems, say students should be “able to identify important quantities in a practical situation and map their relationships using such tools as diagrams, two-way tables, graphs, flowcharts, and formulas.”

A 7th grade geometry standard asks students to “draw (freehand, with ruler and protractor, and with technology) geometric shapes with given conditions.” A majority of the fractions standards in grades 3-5 ask students to solve problems by, for example, “using a visual fraction model.”

“We know there’s a gap between what the standards say, what students are asked to do in the classroom, and what we’re measuring,” Ms. Cole of Smarter Balanced said. “Certain graphical sketches probably won’t be assessed in the exact way they’re called for in the standards. ... But these are isolated pieces in a much larger system in what we’re trying to do. We’ve come a really long way.”

Mr. Sovde of PARCC also emphasized that the new tests, while still being improved, are “a massive step forward from where state assessments have been up to this point.”

A math-assessment specialist who helped with item development and review for Smarter Balanced said she believes the platform works well.

“I feel like I’m able to understand and to hear what the student is thinking” by reading his or her response, said Jennifer Judkins, who works in the Washington state education agency and has graded more than a thousand student responses to open-ended math questions for Smarter Balanced.

“I can’t say it’s 100 percent perfect, but, for the most part,” she said, “I could understand what they were doing.”

But, as James W. Pellegrino, a professor of education at the University of Illinois-Chicago who serves on the technical-advisory committees of both consortia, points out, students can solve a single problem in any number of ways, not all of which are easy to explain in words.

“The worry is [the platform] narrows the scope of what students can do, and the evidence they can provide about what they understand,” he said. “That leads to questions about the validity of the inferences you can make about whether students really developed the knowledge and skills that are part of the common core.”

‘No Free Lunch’

As Mr. Foster sees it, there’s at least one simple solution that could be incorporated soon: scan students’ scratch paper into the system.

“Scanning technology is so easy and so cheap in this day and age,” he said. “That’s not high-tech stuff.”

Most schools have had scanning technology for many years, he said.

Mr. Sovde of PARCC said the group is considering that option.

Even so, Scott Marion, an associate director of the National Center for the Improvement of Educational Assessment, in Dover, N.H., which advises both consortia, argues that scanning has its own set of problems. For instance, the papers may not be legible or they may not scan well, Mr. Marion said. Plus, “you’re shipping these boxes and somebody in school is handling them, and that makes it easier for cheating.”

He added: “There’s no free lunch in this. It’s going to be a challenge no matter what way we do it.”

Coverage of efforts to implement college- and career-ready standards for all students is supported in part by a grant from the Bill & Melinda Gates Foundation, at www.gatesfoundation.org. Education Week retains sole editorial control over the content of this coverage.
A version of this article appeared in the September 24, 2014 edition of Education Week as “Will Common-Core Testing Platforms Impede Math Tasks?”
