Reading & Literacy Opinion

Writing: An Unexamined Gatekeeper

By Ardith D. Cole — February 26, 2007

“Kids just aren’t learning like they used to. Have you seen those test results?” I hear this sentiment expressed by people everywhere, and often feel compelled to respond, “But tests are far more difficult now. They’re not just multiple-choice anymore. Today, they include writing.”

Everyone seems surprised to learn that our high-stakes assessments now include writing. Most say that multiple-choice tests are easier than written-response ones. They’d rather select a response than construct one.


Yet some states now include written-response items on every test—in reading, science, even math. And some of the writing tasks are lengthy and complex. This inclusion may present a significant variable, one that influences student performance and test scores. That’s how writing has become a gatekeeper for student promotion and graduation, as well as for schools’ “adequate yearly progress” and federal funding.

Researchers have reported many testing discrepancies in the past, but has anyone investigated how written response affects test results? After results for the 2006 SAT showed the largest drop-off in scores in 31 years, administrators of the test took a close look—and blamed the exam’s new writing section. How did writing influence that drop in scores? Does it influence state tests the same way?

Right now, we don’t know how performance is affected when the writing variable is added. This means we cannot use current state-test comparison data until we first investigate all between-state assessment variables and their effects on performance.

Some of the between-state differences that may be significant include test formatting, writing tasks, types of writing and instruction, scoring and rubrics, and access to instructional supports.

A change in test formatting on that 2006 SAT produced significantly different results. So when states add written-response tasks to tests that previously contained only multiple-choice items, will the change take a similar toll on their students? Students in the nation’s capital encountered such a challenge last year, and only 28 out of 146 District of Columbia schools reached the 2006 benchmark. Did the addition of writing influence those scores? If so, how?

When comparing written-response and multiple-choice tests, we must also study how test length, difficulty level, and guessing affect results. Such differences currently invalidate comparisons of reading data between California and New York state, for example, or between Oregon and Washington state, because New York and Washington use written-response items on their reading tests, while California and Oregon use only multiple-choice items. What’s more, New York students construct written responses on all assessments (math, science, social studies), while states such as California require students to use writing only for the writing section.

Think about it. In every subject but one, students in California or Oregon read a prompt, followed by a question, and then “bubble in” their answer. Multiple-choice allows “bubblers” to guess, which gives them some chance of correctly selecting the one right answer.

On the other hand, in Washington, New York, New Jersey, Kentucky, Delaware, and other written-response states, guessing is less of an option. Instead, responders must analyze a multifaceted prompt, then organize right-answer facts from which they construct a single- or multiple-paragraph response—a far cry from coloring in circles on a Scantron sheet.

We also need to investigate how the tasks on written-response tests differ between states and between subject areas. Tasks currently vary in quantity, length, type, and, thus, difficulty. Answers range from mere one-word completion responses to multiple-page essays.

The variety of written-response-test prompts boggles the mind. Take a brief, cross-state cyber trip through state assessment websites and click on “released items.” You’ll notice that some prompts evoke self-based responses, while others require text-based answers.

English writing prompts generally call for the self-based type and welcome creativity, such as this one from Nevada’s 2000 writing test: “Discuss an event or situation that taught you a lesson or created a change in your life.”

But when states use writing to assess subject areas, such as reading, prompts usually require text-based responses. Thus, after reading the article titled “Porcupines,” Vermont students use text facts to answer this prompt: “Explain why porcupines do not have to fight. Use information from the article to support your answer.”

Some tests complicate matters by combining self-based and text-based response within the same task, as this one from Illinois does: “Would a person like Doc Marlowe best be described as someone to be admired or not admired? Explain why. Use information from the story and your own observations to support your answer.”

Whether written responses are brief or extended, machine-scored or human-scored, text-based or self-based, writing is elevated to the status of gatekeeper for those subjects. Might we conceivably predict, then, that students who have trouble with writing will have difficulty in every subject that’s tested through writing?

But what if the student knows the subject well—even knows the right answers—but does not write well? What if he can’t spell? What if his handwriting resembles that on the last prescription you took to the pharmacy? What if the student is from another country and confuses syntax? Should this person be encouraged to move out of New York, Washington, Ohio, Kentucky, Connecticut, and other written-response states and into a state like California? After all, his graduation could depend on it. But so might his career.

In 2004, the National Commission on Writing declared that students must learn to write well “in all subjects.” The panel called writing a “ ‘threshold skill’ for both employment and promotion, particularly for salaried employees.” Writing, it said, “could be your ticket in … or it could be your ticket out.”

There’s little doubt that students need instruction in all forms of writing. But here’s the catch: Who’s teaching written response? English teachers, who have always been in charge of the writing realm, do not usually focus on just-the-facts responses, but rather on writing characterized by strong voice, enticing leads, clever metaphors, and creative description. Yet right-answer writing is what’s needed to construct a correct answer to many test prompts, such as this one on Washington state’s science test:

“A scientist at Acme Chemical found an unlabeled container. He knew the contents were either HCl, NaOH, or NaCl. Using Chart A on page 2 of the Direction and Scenario Booklet:

Describe four tests that can be used to determine which chemical was in the container.

For each test, tell how it would help him determine which chemical is in the container.”

Unfortunately, too many students labor long and hard composing a creative response to one of these right-answer prompts, thus making the task more difficult than it needs to be. What’s more, in the working world, will employers care about leads and voice? They’ll probably want creative thinking and just-the-facts writing.

Should we then throw creative writing out the window? Indeed not. But let’s acknowledge the influential scoring idiosyncrasies between creative writing and the right-answer writing needed to produce a test response.

Writing tests are scored holistically, using rubric scales that allow for shades of correctness. Most rubrics stretch to accommodate an almost limitless range of responses, so debates over scoring arise. That’s why states such as Nevada offer an appeals process to those who disagree with their writing scores.

Conversely, right-answer writing, because of its tight, text-based boundaries, makes scoring less debatable. Is that why some states are moving away from creative writing tasks or excluding the writing section of their tests from accountability calculations?

Moreover, the expense and training required to score tests vary. Some written responses are machine-scored, but most are hand-scored, sometimes by educators, other times by noneducators. Is the added expense of training and scoring why states like Texas choose to use written response only on their high school assessment?

And what about differences in access to instructional-support systems? Staff development varies significantly between states, as do procedures related to released items and sample responses. Some states keep tests under tight wraps. Such “secure” tests make it almost impossible for anyone to view a particular student’s actual assessment to pinpoint that student’s individual needs.

In most states, scores are returned to schools while the tests themselves are not; a few states quickly return both scores and tests. Some reuse tests. Others do not. These differences create between-state accessibility discrepancies, but do they also affect between-state results? If so, by how much?

These are only some of the unexamined variables related to state assessments. But they make clear that researchers must stop comparing apples to pot roast. Comparisons based on state assessments are currently unreliable and invalid because of potentially influential, unexamined differences. In light of the testing craze inspired by the No Child Left Behind Act, it seems unconscionable that writing’s gatekeeper status has gone unrecognized.

Regardless of federal or state mandates, however, one thing remains obvious: We need to help all students develop skills in both creative and right-answer writing, using authentic experiences that demonstrate the diversity and the importance of writing. It is, after all, increasingly the ticket the gatekeeper will require.


A version of this article appeared in the February 28, 2007 edition of Education Week as Writing: An Unexamined Gatekeeper
