The No Child Left Behind Act has pushed states to an unprecedented level of testing the reading, writing, speaking, and listening skills of students who are learning English. Today, 4½ years after the measure was signed into law, the results remain a closely watched work in progress.
As of this spring, 44 states and the District of Columbia had implemented new, comprehensive tests for English proficiency—leaving a handful of states that have missed the federal deadline for the exams. Now, the federal government must review and approve the tests, even though their impact is already being felt in classrooms.
“This is one of the positive outcomes of the No Child Left Behind Act, that these tests have been developed,” said Jamal Abedi, a professor of education at the University of California, Davis.
To comply with the law—a wide-ranging attempt by Congress and President Bush to hold schools accountable for raising the achievement of all students—each state’s English-proficiency test must be aligned with its English-language-proficiency standards for students who are learning the language. Most states have adopted such standards for the first time since the law’s passage, in addition to putting in place the new tests to assess those students’ reading, writing, speaking, and listening skills.
Read the accompanying story, “New York, Arizona At Odds With Ed. Dept. Over English Testing.”
The kinds of tests states are using vary greatly. Some are off-the-shelf tests that commercial publishers say meet the letter of the law even though they haven’t been customized for individual states. Others are off-the-shelf tests that have been modified to align more closely with particular states’ standards.
Almost two dozen states are using tests developed by consortia financed by the U.S. Department of Education, while others have created their own assessments. Oregon, for example, crafted an online test, and Texas has devised an observation protocol for listening, speaking, and writing while using a conventional test to assess reading.
For some states, the new English-proficiency tests mean that teachers who once assessed students’ English skills in 30 minutes must now use a test that takes from four to six hours.
The tests have yet to prove themselves. Some educators believe they will help improve instruction for English-language learners, but others fear they will distract schools further from focusing on instruction for those students.
Kathleen Leos, the director of the office of English-language acquisition at the federal Education Department, confirmed that five states—Arkansas, Kentucky, Montana, North Dakota, and Utah—failed to meet the department’s deadline of this past spring to have the new English-language-proficiency tests in place.
Florida’s new test hasn’t been implemented either, but the state is not on the list of those that missed the deadline, according to Ms. Leos, because it had a statewide testing system in place that sufficed, at least temporarily.
Ms. Leos said the federal department is supporting the states that are lagging behind, rather than cracking down on them. “I did anticipate there would be wiggle room in the actual implementation of the system,” she said in a recent interview.
Test Options
So far, 22 states and the District of Columbia have adopted tests developed by the four consortia of states that were financed with $10 million from the Education Department.
Even though that number is far fewer than the 40-plus states that were involved in the consortia, Ms. Leos said the federal money was well spent because the consortia helped all the participating states become more informed about the standards- and test-development process.
Julia Lara, who headed a consortium coordinated by the Council of Chief State School Officers to develop the English Language Development Assessment, or ELDA, said she is surprised that more states didn’t go with the consortia’s tests. Some may eventually switch to those tests, she said, if the Education Department doesn’t accept their off-the-shelf tests.
“I think the off-the-shelf option is problematic in that it’s probably just a slight improvement on what existed prior to the requirements of Title III [of the No Child Left Behind Act] to develop new sorts of assessments,” Ms. Lara said.
At the same time, Sari Luoma, the vice president for assessment for the Brea, Calif.-based Ballard & Tighe, Publishers, believes the market might expand for her company’s new off-the-shelf English-proficiency test, the IDEA Proficiency Test, which Alaska and North Carolina have already selected.
“There are states that aren’t happy with their consortium tests that might be considering other alternatives after a year or two,” she said.
Ms. Leos said the Education Department hasn’t reviewed any of the new assessments, so she can’t say whether they meet the requirements of the law. But if the tests do comply, she said, they will lead to improved teaching for English-language learners.
“An assessment aligned with the standards,” she said, “will be rich with information that will begin to impact instruction in the classroom.”
Classroom Effects Debated
But state officials disagree on whether the new tests are likely to help improve programs for English-learners.
Amelia Courts, the executive director of the English-as-a-second-language program for the West Virginia Department of Education, said the ELDA, which West Virginia has adopted, is a big improvement over the Woodcock-Muñoz Language Survey, which most West Virginia schools used before. The new test takes four to six hours to administer; the old test took only about 20 minutes, Ms. Courts said.
“We’re looking to get much more detailed reports that give an in-depth look at what each child can do,” she said. “A 20-minute, off-the-shelf quick glimpse of what a student can or can’t do is certainly going to inform instruction a lot less than the results we get [with the new test].”
But B.J. Granbery, the division administrator for education opportunity and equity for Montana’s office of public instruction, said she doesn’t think a new test will mean improved instruction for Montana’s limited-English-proficient students. She noted that 81 percent of the 6,800 LEP students in Montana are American Indians and have needs different from those of LEP students in most other states, who are primarily from immigrant families.
The academic success of American Indian LEP students depends on much more than overcoming language barriers, Ms. Granbery said.
“A uniform approach does not fit our situation very well,” she said. “There are already measures of whether students are proficient in English, such as these other tests we have. These can be used to plan instruction. I don’t see how this [new test] is going to add much.”
Some teachers also contend the new tests don’t help students.
South Carolina’s new English-proficiency test, the ELDA, “doesn’t add nor does it improve” education for English-learners, said J. Roberto Gonzalez, an ESL teacher for two high schools in the 19,000-student Beaufort County, S.C., schools. He was accustomed to using the Woodcock-Muñoz test, and he said the new test represents a loss.
“It takes away a lot of instruction time. It takes away the students’ self-confidence,” he said. “It doesn’t really test the students’ English skills, because it has them under pressure.”
Researchers’ Views
Researchers who specialize in education for English-language learners generally back the NCLB requirement that states use more-comprehensive assessments for English-language proficiency. But they caution that states will have to prove the usefulness of their new tests.
Mr. Abedi, of the University of California, Davis, was an independent evaluator of the ELDA and believes it is a good test. He added, however, that federal and state governments need to support ongoing research for comparisons between the available tests and for determinations of their validity.
“There are some [English-proficiency tests] that are whipped up,” said Charlene Rivera, the executive director of George Washington University’s Center for Equity and Excellence in Education, in Arlington, Va.
She has examined ACCESS for ELLs, a test developed by the World-Class Instructional Design and Assessment consortium, or WIDA, now housed at the Wisconsin Center for Education Research at the University of Wisconsin-Madison, and says it is well done.
“You have to look at whether, in the end, people learn something about instruction—about language proficiency and the extent to which instruction is linked to it,” Ms. Rivera said of the tests.
She explained that previous English-proficiency tests were designed to identify and place English-language learners in programs. By contrast, the new tests are intended to evaluate students’ progress in English.
In addition, state officials noted that compared with the old tests, the new ones are standards-based and do a much better job of assessing academic English rather than just social English, as well as breaking out separate scores for reading, writing, speaking, and listening.
Difficult Task
For some states, creating or selecting a suitable English-proficiency test has been challenging.
The federal Department of Education didn’t give money directly to states to pay for test development; rather, it gave between $1.8 million and $3.6 million to each of the four state consortia to set English-language-proficiency standards and devise tests aligned with them.
With $2.3 million from the department and about $700,000 from states, the WIDA consortium developed ACCESS for ELLs, which is short for Assessing Comprehension and Communication in English State to State for English-Language Learners. The test takes about 2½ hours, according to Timothy J. Boals, the director of the WIDA consortium.
Some observers say the WIDA consortium had the strongest leadership of the four consortia, which they say explains why 12 states and the District of Columbia have decided to use that test.
Only five states adopted the full-length ELDA, though at one time 15 states had participated in the consortium that produced it. A sixth state, Ohio, created a shortened version of the ELDA. The cost of developing the ELDA was similar to that of ACCESS for ELLs. Ms. Lara said many states chose not to use the ELDA because of its four- to six-hour length.
[Chart: Most states met a U.S. Department of Education deadline to implement new tests on English skills by this spring. Source: Education Week]
Steven A. Ross, an ESL consultant for the Nevada Department of Education, said testing time was a concern for Nevada as well. “We have 75,000 [LEP] students, and if you can cut an hour off of each test, that’s 75,000 hours,” he said.
Nevada ultimately selected LAS Links, a new English-proficiency test published by CTB/McGraw Hill, which Mr. Ross said takes about an hour and a half to administer. Nevada officials have worked with the test publisher to set levels of English proficiency and a cutoff score for proficiency, but otherwise the test is not customized to Nevada, Mr. Ross said.
Colorado, Connecticut, Hawaii, Indiana, and Maryland have also purchased LAS Links, which stands for Language Assessment Scales Links, without customizing it for their states. Colorado, however, plans to do so for the coming school year and is calling the test the Colorado English Language Assessment.
New Jersey switched to using ACCESS for ELLs after negative responses from teachers during a field test of the ELDA.
Raquel Sinai, the coordinator of bilingual and ESL education for New Jersey, explained that the K-2 section of the test, which is based on observation, had definite drawbacks.
“It was very time-consuming,” she said. “It was very hard to have any kind of reliability. It relied too much on teacher judgment.”
Florida and Tennessee chose the test produced by a third consortium that was coordinated by AccountabilityWorks, a consulting firm based in Washington, D.C., with the Educational Testing Service, of Princeton, N.J., as the developer.
But Pennsylvania, which had been part of that consortium, switched gears because officials there were worried that the consortium wouldn’t complete a test in time for the state to meet the federal deadline, according to Thomas E. Gluck, the executive deputy secretary for the Pennsylvania education department.
Not Customized
Starting in the 2002-03 school year, Pennsylvania implemented the Stanford English Language Proficiency Test, which was not customized for the state. By spring 2005, the state had created English-language proficiency standards, and this summer it plans to convene advisory groups to write new test items that better align the test with those standards.
Mississippi is also using a version of the Stanford English Language Proficiency Test that hasn’t been customized for the state. Meanwhile, Arizona, South Dakota, Virginia, Washington state, and Wyoming have decided to use versions of the same test that have been augmented for state standards.
Lastly, a consortium called Mountain West created a bank of test items, but it didn’t stay together after the test-development grant ran out, leaving its member states to implement the test on their own. Michigan and Utah decided to use the bank of test items, but the other states in the consortium didn’t.
Montana and North Dakota, which were part of that consortium, are among the states that were still shopping for an English-proficiency test this summer.
“I have a real good test. It’s sitting in a box on my desk,” said Mari B. Rasmussen, the head of bilingual/language-acquisition services for North Dakota, which has about 6,000 LEP students. “The next step was to implement this.”
She added: “My heart and soul was in this project, and I think it’s a good thing, but North Dakota can’t implement this alone. The resources are tremendous to maintain a high-quality product.”
So, that test will remain in the box. Instead, she said, the state plans to join the WIDA consortium and become the 13th state, along with the District of Columbia, to adopt ACCESS for ELLs.