E.D. Issues Study Ranking States On Education

By Thomas Toch — January 11, 1984

In an effort to promote school improvement in the nation through “healthy competition,” Secretary of Education Terrel H. Bell last week released a report ranking the 50 states and the District of Columbia on the basis of the resources they allocate to education and the performance of their students.

“We’re not releasing enough of this type of information--information that can motivate the states to enact reforms,” the Secretary said at a press conference that was called to explain the report.

Compiled by the Education Department’s office of planning, budget, and evaluation, the report compares the states in 14 areas, including scores on college-admissions tests, dropout rates, expenditures per pupil, teachers’ salaries, and demographic characteristics, such as per-capita income and the percentage of children living in poverty. In each of the 14 categories, the states are ranked in descending order of achievement.

Figures were given for 1972 and 1982; most of the information has been published previously.

The report, presented as a chart, shows that test scores of college-bound seniors declined in every state (but not in the District of Columbia) between 1972 and 1982 and that the national graduation rate declined from 77.2 percent to 72.8 percent during the same period. Mr. Bell said the increase in the dropout rate was “the biggest surprise” in the study.

The highest dropout rates in 1982 were in the District of Columbia (44.2 percent), Mississippi (37 percent), and Louisiana (36 percent). The lowest rates were in Minnesota (11.8 percent), North Dakota (12.7 percent), and Iowa (14.2 percent).

Mr. Bell and department officials who worked on the study said it substantiated President Reagan’s often-stated claim that there is in many cases little correlation between increased spending on education and improvement in student learning.

“The study demonstrates that just pumping more money into schools is not the answer,” said Jay Noell, a department official. “Different types of reforms, such as the Secretary’s master-teacher idea, are needed.”

Mr. Bell and his aides mentioned Arizona, Florida, and New York as states that have produced notable results in spite of carrying substantial educational “burdens,” such as a high percentage of students from families living in poverty.

Mr. Bell and other department officials were otherwise reluctant to generalize from the statistics in the study and cautioned against “jumping to conclusions” about test scores and dropout rates without noting the special demographic makeup of a state that might influence them.

Nonetheless, testing experts reacted critically to the study’s use of scores from the Scholastic Aptitude Test (SAT) and the American College Test (ACT) as the primary standard for evaluating the states’ education record.

“We feel it is completely inappropriate to use the SAT as a measure of comparison between states, school systems, or schools,” said Robert C. Seaver, a vice president of the College Board, the sponsor of the SAT. “The size of the sample taking the test varies widely from state to state, and there are just too many factors that influence scores for the test to be used for this purpose.”

The American College Testing program, which administers the ACT, refused to provide the Education Department with a state-by-state breakdown of its scores. The department collected the previously unpublished information from state departments of education.

Patricia A. Gartland, an assistant vice president of the program, cited the testing agency’s guidelines, which state that there are “serious limitations” in making state-by-state comparisons.

“Aside from including essentially only college-bound students,” the guidelines say, “several other factors affect the makeup of the ACT student population in a given state, including entrance-testing requirements by colleges, student migration to out-of-state institutions, and state financial-aid programs. Not only should these limitations discourage casual generalizations about education in the state at large, but they make meaningful cross-state comparisons especially difficult.”

Approximately 50 percent of roughly 3 million high-school seniors attended college last year. About 1 million students took the SAT and 1 million took the ACT, according to the organizations.

Archie E. Lapointe, executive director of the National Assessment of Educational Progress (NAEP), a federally funded program that measures the academic achievement of precollegiate students nationally, said it is “clearly inappropriate” for the Education Department to use college-admissions-test scores to compare the educational performance of the states.

However, he characterized as “sincere” and “practical” what he called Secretary Bell’s effort to “look for techniques to motivate the states to improve” their schools.

Asked about such criticisms, Alan Ginsburg, another department official involved in the study, said, “We’re not saying [SAT and ACT scores] are what we necessarily want to use, but they’re the best thing available. People want something.” At his press conference, Mr. Bell said, “The fact that it’s hard to measure the performance of the states should not dissuade us from using what we do have” to make such comparisons.

The survey was done in part, Secretary Bell said, to address a growing desire among educators and the public for the creation of measurable standards in education.

This growing interest is reflected in recent Gallup public-opinion polls and a tentative endorsement of such comparative measures recently by the Council of Chief State School Officers, an organization that 15 years ago lobbied successfully to shape the National Assessment’s examinations so that they cannot be used to compare the performance of the states. Mr. Lapointe said three states--Connecticut, New York, and Wyoming--are instituting tests this year that will allow them to compare the performance of their students with that of students in the NAEP samples. And Mr. Bell said the Education Department is exploring the possibility of reorganizing the NAEP tests so that state-by-state comparisons of the results can be made.

The department attempted to more accurately compare the performance of the states on the two college-admissions tests by grouping the states according to which test is most widely used in a state. There is a wide disparity between the states in the proportion of students that take the tests; that variability results in a misleading picture of the academic performance of the states, the testing officials said.

This problem is “significantly reduced” by breaking the states into the two groups, Mr. Bell said last week.

Of the 22 states in which the SAT is most often used, the department ranked the District of Columbia, Georgia, and North Carolina, in that order, as the states with the most positive change in their SAT scores between 1972 and 1982.

The combined math and verbal scores in the District rose 18 points during that period (though in 1982 it still had the lowest scores of the states in the SAT group). In Georgia, the combined scores declined by 11 points, and in North Carolina they declined by 22 points. Scores declined by a greater amount in the rest of the SAT states.

Of the 28 states in which the ACT is more often taken, the best performances on the test were in Wisconsin, Arizona, and Colorado, in that order. Scores declined in those states, but not as much as in the other ACT states.

Asked whether grouping the states according to their use of the SAT or the ACT was sufficient to make such state-by-state comparisons reliable, Mr. Seaver of the College Board said, “No.”

The report was initially scheduled to be released last month at the National Forum on Excellence in Education in Indianapolis. It was delayed, according to department officials, because the department did not want to “embarrass” representatives at the forum from states that fared poorly in the report.

A version of this article appeared in the January 11, 1984 edition of Education Week as E.D. Issues Study Ranking States On Education