Opinion

Why Gauge Students on A Global Scale?

By Archie E. Lapointe — February 06, 1991

School leaders have always needed reliable information on the status of student academic achievement. Information on a global scale is ever more pertinent today, because the school-improvement movement--and its attendant economic implications--are without borders.

Educators also must have public support. Regardless of ideology or instructional philosophy, there is virtual unanimity in the United States behind the proposition that our education system needs strong backing in every community.

If information is inspiration, some of this support could come from the second International Assessment of Educational Progress, IAEP II.

The purpose of the 1991 international assessment, then, is to produce, one year later in March of 1992, a set of reports that will detail each country’s achievement results, catalog home and classroom factors that affect student learning in the various countries, and describe other relevant behaviors, such as how much homework students do and how much television they watch.

Why bother? Can any assessment account for the differences between a rural classroom in Korea and one in France, or between Taiwanese textbooks and the learning resources available in a Russian school? Why invest student time, teacher energy, and school cooperation in an international assessment?

Because these 13-year-olds share a planet whose ozone layer is fraying. Theirs will be a world grappling with complex technological issues, acid rain, radioactive waste, untreatable illness, hunger. In 10 years, when they are 23 years old, these youths will be shaping our global environment.

Today, the mathematical and scientific knowledge accumulated by the 105 million 13-year-olds on the earth is a nest egg for the planet.

The project that begins in March will rely on a careful structure and proven techniques. It will employ the same sampling procedures in each country, present the same test following the same standardized procedures, and ask the same background and attitude questions. Reports will carefully note the proportion of each country’s 13-year-olds who, for one reason or another, are not represented in the national sample. Each country will develop and follow a quality-control plan approved by Educational Testing Service, the project administrator, to ensure the validity and reliability of the findings. Schools will be randomly visited during the assessment.

Based on samples that represent more than one-fourth of the world’s 13-year-old population and building on tested procedures, IAEP II will generate a status report rich in information on a range of educational activities and outcomes.

In 1987, IAEP I, funded by the National Center for Education Statistics and the National Science Foundation, demonstrated that some of the content and procedures developed for the National Assessment of Educational Progress could be used to improve the efficiency of an international comparative study. With hundreds of mathematics- and science-test questions and a large investment in the methodology of assessment, NAEP was an appealing model for application in this wider sphere.

Data from IAEP I, reported in 1989 in A World of Differences, suggest a number of benefits from this kind of study:

  • To those setting standards for student achievement, it is instructive to observe what 13-year-olds in various countries can achieve. Those with the responsibility for setting achievement goals in the United States, for example, should know that in the Canadian province of Quebec and in Korea, more than 70 percent of 13-year-olds have success solving two-step mathematics problems, compared with 40 percent of our students.
  • To those recommending school policies and practices, it is helpful to identify factors that correlate positively with school success or failure. In 10 of the 12 populations compared, for instance, 13-year-olds who did more mathematics homework achieved higher math scores on the IAEP test.
  • To those concerned with long-range planning, either on a state or national level, it is informative to learn how successfully human resources are being developed in other countries. A case in point: Nearly 82 percent of Korean 13-year-olds agreed with the statement, “Much of what you learn in science classes is useful.” Only 30 percent of American students had that opinion. Why?
  • To business and labor leaders girding for the coming expansion of global economic competition, it is essential to keep abreast of our partners’ and competitors’ projected workforce characteristics. Five out of every 100 Korean 13-year-olds are able to “understand and apply more advanced mathematical concepts,” while only 1 out of every 100 of their U.S. counterparts can perform at this level.
  • To political and community leaders, competitive information can inspire support for upgrading learning conditions and justify, for parents and students alike, the concentrated efforts needed to improve performance. Such a spur was the knowledge, in the last test, that in both mathematics and science, U.S. 13-year-olds performed at or near the bottom, compared with 13-year-olds of 11 other population groups from six countries.

These examples suggest how comparative findings can be worthwhile--if the data are valid and reliable and the results can be produced quickly and efficiently.

Thanks to NAEP’s tested procedures, along with a fair amount of discipline, IAEP I yielded a thought-provoking report in less than three years, compared with previous experiences requiring six or more years. That test also indicated that while many of NAEP’s data-analysis techniques and reporting procedures “travel well,” the journey for test content, even in mathematics and science, requires extraordinary care.

Comparative statistics, whether economic, medical, or educational, always face legitimate challenges:

  • Are the samples truly comparable? They must be independently and rigorously drawn. Each report must clearly identify the ranges of sampling error that influence the reliability of reported statistics, as well as the percentage and the characteristics of each country’s student population that is represented.
  • Do school programs (opportunities to learn) differ? This is a difficult question to address accurately, but school curricula and teacher practice reflect a country’s educational priorities and must be described to account for variance in performance.
  • Does the United States share a common definition of what excellence in math or science represents? Do we want to excel in what others define as science (that is, an accumulation of facts) or in what our experts might define as a “way of thinking”?
  • What effects do cultural differences have on student learning? Nonschool factors, often described as motivation, or the “desire to learn,” are increasingly recognized as key elements in the equation.

Education policymakers as well as teachers from around the world are searching for tools to help them identify and set reasonable standards. They are seeking with even greater interest to identify the factors that seem to improve the learning environment. Information from a variety of foreign countries, some with environments that closely parallel our own (Canada) and some that differ greatly (China), can yield clues about what is possible and about strategies that may be helpful.

Inevitably, these data will cause us to reflect upon a range of generally accepted assumptions about the preparation of teachers, the type of learning materials available, the student-teacher ratio, the length of school days and the school year, as well as many societies’ values and attitudes about the importance and role of education.

But this kind of information can only be as helpful as its quality will allow. How can we ensure that it will be as valid and reliable as possible? How can we be confident that its dissemination will be as accurate and as responsible as we can make it?

In the planning for IAEP II, and as the project has been implemented with the guidance of the National Academy of Sciences’ Board on International Comparative Studies in Education, the multinational project team has addressed these questions systematically and conscientiously. The great motivator has been the self-interest of each participating country. The expenditure of this much energy, effort, and money would be pointless if the yield were unreliable data.

The results of IAEP II will be as good as current technology allows. Like all survey research, the findings will have limitations. Nonetheless, with reasonable interpretation, they will constitute useful tools for the many professionals charged with the responsibility for finding ways to improve learning.

In the short term, the reports from the test will provide insights into possible achievement targets and how we might improve academic achievement in the United States. They will be useful in spurring greater efforts to support our schools.

In the long run, the assessment techniques polished through projects like IAEP will be repeatedly refined, to the benefit of educators in all nations.

They, in turn, will advance from asking “why on earth” about the testing process itself to understanding “how on earth” each distinctive society prepares its children for their successful contribution to a shared future. That is, educators will learn from each other.

That’s the bottom line.

A version of this article appeared in the February 06, 1991 edition of Education Week as Why Gauge Students on A Global Scale
