IT Infrastructure & Management

Algorithmic Bias a Rising Concern for K-12 Ed-Tech Field

By Benjamin Herold & Sarah Schwartz — April 18, 2017

From criminal sentencing to credit scores, algorithms and artificial intelligence increasingly make high-stakes decisions that have big implications for people’s freedom, privacy, and access to opportunity.

Despite the almost-blind faith people often place in such “artificial agents,” these systems are frequently biased, according to a report from the RAND Corp. that has implications for education.

More than ever, RAND researchers Osonde Osoba and Bill Welser said in an interview, it’s important to raise awareness about the role that algorithms play and to push for a public accounting of their impact—particularly in areas that involve the public interest, including K-12 education.

“For the longest time, any time questions of bias came up, hard-core researchers in artificial intelligence and algorithms dismissed them because they were not ‘engineering concerns,’ ” Osoba said. “That was OK for commercial toys, but the moment the switch was made to applying algorithms to public-policy systems, the issue of bias no longer became a triviality.”

The new RAND report, “An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence,” does not focus on education. Instead, the authors lay out examples such as the algorithmic bias in criminal sentencing and the problems with Tay, a chatbot developed by Microsoft that was supposed to learn the art of conversation by interacting with Twitter users—and quickly began spewing racist and vulgar hate speech.

Artificial agents can process the immense streams of data now running through society in ways that humans can’t, making them a necessary tool for modern society, the RAND researchers write. But too often, they say, the public ascribes objectivity and neutrality to algorithms and artificial intelligence, even though most function as a “black box” and some have been shown to result in different outcomes for different groups of people.

Origins of Digital Bias

Where does such bias come from?

RAND concluded that it comes from several sources: the individual humans who program the artificial agents, who may hold biases they are not even aware of; a pool of computer and data scientists that is far less diverse than the populations their products eventually affect; and biases embedded in the data used to train the artificial agents to “learn” by finding patterns.
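That last source of bias is easy to see in miniature. The sketch below uses a hypothetical, made-up dataset: a toy “algorithm” that does nothing more than learn the rates in its historical training records will faithfully reproduce whatever disparity those records contain.

```python
# Toy illustration with hypothetical data: a model that learns from
# past decisions inherits the bias baked into those decisions.
from collections import defaultdict

# Hypothetical historical records: (student_group, was_recommended_for_advanced_work)
training_data = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def train(records):
    """'Learn' each group's recommendation rate from past decisions."""
    counts = defaultdict(lambda: [0, 0])  # group -> [recommended, total]
    for group, recommended in records:
        counts[group][0] += int(recommended)
        counts[group][1] += 1
    return {g: rec / total for g, (rec, total) in counts.items()}

model = train(training_data)
# The learned "policy" mirrors the historical disparity, not student ability:
# group_a -> 0.75, group_b -> 0.25
```

Nothing in the code mentions a group's merit; the disparity comes entirely from the data it was handed.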

All those issues are present in the ed-tech field.

One example: the growing field of “curriculum playlists”—educational software programs that rely on algorithms to choose what types of instructional content and learning experiences students have each day in the classroom. Algorithm-driven tools are also used by some districts to provide career and college guidance and to hire teachers.

What if such tools are biased against students of color, or students with special needs? How would educators, parents, and students even know?

Such questions are both realistic and important for the field to be asking, Osoba and Welser said.

“Educators need to not cede complete control to the computer,” Welser said. That means being aware of which products used in the classroom, school, or district rely on algorithms and artificial intelligence to make decisions; understanding what decisions they are making; and paying attention to how different groups of students are experiencing the products.
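That last step—comparing how different groups of students fare—can start with a very simple audit. The sketch below is a minimal, hypothetical example (the log data and the choice of threshold are assumptions, not anything RAND or a vendor prescribes): it tallies a tool's favorable outcomes per group and flags a large gap between the lowest and highest rates.

```python
# Minimal disparity audit over a hypothetical log of an algorithm's decisions.
from collections import defaultdict

def outcome_rates(records):
    """records: iterable of (group, got_favorable_outcome) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [favorable, total]
    for group, favorable in records:
        counts[group][0] += int(favorable)
        counts[group][1] += 1
    return {g: fav / total for g, (fav, total) in counts.items()}

def disparity_ratio(rates):
    """Lowest group rate divided by highest; values well below 1.0 flag a gap."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi else 1.0

# Hypothetical log of a playlist tool's placements into enrichment content:
log = ([("group_a", True)] * 60 + [("group_a", False)] * 40
       + [("group_b", True)] * 30 + [("group_b", False)] * 70)

rates = outcome_rates(log)
print(disparity_ratio(rates))  # 0.5 — a large gap worth investigating
```

A low ratio does not prove the algorithm is biased—the gap could have other causes—but it tells educators exactly where to start asking questions.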

Maribeth Luftglass, the assistant superintendent for information technology in the Fairfax County, Va., schools, said it is the district’s responsibility to make sure digital tools driven by algorithms remain bias-free. When it comes to assessing students, she said, adaptive algorithms will never replace a human teacher’s evaluation.

“It’s not to say that you can’t use artificial agents to help you identify particular potential gaps in instruction and learning,” Luftglass said.

The ed-tech product-development process provides opportunities to detect bias that are unavailable to companies in other industries, said Bridget Foster, the senior vice president for the Education Technology Industry Network, a division of the Software & Information Industry Association. “In education, developers are right there in the school, in the classroom, working with educators,” she said.

Foster’s organization recommends that ed-tech companies figure out how to mitigate bias from the early planning and development stages onward.

Welser said it’s too early to try to regulate the field or mandate bias testing across the board. But, he argued, it is time for a conversation to begin in K-12 about how to address the potential biases in algorithms.


A version of this article appeared in the April 19, 2017 edition of Education Week as Algorithmic Bias a Rising Concern for K-12 Ed-Tech Field
