Special Report
Artificial Intelligence Explainer

AI Literacy, Explained

By Alyson Klein — May 10, 2023

K-12 students have grown up in a world where artificial intelligence informs what their families buy at the grocery store, how scientists track the spread of diseases, and even how the photo filters work on their favorite social media apps.

But the technology was largely invisible to them—and their teachers—until a new version of ChatGPT burst onto the educational scene late last year. The tool can spit out an essay on Shakespeare, a detective novel, or a legal brief that reads remarkably like something a human has labored over. It can also write computer code.

The technology poured accelerant on a conversation already underway: Now that AI is shaping nearly every aspect of our lives and is expected to transform fields from medicine to agriculture to policing, what do students need to understand about AI to be prepared for the world of work? To be a smart consumer and a responsible citizen?

“The AI genie is out of the bottle,” said Cynthia Breazeal, a professor of media arts and sciences at the Massachusetts Institute of Technology. “It’s not just in the realm of computer science and coding. It is affecting all aspects of society. It’s the machine under everything. It’s critical for all students to have AI literacy if they are going to be using computers, or really, almost any type of technology” in their daily lives.

The question of what makes a person AI literate is evolving. But it involves delving into technical questions like: “What’s happening under the hood? How does it work? How does it impact me and the world around us?” said Bryan Cox, the lead computer science specialist at the Georgia education department.

AI literacy is something that every student needs exposure to—not just those who are planning on a career in computer science, experts argue.

“When we all went to school, we learned how the light bulbs work or how the digestive system works or how photosynthesis works,” said Hadi Partovi, the CEO of Code.org, which works to expand computer science offerings in K-12 schools. “And you teach those things to everybody, not just the botanists or the electricians or the surgeons. You learn [these things] to have a better understanding of your world. But most people don’t know how the internet works, how a smartphone works, how an algorithm works, and they definitely don’t know how AI works.”

Here’s how to begin developing AI literacy, according to experts and educators:

1. Why it’s vital to have a basic understanding of how AI works

Grasping the technical aspects of AI—how the technology perceives the world, how it collects and processes data, and how those data can inform decisions and recommendations—can help temper the oftentimes inaccurate perception that AI is an all-knowing, infallible force, experts say.

“We need to demystify how these systems work, how you build these things, at a grade-appropriate kind of level, because there’s so much hype and confusion,” Breazeal said. “People talk about these things like a conscious ether that surrounds us. [Students] need to understand that we’re really talking about [something] people actually make and control and engineer.”

Artificial intelligence technologies replicate human-like intelligence by training machines and computer systems to perform tasks that simulate some of what the human brain can do. They rely on systems that can learn, usually by analyzing vast quantities of data and searching out patterns and relationships. These systems can improve over time, becoming more complex and accurate as they take in more information.
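To see that idea in miniature, consider the hypothetical Python sketch below. The dataset is synthetic and the scikit-learn library is an illustrative choice, not something drawn from this article; the point is simply that a model’s guesses tend to improve as it is trained on more labeled examples.

```python
# A minimal sketch of "learning from data": a classifier's accuracy tends to
# improve as it is trained on more labeled examples. The data here is
# synthetic and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Generate a simple two-class dataset (a stand-in for any labeled data).
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train on progressively larger slices of the data and report accuracy.
for n in (20, 200, 1500):
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:4d} examples -> accuracy {acc:.2f}")
```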

To be sure, some students can delve deeper into the inner workings of AI than others, she added. “If you’re an AP Computer Science kid, you’re going to get a little more in the technical weeds about it,” Breazeal said.

But all students must grasp that the decisions that AI makes—whether it’s to recommend a particular pair of boots to an Amazon customer or to flag a job applicant as a promising prospect for an employer—aren’t driven by the same kind of nuanced and creative reasoning a human can perform.

Instead, it’s “really advanced guesswork,” Partovi said. “AI right now is based on probability and statistics. It makes errors. It can spread misinformation. It can have bias. Understanding how it actually works underneath is important so that people recognize the weaknesses it has.”
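Partovi’s point about probability can be made concrete with a toy example. The hypothetical sketch below, whose training text is invented, predicts the next word purely by counting how often one word followed another. It has no understanding, only statistics, which is also why it can be confidently wrong.

```python
# A toy illustration of "advanced guesswork": predict the next word purely from
# how often each word followed the previous one in some training text. The text
# here is invented; real language models work at a vastly larger scale.
from collections import Counter, defaultdict

training_text = (
    "the cat sat on the mat the dog sat on the rug "
    "the cat ate the fish the dog ate the bone"
).split()

# Count which words follow which.
following = defaultdict(Counter)
for current_word, next_word in zip(training_text, training_text[1:]):
    following[current_word][next_word] += 1

# The "model" guesses the statistically most likely next word. It has no
# understanding, only counts, which is why it can confidently be wrong.
for word in ("the", "cat", "sat"):
    guess, count = following[word].most_common(1)[0]
    total = sum(following[word].values())
    print(f"after '{word}', guess '{guess}' ({count}/{total} of the time)")
```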

And Partovi noted that students still need to learn how to write, even though the writing skills of technologies like ChatGPT are more sophisticated than most people might have imagined. The same idea applies to learning computer coding, he said. “The superpowers of AI are only available to people who know how to write or how to code,” Partovi said.

In fact, students want to know more about AI. More than nine in 10 teens say they would be interested in learning how to work with artificial intelligence in high school, according to a survey by the nonprofit Junior Achievement with the marketing and data firm Big Village, conducted between February and March of this year.

Students should have some idea of how machines perceive the world, said Breazeal, citing a framework developed by AI4K12, a nonprofit organization aimed at helping schools teach AI. That means discussing things like speech-recognition technology, sensors, and machine vision and understanding how they work.

Children in early-elementary school, for instance, could start with a simple lesson in which they identify the human organs—ears, eyes—involved in hearing and seeing and then find their technological counterparts—microphone, camera—on a digital device.

Students also need to understand how biases in the data used to train AI can lead the technology to perpetuate discriminatory policies unless humans recognize the problem and do something about it.

2. Give students hands-on opportunities to understand how the technology works

One hands-on lesson for more advanced students: Give them a flawed historical dataset on which to train an AI system, Partovi said. For instance, students could create a program that gives suggested salary ranges for a company’s employees.

If that program is trained on data in which women are paid less than men for doing the same job, the technology will probably propose lower salaries for female employees than for male workers. But if women are at salary parity with men in the dataset, the results will be more equitable.
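A hypothetical sketch of that kind of lesson appears below. The salary figures and the simple linear-regression model are invented for illustration, not drawn from any particular classroom; they only show how a model trained on a biased pay history reproduces the gap.

```python
# A hypothetical version of the lesson described above: a salary model trained
# on biased historical data reproduces the bias. All numbers are made up.
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical records: [years_of_experience, is_female] -> salary.
# In this invented dataset, women are systematically paid about $8,000 less
# for the same experience level.
X = np.array([[2, 0], [2, 1], [5, 0], [5, 1], [8, 0], [8, 1], [10, 0], [10, 1]])
y = np.array([60_000, 52_000, 75_000, 67_000, 90_000, 82_000, 100_000, 92_000])

model = LinearRegression().fit(X, y)

# Ask the trained model to suggest salaries for two equally qualified hires.
male_candidate = [[6, 0]]
female_candidate = [[6, 1]]
print("suggested salary (male):  ", round(model.predict(male_candidate)[0]))
print("suggested salary (female):", round(model.predict(female_candidate)[0]))
# The model "learns" the pay gap and proposes a lower salary for the woman,
# not because of anything about the job, but because of the data it was given.
```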

There are other ways of illustrating how human subjectivity can creep into AI’s decisionmaking. For instance, the technology organizes information to draw conclusions, a less clear-cut process than students may initially realize.

To illustrate this, students in a middle school AI course that Georgia is piloting play an in-person classroom game that initially asks them to decide whether something is edible or not, Cox said. Depending on their answers, students choose where to stand in the room.

Some of the answers are obvious: A metal stop sign is not edible, for instance.

But there can be disagreement on a word like “chicken,” which vegetarians in the class may claim is not edible.

From there, the game begins adding more and more categories and subcategories to the list of options, mimicking how AI algorithms can work.

During the activity, “great conversations erupt,” said Cox from the Georgia education department. “And we’ll talk to the students about how the computer is making those same determinations” based on the viewpoints of the majority of people it interacts with. And those determinations are often faulty ones.
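The hypothetical sketch below is a much-simplified version of that dynamic: an algorithm that keeps only the majority vote for each item, so minority views, like the vegetarians’ take on chicken, disappear from its answers. The items and vote counts are invented.

```python
# A hypothetical, simplified version of the classroom game above: an algorithm
# learns "edible or not" from majority votes, so minority views (the class's
# vegetarians, say) vanish from its answers. Items and votes are invented.
from collections import Counter

# Each item maps to the votes a class of students might cast.
votes = {
    "apple":     ["edible"] * 30,
    "stop sign": ["not edible"] * 30,
    "chicken":   ["edible"] * 24 + ["not edible"] * 6,  # vegetarians disagree
    "raw rice":  ["edible"] * 17 + ["not edible"] * 13,
}

def majority_label(ballots):
    """Return the most common vote, the only signal this 'algorithm' keeps."""
    return Counter(ballots).most_common(1)[0][0]

for item, ballots in votes.items():
    label = majority_label(ballots)
    dissent = 1 - Counter(ballots)[label] / len(ballots)
    print(f"{item:10s} -> {label:10s} ({dissent:.0%} of the class disagreed)")
```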

3. Discuss and analyze ethical questions about the technology

Once students are aware that humans—and not some sentient robot—are behind how these tools analyze and communicate, they can think about them in a broader context.

And that raises all kinds of important and interesting ethical issues. In Georgia’s middle school AI course, for instance, students might consider a case of passengers going to sleep in the backseat of a self-driving car while it continues along the road.

Then they’ll unpack questions without easy answers: “What are the legal implications of that? Do we need to stop them from doing that? Or is AI to the point where you can actually sleep in the backseat of a car and let it drive itself?” Cox said.

They can discuss legislation in states that have banned facial-recognition software, which is notoriously less reliable when it comes to identifying women and people of color. The technology has falsely flagged some people as having a criminal record.

Students consider questions like: Should all states ban facial-recognition software until it becomes more accurate? Or could it be used for some purposes but not others?

They can also discuss data-privacy concerns, including the implications of having smart speakers and voice assistants—Amazon’s Alexa, Google Assistant, and Apple’s Siri, among others—in homes. Teachers will pose questions: “How is that impacting you? Is it listening to you all the time? Is it sending information back to Google or Amazon all the time?” Cox said. “That’s really one of the biggest things we wanted to be able to do is to help them guard against the unintended consequences of AI.”

4. How to interact effectively with AI

Students will need to practice using AI tools to get information, the same way previous generations learned the card-catalog system to navigate the library.

ChatGPT, for instance, is deeply influenced by the prompt a user gives it, said Torrey Trust, an associate professor of learning technology in the College of Education at the University of Massachusetts Amherst.

For instance, a student could tell the tool to “’write about the American Revolution.’ It’s gonna give a very textbook response. And then you could say, ‘well, write about 15 women who shaped the American Revolution or draw connections between 15 women today and the American Revolution,’” she said. “The way you prompt the tool completely changes the output and your thinking and learning.”
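For the technically curious, the hypothetical sketch below shows the same idea in code. It assumes the OpenAI Python client and an API key set in the environment; the model name is illustrative, and the prompts echo Trust’s example.

```python
# A minimal sketch of how different prompts change a chatbot's output, assuming
# the OpenAI Python client and an API key in the OPENAI_API_KEY environment
# variable. The model name is illustrative; any chat model would do.
from openai import OpenAI

client = OpenAI()

prompts = [
    "Write about the American Revolution.",
    "Write about 15 women who shaped the American Revolution.",
]

for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the first few hundred characters of each answer for comparison.
    print(f"--- Prompt: {prompt}\n{response.choices[0].message.content[:300]}\n")
```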

The stakes in the real world, of course, can be a lot higher than a classroom assignment. Getting the prompt right can turn into a “matter of life and death” for a doctor using AI to pinpoint a diagnosis, Trust said.

That’s why it’s not a good idea to ban ChatGPT or other AI tools, as New York City and some other school districts have done, Partovi said. He noted that while the nation’s largest public school district essentially banned the technology except for use in limited circumstances, at least one prominent Manhattan private school is offering instruction in “prompt engineering,” an AI skill that could lead to a lucrative job. And that makes AI skill development an equity issue, some experts argue.

“I think a lot of the pushback was, ‘well, why do people need to know computational thinking in English/language arts?’” Trust said. “And I think AI has made that more clear.”

5. Let students know that AI skills are not just for computer science experts

Students need to be exposed to how AI is being used in the workforce today and how they might use the technology in their future careers, even if they don’t go into a computer science field.

“There are all kinds of people involved in the design and deployment of AI-based solutions,” Breazeal said. “One of the things we try to do is show [how AI impacts] very diverse roles, diverse industries so that kids can appreciate that ‘chances are when I enter the workforce, if I’m not making these AI kinds of things, I’m probably at least using them in a way that helps me get my job done.’”

In fact, if students are going to enter a policy field—or simply vote in elections—they’ll also need a grounding in how the technology works, said Leigh Ann DeLyser, the executive director and co-founder of CSforALL, a nonprofit organization that seeks to expand computer science education.

“The lack of solid citizenship understanding of how technology works leads business leaders and policy leaders to make bad decisions and ask crappy questions,” DeLyser said. For instance, during a recent congressional hearing at which TikTok CEO Shou Zi Chew testified, lawmakers appeared to misunderstand very basic information about how data storage and the app’s terms of use work.

What’s more, MIT’s Breazeal and other experts say that AI will work best when it is designed by people who are part of the community that the tool is aimed at serving. Having people from a variety of backgrounds—racial, socioeconomic, age, gender, profession—can help root out some of the biases embedded in society and, therefore, AI.

Students—particularly those from communities that are underrepresented in the AI field—need to understand that by getting in on the ground floor of this technology, they can help ensure that it works better.

“You have to take kids seriously,” Breazeal said. “We want to empower them to say, ‘I can make a difference right now.’ So let’s get you the education, the curriculum, the tools, the community so that you can make a difference. And that’s super empowering for kids.”

A version of this article appeared in the June 07, 2023 edition of Education Week as AI Literacy, Explained
