Opinion Blog

Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to lferlazzo@epe.org. Read more from this blog.

Artificial Intelligence Opinion

Wondering How to Deal With AI in Your Classroom? Try These Ideas

By Larry Ferlazzo — February 05, 2025 6 min read

Today’s post is the latest in a two-year series on artificial intelligence in the classroom.

‘These Tools Aren’t Magic’

Jane Rosenzweig is the director of the Harvard College Writing Center, writes frequently about AI and education in her Writing Hacks newsletter and elsewhere, and publishes The Important Work, a newsletter written by high school and college writing instructors about teaching writing in the age of AI:

There’s a lot of discussion—and disagreement—about whether and how students should be using generative AI tools in the classroom. But as these tools become more and more widely available, what we do know is that our students are going to be using them and that we need to be able to talk to them about the role of generative AI in their education.

I’m wary of making grand pronouncements about how to talk to students about using generative AI tools when things are changing so quickly. But I do think there are some guidelines we can follow when deciding to use these tools in the classroom—or outside the classroom.

1. Teachers should understand the basics of how generative AI tools like ChatGPT actually work.

I’m not suggesting that every teacher has to become a tech expert—I’m certainly not one. But if you’re going to use these tools or encourage your students to use them, it’s helpful to understand how these tools are trained and how they generate output.

Here’s an example from my own class: On the first day of class in the fall, one of my students mentioned that she really liked using ChatGPT because it’s more objective than humans. If you believed that, it would definitely shape how you use ChatGPT. But it’s not actually true: AI tools like ChatGPT can only answer questions based on what’s in their training data, and that data is drawn largely from what’s available online—not from some objective or all-knowing source.

AI tools also “hallucinate”—meaning they sometimes just give you inaccurate information. Students find it interesting to learn how these tools generate output, and you can explain this in ways that are grade appropriate. Here are some resources that I’ve found helpful for learning how generative AI tools work.

This explainer from the Financial Times walks through how large language models work, with helpful examples.

If you want to take a deeper dive, try this article, “Large language models explained, with a minimum of math and jargon.”

2. Talk to your students about what you want them to learn, not just about what tools like ChatGPT can do or whether they are allowed to use them.

I think it’s helpful to look at the use of generative AI tools in terms of what problems you’re trying to solve in the classroom. (In fact, I teach a writing course called To What Problem is ChatGPT the Solution.)

I’ve found this framework to be helpful for myself—but also for my students. I talk to them about what problems they’re solving when they use AI: Is it the problem of not having time to do the work? Is it the problem of not having an idea? Or is it an interesting, knotty problem that’s hard to solve that generative AI might help them solve in a cool way?

I also tell them that I’m not asking them to write papers because the world needs more papers; I’m asking them to write papers because it’s one way of thinking through a problem—and then we talk about how using AI at different points in the writing process may or may not get in the way of that thinking.

There’s a big difference between telling students to use or not to use generative AI and telling them why what you want them to do matters in the first place. Framing things this way may not always stop students from using these tools in ways you think are counterproductive—but it will help students understand where you’re coming from.


3. Be aware of the difference between useful and not useful ways of using these tools.

We’ve heard a lot about how AI tools like Khanmigo can provide personalized tutoring. But some teachers are finding that students who use these tools don’t always engage with them or learn from them—and that the way Khanmigo helps students is sometimes different from what you’d do in your own classroom.

If you’re asking your students to use AI tools, it’s going to be helpful to be aware of how the same tool you’ve set up to enhance learning could get in the way of that learning. Dan Meyer offers a useful example of this over at his newsletter, Mathworlds.

4. Don’t remove the friction from the learning process.

Tools like ChatGPT are being marketed as efficiency tools—tools that will save us time so that, as OpenAI says, we can focus on other things. But learning requires time, and it requires friction.

If you’re going to use AI tools with your students, it’s useful to consider how you’re setting up assignments to allow for that productive friction.

When I made a chatbot to help my students practice counterargument, some of them were surprised that the chatbot didn’t enable them to do the work more quickly. But I wasn’t trying to help them be efficient; I was trying to help them learn something complicated.

I’ve written more about friction and learning here. This piece on friction and time-saving is a great overview of the conversation about friction and AI, with a focus on Magic School.


5. Beware of the hype.

It seems like new tools are being released every day, and I’m the first to note that tools like Google’s NotebookLM, which turns any text into a podcast, are pretty cool! But they were not designed to solve problems that we’re trying to solve in the classroom. They were designed to get people to use them.

I’ve found over the past few years that when I question the role of these tools in the classroom or express concerns about the hype, some people tell me that I must be anti-technology. But that’s not true at all—I was an early experimenter with GPT and I’m very interested in all of these tools.

However: It’s not our job as educators to adopt technology because it’s cool; it’s our job to ask hard questions and think about what will help our students learn. Which brings me back to my earlier question: When thinking about how to teach your students about AI, it’s useful to start by asking what problems you’re trying to solve in your classroom and how AI can help solve those (or whether it will create new ones).


We’ve entered an era where there will be new generative AI tools regularly that come with promises to magically solve all the challenges we face as teachers. But it’s worth keeping in mind that these tools aren’t magic—and that the way you choose to use them—or not—should always be based on what you’re trying to do in your classroom.


Thanks to Jane for contributing her thoughts!

Today’s post answered this question:

“What are guidelines teachers should follow when teaching students to use or not use artificial intelligence?”

Consider contributing a question to be answered in a future post. You can send one to me at lferlazzo@epe.org. When you send it in, let me know if I can use your real name if it’s selected, or if you’d prefer to remain anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Just a reminder: you can subscribe and receive updates from this blog via email. And if you missed any of the highlights from the first 12 years of this blog, you can see a categorized list here.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
