Artificial Intelligence

Can Digital Tools Detect ChatGPT-Inspired Cheating?

By Alyson Klein — January 27, 2023

Almost as soon as ChatGPT burst on the scene and stoked fears of widespread cheating, support for teachers also arrived in the form of detectors promising to sniff out writing generated by the artificial intelligence tool.

But just as ChatGPT sparked big questions about the purpose of teaching writing, the different ways it can be taught, and what it means to communicate or be creative, these detection tools come with their own potential problems.

These online cheating and plagiarism detectors make mistakes. Teachers need training to understand and cope with their limitations. Too much reliance on them may leave schools poorly positioned to teach writing in a post-ChatGPT world. And AI writing tools are almost certain to get better at eluding these digital whistleblowers.

“These types of detectors could be maybe one tool in an arsenal,” said Christopher Doss, a quantitative researcher at the Rand Corporation, a research organization. “I don’t think they would ever be the only tool, so that a teacher can just create a file of assignments, run it through the system, [and] get a 100 percent accurate yes or no, and then they move on with their life. I just don’t think it’s ever going to be that simple.”

There are already several programs that help identify AI-crafted writing, and many more could become available soon. The MIT-IBM Watson AI Lab developed a detector called GLTR. Packback, a learning platform, added an AI detection tool to its existing program. Even OpenAI, the developer of ChatGPT, has one. And Turnitin, a prominent plagiarism detector, is developing one. It’s generally unclear what the error rates are for the products currently available.

Detectors could launch an ‘AI Arms Race’

Even if these programs can accurately pinpoint whether work was produced by the current version of ChatGPT, a fresh iteration of the AI writing tool is due out later this year, potentially sending educators back to square one.

“It’s a little bit of an arms race,” said Andreas Oranje, the vice president of Assessment and Learning Technology for the Educational Testing Service’s research and development division. “Eventually, these models [like ChatGPT] will incorporate more human behavior, get smarter. And so, then they become a little bit better and the tools that were made to detect [AI writing] are no longer working.” He likened the process to bacteria evolving, thwarting antibiotics.

But Annie Chechitelli, the chief product officer for Turnitin, a company that offers plagiarism detection software widely used in K-12 schools, believes it will be possible to spot ChatGPT’s writing for quite some time.

The bot tends to use hackneyed phrases—think “it was a dark and stormy night”—and constantly repeats the same wording and ideas.

“We’re seeing other traces come out that I’m confident, for the near future, will remain,” Chechitelli said. Turnitin expects to release its own AI detection software in time for the 2023-24 school year.
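
The kind of repetition Chechitelli describes can be approximated with a crude measurement. The short Python sketch below is purely illustrative and is not how Turnitin or any other vendor actually works; it simply counts how often a passage reuses the same four-word phrase, so text that leans on stock wording scores higher.

```python
from collections import Counter

def repeated_phrase_rate(text: str, n: int = 4) -> float:
    """Fraction of n-word phrases that appear more than once in the text.

    A rough, illustrative proxy for "repeats the same wording." Not a real
    AI detector, and not a description of any commercial product.
    """
    words = text.lower().split()
    ngrams = [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
    if not ngrams:
        return 0.0
    counts = Counter(ngrams)
    repeated = sum(count for count in counts.values() if count > 1)
    return repeated / len(ngrams)

# Example: a passage that leans on the same stock phrase over and over.
sample = ("It was a dark and stormy night. The wind howled. "
          "It was a dark and stormy night, and the rain fell hard.")
print(f"repeated 4-gram rate: {repeated_phrase_rate(sample):.2f}")
```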

‘I can’t radically transform my classroom just yet’

Educators will ultimately need to figure out how to teach writing in a way that incorporates tools like ChatGPT, said Joshua Rosenberg, an assistant professor of STEM education at the University of Tennessee in Knoxville.

At some point, asking students to write without consulting AI will become “almost like requiring students not use the calculator when completing math problems,” he said. “That’s where this is going.”

But most educators aren’t prepared for such an abrupt transition, smack in the middle of the school year, Rosenberg said.

“I totally empathize with teachers who are like, ‘What the heck? It’s January! It’s been a crazy three years,’” Rosenberg said. “‘And I want to make sure that my students are understanding writing or English language arts concepts that I want them to learn and that [they] are expected to learn based on our state standards. I can’t radically transform my classroom just yet.’”

Some of the detection tools, though, aren’t user-friendly yet. Teachers who want to deploy one free detection program—GPT2 Output Detector, an open-source tool created with code from ChatGPT creator OpenAI—could be in for a frustrating process. It often crashes, according to some users. And in explaining its error rates, the tool uses technical jargon most educators won’t easily grasp.

That’s why two literacy-focused education technology nonprofit organizations, Quill.org and CommonLit.org, created an AI detector platform designed with teachers in mind called AIWritingCheck.org. It is essentially a teacher-tailored version of the GPT2 Output Detector, which the organizations say correctly identifies AI-generated writing roughly 80 to 90 percent of the time.
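
For readers curious about what sits underneath a tool like AIWritingCheck, the GPT-2 output detector that OpenAI released is open source and can be queried with a few lines of Python. The sketch below is a minimal example using the Hugging Face transformers library; the model name and output labels are assumptions about that public release, and nothing here reflects AIWritingCheck’s actual code or its error rates.

```python
# A minimal sketch of querying an open-source GPT-2 output detector with the
# Hugging Face transformers library. The model id and label names below are
# assumptions about OpenAI's publicly released detector, not a description of
# AIWritingCheck, and the tool's error rates and length limits still apply.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="roberta-base-openai-detector",  # assumed public model id
)

essay = "Paste a student essay (or a ChatGPT answer) here to compare scores."
result = detector(essay, truncation=True)[0]

# The detector returns a label (e.g., "Real" vs. "Fake") and a confidence
# score between 0 and 1: a probability, not a verdict.
print(f"{result['label']}: {result['score']:.2f}")
```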

The move comes in response to a survey of more than 750 educators who use Quill’s literacy platform. More than 80 percent said they are concerned about students using ChatGPT to complete their writing assignments. And even though the latest, headline-grabbing version of ChatGPT has only been around since late last year, 17 percent of educators surveyed said they had already seen students try to pass off the bot’s work as their own original writing.

In that environment, teachers need to have some sort of mechanism to detect whether an assignment has been outsourced to AI, even if “they’re not perfectly accurate,” said Peter Gault, Quill’s founder and executive director.

“We don’t think that this is a be all, end all solution,” Gault said of his organization’s platform. “There will be false positives and false negatives, and that’s something that needs to be taken really seriously. But we think this is a helpful stopgap for teachers to give them more data and information than they otherwise would have access to.”

Other types of tools would be helpful too, he added. For instance, developers could create one that analyzes keystrokes or various versions of a draft to decide whether a particular piece of writing was produced by a human or a robot, said Gault and Michelle Brown, CommonLit’s founder and chief executive officer.
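
No such draft-history tool exists among the products described here, but the underlying idea can be sketched: compare successive saved drafts and flag essays where a large block of polished prose appears in a single revision. The hypothetical Python snippet below illustrates that approach with the standard library’s difflib; the function and the example drafts are invented for illustration and are not taken from any vendor.

```python
import difflib

def largest_single_paste(draft_a: str, draft_b: str) -> int:
    """Return the length, in words, of the biggest block of new text that
    appears between two saved drafts.

    A hypothetical illustration of the draft-history idea: one huge insertion
    between drafts looks different from many small, incremental edits. On its
    own it proves nothing about who wrote the text.
    """
    a_words = draft_a.split()
    b_words = draft_b.split()
    matcher = difflib.SequenceMatcher(a=a_words, b=b_words)
    biggest = 0
    for op, _, _, j1, j2 in matcher.get_opcodes():
        if op in ("insert", "replace"):
            biggest = max(biggest, j2 - j1)
    return biggest

draft_1 = "My essay about the water cycle. Evaporation happens first."
draft_2 = draft_1 + " " + " ".join(["new"] * 300)  # 300 words appear at once
print(largest_single_paste(draft_1, draft_2))  # a big jump, worth a conversation
```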

That type of technology might help educators get to a middle ground, where they can use ChatGPT to inform some writing instruction, but also expect students to do their own original work, Gault said.

“Right now, [ChatGPT is] either banned or it’s totally embraced,” Gault said. “How can we use the tool while still ensuring academic authenticity in the classroom?”

Teachers must ‘recognize the tool may be wrong’

Teachers who rely on these detectors need to be aware of their limitations, Rosenberg said.

It would be unfortunate for a detector to erroneously conclude that a bot-crafted essay was human-produced. But it could be even worse for a student who completed an assignment honestly to be accused of using tech to cheat, Rosenberg said.

That “could put a student through a really negative, possibly humiliating process,” Rosenberg said.

Teachers who don’t want their students using ChatGPT as a writing tool need to make that expectation clear from the outset, Rosenberg said.

If a student’s essay is flagged by an AI detector, teachers should see that as a “starting point for a conversation”—not a final verdict, Rosenberg said. Teachers could ask students to tell them about their writing process and “recognize that the tool might be wrong,” he said.

The truth could also be complicated. Students may have used AI as a starting point for generating ideas, or employed a tool like Grammarly, which may rewrite sentences to make them more coherent, Rosenberg said.

There are other clues that teachers could look for to figure out if a student relied, at least in part, on AI to complete an assignment, Doss said.

“In a really kind of black and white case, if you have a student who has struggled to write, and then they give you a really, really nicely written [piece], you might be suspicious as to whether or not they were actually the ones that created the product,” Doss said.

While Brown believes ChatGPT and tools like it will eventually be part of writing instruction, she said students will miss out if they rely too heavily on AI to do their work for them.

“Writing and learning how to write helps us learn how to think and it helps you organize your thoughts and helps you generate language,” she said. “I am not ready to say that all of education should just throw in the towel on the way we’ve taught writing and thinking and organization just because [developers] made cool AI.”
