Artificial Intelligence

1 in 3 College Applicants Used AI for Essay Help. Did They Cheat?

By Alyson Klein — July 18, 2024 8 min read

Last spring, Makena, then a high school senior, was deep into cranking out some 70 essays for 20 college applications when her creativity started to wane.

So, she turned to a high-tech brainstorming partner: artificial intelligence.

One essay prompt asked Makena to describe a class she’d want to teach if she were a college professor. “I had no idea,” said Makena, who asked to be identified only by her first name to speak candidly about the admissions process. “I had never thought about it.”

She put her intended major and some favorite topics into an AI tool, which spit out a list of potential courses. Makena selected one and crafted her essay around it, without any further AI assistance.

In Makena’s mind, this wasn’t cheating.

“I wrote my own essays, 100 percent,” she said. After all, she could have found the same information on Google or by picking up a course catalog. AI was just more efficient.

About a third of high school seniors who applied to college in the 2023-24 school year acknowledged using an AI tool for help in writing admissions essays, according to research released this month by foundry10, an organization focused on improving learning.

About half of those students—or roughly one in six students overall—used AI the way Makena did, to brainstorm essay topics or polish their spelling and grammar. And about 6 percent of students overall—including some of Makena’s classmates, she said—relied on AI to write the final drafts of their essays instead of doing most of the writing themselves.

Meanwhile, nearly a quarter of students admitted to Harvard University’s class of 2027 paid a private admissions consultant for help with their applications.

The use of outside help, in other words, is rampant in college admissions, opening up a host of questions about ethics, norms, and equal opportunity.

Top among them: Which—if any—of these students cheated in the admissions process?

For now, the answer is murky.

Colleges permit students to use pricey admissions coaches. But they are mostly silent on how AI can be used in crafting essays.

That’s created “this ethical gray area that students and [high school] counselors don’t have any guidance” on how to navigate, said Jennifer Rubin, a senior researcher at foundry10 and the lead author of the report.

A ‘double standard’ on college admissions

Generative AI tools like ChatGPT have put a high-tech twist on decades-old questions of fairness in the college admissions process.

The system has “never been a level playing field,” Rubin said, citing the advantages that mostly benefit wealthier students, such as SAT tutors, paid college admissions coaches, and savvy, college-educated parents. “I think [AI] is just complicating it a little bit more because it’s a tool that’s readily available to everyone.”

To get a sense of the public’s perceptions of AI in college admissions, foundry10 included an experimental portion in its survey.

Participants reviewed an identical portion of a college essay. But one group was told that the applicant had help from ChatGPT in brainstorming ideas, refining content, and polishing the final draft—tasks that included the kind of brainstorming help Makena used AI for.

Another group was told the applicant got assistance with the same parts of the writing process, from a paid college admissions coach. A third group was informed that the student worked entirely alone.

Participants rated the applicant who used ChatGPT as less authentic, less ethical, and less likable than the student who paid for professional help. (The student who worked solo got the highest ratings.)

Rubin perceives a “double standard” at work.

A student who can pay “thousands of dollars to someone who has the knowledge of how a [particular college] works and what’s needed or wanted in a college admissions essay is going to have an undue advantage,” she said.

College admissions coaching services typically cost from $60 to $349 per hour, according to data cited in Rubin’s report from Prepmaven, an admissions-preparation company.

The website of one such service, Ivy College Essay, advertises its Harvard connections. For between $1,500 and $4,800, depending on the number of applications, students receive help in brainstorming topics and “extensive written notes, comments, and guidance, focusing on both content and structure,” according to the site.

“We go back and forth as many times as needed until we have a very strong and solid Ivy League college application!” the company promises.

Assistance from ChatGPT on similar tasks “probably isn’t going to be as strong” as what such a service offers, Rubin said. “But it might provide students some form of feedback that they might not be able to get in their lives because they don’t have parents or caregivers” who have the savvy to help.

These issues are especially personal for Rubin, a first-generation college graduate who attended a private high school on scholarship. She had the help of her school counselors in applying to college.

But that assistance couldn’t make up for the gap between Rubin and many of her peers with highly educated parents, who could offer all sorts of support, she said.

Big questions on AI use go mostly unanswered by colleges

For now, high school counselors aren’t sure what to tell their students when it comes to how AI can be ethically used in the admissions process.

“My seniors have come to me and said, ‘Hey, I’ve got to write an essay about this. Where do I even start?’ Or ‘is it OK if I use ChatGPT?’” said Melissa Millington, a school counselor in Missouri. “I just really hit on, you cannot pass that off as your own work, because that’s not ethical.”

But, like Rubin, she sees some potential for the technology in crafting applications, so long as its use stops short of making AI a sole, uncredited ghostwriter.

“If you are going to use it to get a starting point, that’s totally fine,” she said she’s told students. “Or if you want to write your essay, and then put it in there and ask it to clean [the] grammar,” that’s likely fair game.

While most colleges and universities are silent on the AI issue, some individual institutions have given applicants the green light to use AI in a limited fashion.

Caltech, one of the country’s most prestigious institutions focused on science, math, engineering, and technology, tells prospective students that it’s unethical to copy and paste an essay written entirely by generative AI. But it is acceptable to use AI to brainstorm or check grammar and spelling, the college says.

The Georgia Institute of Technology, another highly regarded STEM-focused university, has a similar policy.

“If you choose to utilize AI-based assistance … we encourage you to take the same approach you would when collaborating with people,” the school’s website says. “Use it to brainstorm, edit, and refine your ideas.”

But for other colleges, any use of AI is unacceptable, at least officially. Brown University, for instance, cites its fraud policy and tells applicants that the use of AI is “not permitted under any circumstances.”

‘It’s always been on the honor system’

Brown and other institutions have no real way of enforcing those policies, Rubin said.

AI detectors are notoriously unreliable. And they are disproportionately likely to flag writing by students who are not native English speakers, even if they didn’t use AI.

In fact, Kristin Woelfel, a policy counsel specializing in equity in civic technology for the Center for Democracy & Technology, a nonprofit organization that aims to shape technology policy, has gone so far as to say the detectors have the potential to violate students’ civil rights.

It doesn’t really matter if colleges have guidelines that prohibit AI use, Rubin said, because there’s no way to check on what kind of assistance an applicant received, human or not.

“It’s always been on the honor system,” she said.

Colleges that haven’t outlined their policies on AI in the application process are ignoring the obvious—and making life harder for high school counselors and their students, said Maritza Cha, who worked as a school counselor in Southern California for nearly a decade and has taught high school counseling as an adjunct professor.

“We’re at the point of either you can kind of put your head down in the sand and pretend it’s not happening, which is not realistic,” Cha said. “Or you can just acknowledge that they’re using some kind of AI” in the admissions process.

Counselors can model proper use of AI in the college search

While much of the work in setting clear guidelines needs to happen at the college level, there are steps high school educators can take.

Rubin believes that if counselors and teachers are serious about leveling the playing field between first-generation college students from low-income families and their peers, it could help to show students how generative AI can be used ethically in the college admissions process.

For instance, students could put areas of study they are interested in and a desired geographic region into a tool like ChatGPT and ask for recommendations on where to apply.

“Generative AI can provide them some really concrete information,” Rubin said. Even though students should check that information against authoritative sources, it can help them narrow their search.

Students can even have a “conversation back and forth” with AI if they don’t have access to a college counselor at school who can meet with them consistently, she said.

And counselors can model how to use AI to spur students’ creativity or proofread final drafts, without crossing the line into wholesale cheating, she said.

But, ultimately, high school educators and college officials need to have conversations about what responsible use of AI looks like, including in crafting college applications, Rubin said.

In Rubin’s view, those discussions should acknowledge that many students already have access to other types of help—whether that’s from professional consultants or parents and older siblings familiar with the process of applying to college.

Makena, for instance, thinks she can write a stronger, more personal essay than anything ChatGPT could cook up. She didn’t feel the need to pay a private counselor either, since she wanted to rely on her own voice as much as possible.

She did, however, have a low-tech, presumably cost-free assistant: her father, who edited all 70-plus of her essays.

A version of this article appeared in the August 14, 2024 edition of Education Week as 1 in 3 College Applicants Used AI For Essay Help. Did They Cheat?
