When philosophy professor Darren Hick came across another case of cheating in his classroom at Furman University last semester, he posted an update to his followers on social media: "Aaaaand, I've caught my second ChatGPT plagiarist."
Friends and colleagues responded, some with wide-eyed emojis. Others expressed surprise.
"Only 2?! I've caught dozens," said Timothy Main, a writing professor at Conestoga College in Canada. "We're in full-on crisis mode."

Practically overnight, ChatGPT and other artificial intelligence chatbots have become the go-to source for cheating in college.
Now, educators are rethinking how they'll teach courses this fall, from Writing 101 to computer science. They say they want to embrace the technology's potential to teach and learn in new ways, but when it comes to assessing students, they see a need to "ChatGPT-proof" test questions and assignments.
For some instructors that means a return to paper exams, after years of digital-only tests. Some professors will be requiring students to show editing history and drafts to prove their thought process. Other instructors are less concerned. Some students have always found ways to cheat, they say, and this is just the latest option.
An explosion of generative AI chatbots including ChatGPT, which launched in November, has raised new questions for academics dedicated to making sure that students not only can get the right answer, but also understand how to do the work. Educators say there is agreement, at least, on some of the most pressing challenges.

Are AI detectors reliable?
Not yet, says Stephanie Laggini Fiore, associate vice provost at Temple University. This summer, Fiore was part of a team at Temple that tested the detector used by Turnitin, a popular plagiarism detection service, and found it to be "incredibly inaccurate." It worked best at confirming human work, she said, but was spotty in identifying chatbot-generated text and least reliable with hybrid work.
Will students get falsely accused of using artificial intelligence platforms to cheat?
Absolutely. In one case last semester, a Texas A&M professor wrongly accused an entire class of using ChatGPT on final assignments. Most of the class was subsequently exonerated.

So, how can educators be certain if a student has used an AI-powered chatbot dishonestly?
It's nearly impossible unless a student confesses, as both of Hick's students did. Unlike old-school plagiarism, where text matches the source it is lifted from, AI-generated text is unique each time.
In some cases, the cheating is obvious, says Main, the writing professor, who has had students turn in assignments that were clearly cut-and-paste jobs. "I had answers come in that said, 'I am just an AI language model, I don't have an opinion on that,'" he said.
In his first-year required writing class last semester, Main logged 57 academic integrity issues, an explosion of academic dishonesty compared to about eight cases in each of the two prior semesters. AI cheating accounted for about half of them.

What happens next?
This fall, Main and colleagues are overhauling the school's required freshman writing course. Writing assignments will be more personalized to encourage students to write about their own experiences, opinions and perspectives. All assignments and the course syllabi will have strict rules forbidding the use of artificial intelligence.
College administrators have been encouraging instructors to make the ground rules clear.
Many institutions are leaving it to instructors to decide whether to allow chatbots in the classroom, said Hironao Okahana, the head of the Education Futures Lab at the American Council on Education.
At Michigan State University, faculty are being given "a small library of statements" to choose from and modify as they see fit for their syllabi, said Bill Hart-Davidson, an associate dean in MSU's College of Arts and Letters, who is leading AI workshops for faculty to help shape new assignments and policies.
"Asking students questions like, 'Tell me in three sentences what is the Krebs cycle in chemistry?' That's not going to work anymore, because ChatGPT will spit out a perfectly fine answer to that question," said Hart-Davidson, who suggests asking questions differently. For example, give a description that has errors and ask students to point them out.

How AI impacts studying
Evidence is piling up that chatbots have changed study habits and how students seek information.
Chegg Inc., an online company that offers homework help and has been cited in numerous cheating cases, saw its shares tumble nearly 50% in May after CEO Dan Rosensweig warned that a spike in student use of ChatGPT was hurting the company's growth. He said students who normally pay for Chegg's service were now using the AI platform for free.
At Temple this spring, the use of research tools like library databases declined notably following the emergence of chatbots, said Joe Lucia, the university's dean of libraries.
"It seemed like students were seeing this as a quick way of finding information that didn't require the effort or time that it takes to go to a dedicated resource and work with it," he said.
Shortcuts like that are a concern partly because chatbots are prone to making things up, a glitch known as "hallucination." Developers say they are working to make their platforms more reliable but it's unclear when or if that will happen. Educators also worry about what students lose by skipping steps.
"There is going to be a big shift back to paper-based tests," said Bonnie MacKellar, a computer science professor at St. John's University in New York City. The discipline already had a "massive plagiarism problem" with students borrowing computer code from friends or cribbing it from the internet, said MacKellar. She worries intro-level students taking AI shortcuts are cheating themselves out of skills needed for upper-level classes.
"I hear colleagues in humanities courses saying the same thing: It's back to the blue books," MacKellar said. In addition to requiring students in her intro courses to handwrite their code, the paper exams will count for a higher percentage of the grade this fall, she said.

How students are responding
Ronan Takizawa, a sophomore at Colorado College, had never heard of a blue book. To the computer science major, paper exams feel like going backward, but he agrees they would force students to learn the material. "Most students aren't disciplined enough to not use ChatGPT," he said. Paper exams "would really force you to understand and learn the concepts."
Takizawa said students are at times confused about when it's OK to use AI and when it's cheating. Using ChatGPT to help with certain homework like summarizing reading seems no different from going to YouTube or other sites that students have used for years, he said.
Other students say the arrival of ChatGPT has made them paranoid about being accused of cheating when they haven't.
Arizona State University sophomore Nathan LeVang says he double-checks all assignments now by running them through an AI detector.
For one 2,000-word essay, the detector flagged certain paragraphs as "22% written by a human, with mostly AI voicing."
"I was like, 'That is definitely not true because I just sat here and wrote it word for word,'" LeVang said. But he rewrote those paragraphs anyway. "If it takes me 10 minutes after I write my essay to make sure everything checks out, that's fine. It's extra work, but I think that's the reality we live in."
___
The Associated Press education team receives support from the Carnegie Corporation of New York. The AP is solely responsible for all content.
___
Growing demand for AI skills will transform these 10 jobs

In the latest chapter of the evolution of artificial intelligence in society, automation has been increasingly used by businesses and industries in recent years: on factory floors, in logistics warehouses, and in software development.
After decades of development and investment surrounding AI, the last several months have seen the introduction of more complex and conversational AI, a level of automation in which the software makes and executes decisions without humans, accelerating adoption in the labor market. Even a significant portion of teachers say they've used it: The Walton Family Foundation and Impact Research surveyed 1,002 K-12 teachers and found that 51% had used ChatGPT within two months of its launch.
Freshworks analyzed public job posting data that Revelio Labs compiled from popular hiring platforms between January 2022 and March 2023 to identify the jobs whose postings most often seek workers with artificial intelligence skills.
These AI tools take prompts from human users and churn out answers or products in a fraction of the time it would require a human to do the same work. And seemingly every week, entrepreneurs are bringing more complex AI-based tools to market—so much so that a cottage industry of newsletter publications has cropped up to track and share each one that's released.
Some of these tools will generate videos based on text inputs from users describing the video they want to be created. Others will produce fantastical images that never actually existed. Popular stock photo provider Shutterstock recently launched an AI image generator of its own.
Business leaders have touted the tools as potentially able to enhance worker productivity, allowing employees to create more products in less time and with fewer resources. When it comes to their customers, leaders are exploring the potential of AI-powered chatbots and automations to enhance customer experiences by speeding up response times, personalizing messaging, facilitating outreach, and more.
In an open letter published in March, Bill Gates, the billionaire co-founder of Microsoft, called the development of AI "as fundamental as the creation of the microprocessor, the personal computer, the Internet, and the mobile phone."
Others, including SpaceX founder Elon Musk, have cautioned that moving too fast in developing these tools could be dangerous and harmful to society.
Job postings from the last year show employers in certain industries aren't keen to fall behind if the technology truly ends up being as consequential as the soothsayers preach. Many industries are seeking workers with knowledge of how to best use these tools, including in finance and science. Still, software-related tech jobs have taken the lead in mentioning the technology when hiring new talent.
According to Revelio data, engineering roles and jobs that work with data rank highest among positions where employers sought AI skills in job postings over the last year. The largest share of those postings is for jobs in major cities on the U.S. coasts.
Quality assurance tester

- Share of postings mentioning AI: 5.30%
- Top states where QA tester AI job postings originate:
1. West Virginia: 10.64% of total
2. Washington D.C.: 10.22% of total
3. Louisiana: 9.12% of total
Investment specialist

- Share of postings mentioning AI: 5.59%
- Top states where investment specialist AI job postings originate:
1. Virginia: 13.78% of total
2. Delaware: 10.88% of total
3. Iowa: 8.34% of total
Business operations

- Share of postings mentioning AI: 6.24%
- Top states where business operations AI job postings originate:
1. Washington D.C.: 10.34% of total
2. Rhode Island: 10.14% of total
3. Oregon: 9.58% of total
Solutions specialist

- Share of postings mentioning AI: 6.30%
- Top states where solutions specialist AI job postings originate:
1. Washington D.C.: 12.21% of total
2. Rhode Island: 10.39% of total
3. New Jersey: 9.56% of total
Infrastructure engineer

- Share of postings mentioning AI: 6.40%
- Top states where infrastructure engineer AI job postings originate:
1. Rhode Island: 9.56% of total
2. Virginia: 9.34% of total
3. New Mexico: 9.33% of total
Product manager

- Share of postings mentioning AI: 7.21%
- Top states where product manager AI job postings originate:
1. Washington: 10.92% of total
2. Louisiana: 9.50% of total
3. California: 9.12% of total
Application engineer

- Share of postings mentioning AI: 7.43%
- Top states where application engineer AI job postings originate:
1. Idaho: 40.73% of total
2. Washington D.C.: 12.73% of total
3. Washington: 10.96% of total
Software engineer

- Share of postings mentioning AI: 11.68%
- Top states where software engineer AI job postings originate:
1. Washington: 19.51% of total
2. Washington D.C.: 16.12% of total
3. Maryland: 16.02% of total
Scientist

- Share of postings mentioning AI: 14.22%
- Top states where scientist AI job postings originate:
1. Washington: 36.35% of total
2. Montana: 22.74% of total
3. California: 20.93% of total
Data analyst

- Share of postings mentioning AI: 25.66%
- Top states where data analyst AI job postings originate:
1. Washington: 40.07% of total
2. California: 32.87% of total
3. Virginia: 32.05% of total
This story originally appeared on Freshworks and was produced and distributed in partnership with Stacker Studio.