How Old-School Testing Is Helping Stop Modern-Day Cheating
A lecture hall at the University of Texas at Austin on Feb. 22, 2024. (Brandon Bell/Getty Images)
By Aaron Gifford
4/21/2026 | Updated: 4/21/2026

At some point over the past century, a classic academic tradition was abandoned.

Fast forward past the creation of the calculator, personal computers, the internet, and ubiquitous digital learning environments.

At a time when so many college students rely on the latest version of ChatGPT for help, a Cornell University professor found a solution so old that it was new again: oral exams.

“Academic integrity was always a major concern, but the bar to violating it was just so low,” Christopher Schaffer, who teaches biomedical engineering, told The Epoch Times.

He said a return to more personal engagement in the classroom benefits both students and teachers.

“It’s about fairness and assessment,” he said, “not just cracking down on cheating.”

Schaffer developed the idea over three years, comparing students’ biomedical engineering assignment results with ChatGPT’s.

Between 2022 and 2025, new versions of the generative artificial intelligence (AI) tool improved to the point that it consistently outperformed all students. That caused some alarm, the professor said. He added the oral exam requirement to his five-credit, 300-level course ahead of last semester.

Christopher Schaffer, professor of biomedical engineering at Cornell University. (Courtesy of Jason Koski/Cornell University)


Restoring the Circle of Learning


Schaffer’s class provides instruction and labs in which students learn how to design electronic medical devices that operate with signals. Six take-home assignments are still a course requirement, but instead of just submitting a paper explaining a solution to a problem, students must defend their research, concepts, and applications in a 20-minute discussion.

The assignments allow for some early collaboration with others and the use of AI as a starting point to identify sources of information, but all students must explain what they know and how they know it to a professor or teaching assistant. All the written materials used in their research are submitted as well.

For example, Schaffer explained, students might be assigned to design a circuit used in a sensor that detects eyelid spasms, which would include an explanation of a signal-producing algorithm. They can provide a diagram to illustrate their code-writing, but the remainder of the assessment is explained live and out loud.

None of the problems assigned has just one correct design or answer. The technology the students are learning about has “trade-offs and alternatives,” Schaffer said.

The completed oral assessments are scored on a scale of one (unsatisfactory) to four (excellent).

In addition to ensuring academic integrity, this process of researching, preparing, and rehearsing—a circle of learning—goes a long way in helping students learn more and build confidence. Many students have limited experience in public speaking. Those who struggle because they are nervous are allowed a do-over, Schaffer said.

“Broadly speaking, it went really well,” he said, noting that he was pleased with the class’s performance and positive feedback from students, another professor, and graduate students who helped with this initiative. “The vast majority really appreciate in-the-moment feedback on their understanding.”

These days, oral assessments are mostly limited to learning exercises in some classes or dissertations for doctoral candidates in certain fields, Schaffer said. But he said he believes that interest in this method will grow across higher education, in both STEM (science, technology, engineering, and math) and the humanities. Other faculty members and staff told him they support the concept and may eventually follow suit.

Fighting Fire With Fire


In a related initiative at Cornell aimed at curbing student AI use, German language instructor Grit Matthias Phelps requires students to use manual typewriters in class to avoid any online help with language translation, university officials confirmed.

At New York University, business professor Panos Ipeirotis is "fighting fire with fire" by requiring students to use a conversational AI tool that asks them to explain their assignments orally, he said in a Dec. 29 blog entry on his personal website.

He called the results of this initiative “illuminating.”

The New York University campus in New York City on April 24, 2012. (The Epoch Times)

“Many students who had submitted thoughtful, well-structured work could not explain basic choices in their own submission after two follow-up questions,” Ipeirotis wrote in his blog. “Some could not participate at all. This gap was too consistent to blame on nerves or bad luck. If you cannot defend your own work live, then the written artifact is not measuring what you think it is measuring.”

At Hillsdale College in Michigan, English professor Patricia Bart leads a writing “boot camp” that’s essentially AI-proof.

In the two Great Books core courses, Bart requires students to first write and orally present a prospectus on the papers they’re working on. This includes the defense of main ideas and an explanation of the sources used. Additionally, rough drafts are required, and markups of those drafts must be submitted along with the final papers.

Students also participate in panel discussions during which they are challenged to understand the material before engaging in constructive debate with their peers.

“You can’t outsource your intellect,” she told The Epoch Times.

Bart said she learned this approach as an undergraduate student at the University of Pittsburgh in the 1980s. Her instructor was a Rhodes scholar who shared teaching methods from Oxford University.

The campus of Hillsdale College, in Hillsdale, Mich., on April 6, 2023. (Chris duMond/Getty Images)

Bart has taught courses with high concentrations of writing and oral presentations for decades now, but in recent years, she tweaked some steps to counter the use of AI platforms, particularly Grok.

She wants to restore rigorous levels of reading, writing, public speaking, and civic engagement that have been absent for generations.

“It is extra work, and it can be painful at first,” she said, “but it bears fruit.”

Mixed Opinions on AI


A 2025 California State University system survey of 94,060 students, faculty members, administrators, and staff across 22 public universities in the Golden State found that 95 percent of all respondents used at least one AI tool and that 84 percent of students used ChatGPT for assignments.

Eighty-two percent of student respondents also said they worried about job security because of AI.

“Even though I don’t want to use it, I have to,” an anonymous student majoring in computer science said in the survey, “because if I don’t, I’ll be left behind, and that is the last thing someone would want in this stupid job market.”

Not all students embrace ChatGPT or similar products.

(Oleksii Pydsosonnii/The Epoch Times)

Liberty Holt, a college freshman from Gainesville, Florida, said she experimented with ChatGPT when it first came out, mainly for more pinpointed internet queries or a detailed explanation of concepts she didn't quite grasp. She was a high school student at the time.

“I wasn’t always staunchly against it,” Holt, a linguistics major at Florida’s Santa Fe College, told The Epoch Times, “but now I don’t use it at all.”

As the tool improved and Holt pursued deeper learning, it became an ethical issue for her. Most of her college friends feel the same way, even though some professors at her college tolerate or even advocate some ChatGPT use for assignments. She applauds professors who’ve found creative ways to combat AI use, such as requiring in-class essays to be completed by pen and paper or giving quizzes on platforms that lock out multiple browsers.

She said she’s astounded—and disappointed—by how quickly teachers and learners have come to rely on artificial intelligence.

“You’re relying on something to do the learning for you,” Holt said. “A few years ago, when we didn’t have this, we were still able to do our jobs.

“I don’t think it can ever replicate the qualities we have. Certain qualities will never be replaced—connections and emotions.”

Aaron Gifford has written for several daily newspapers, magazines, and specialty publications and also served as a federal background investigator and Medicare fraud analyst. He graduated from the University at Buffalo and is based in Upstate New York.