Manufactured thinking: The effects of AI in the classroom
Graphic by Easton Clark, Photo Editor
Few phenomena have had as big an impact on contemporary society as personal generative AI. Beginning with the initial release of ChatGPT by OpenAI in 2022, the market for generative AI has skyrocketed in value, with a projected global worth of $1.3 trillion by 2032.
AI has especially found a home among students. The Pew Research Center found that in 2024, 26% of students ages 13 to 17 used ChatGPT for their schoolwork, double the 13% reported in 2023. The share is even higher at universities: OpenAI told Fortune that more than a third of college students regularly use ChatGPT. And with more AI services entering the market, especially from established tech companies like NVIDIA and Microsoft, the number of users will likely keep growing, including at Chapman.
With Chapman’s introduction of Panther AI, a program designed to stay ahead of the curve by providing enhanced security and protecting intellectual property, many students have raised concerns about the privacy risks associated with it.
Rowan Eiselman, a senior majoring in community education studies, took the course Writing in the Legal Context last semester. It was her introduction to generative AI not only being permitted in academia, but also encouraged. To Eiselman’s surprise, her professor made it known that the legal field is increasingly relying on AI to draft memoranda.
She emphasized that AI is being presented as a complement to her work rather than a replacement for her thinking: those who learn how to leverage its capabilities, the reasoning goes, will have a competitive edge in the job market. One way she uses it is in drafting proposals, where she inputs research questions and lets the AI analyze potential limitations and suggest ways to refine them.
Because its accessibility makes it a default resource for some students, it is important to recognize the danger of overreliance: leaning too heavily on generative AI can erode critical thinking and conscious decision-making.
“Since taking courses that encourage its use, I have noticed urges to refer to it in other aspects, which is not what I initially wanted,” Eiselman told The Panther. “It is a current issue that will only continue to grow, and I want to work on ensuring I do not let it become a first instinct.”
Amy Andujar, a freshman majoring in film and television production, also regularly incorporates generative AI into her coursework, most notably in her Film History class, where she often relies on ChatGPT to help break down complex readings and research materials.
She said she can complete assignments more efficiently and with greater accuracy, since AI can strip away unnecessary information from readings.
Andujar also worries that overusing AI could leave cognitive skills underdeveloped. A recent connectivity analysis conducted by the Massachusetts Institute of Technology (MIT) found that writing without assistance produces higher cognitive load and stronger executive control, while writing with AI assistance reduces overall neural connectivity.
In Andujar’s experience, professors are more likely to discourage the use of AI in class, opting instead for an approach rooted in one-on-one interaction.
“Many of my professors don’t encourage using AI to do our assignments, and they encourage us to reach out to them with questions and help if we feel tempted to use it,” said Andujar.
Faculty have often adapted to generative AI on a case-by-case basis. Political science professor Ronald Steiner detailed how he has modified both coursework and testing to keep students from relying too heavily on programs like ChatGPT.
Since the proliferation of personal computers and the COVID-19 pandemic, an increasing amount of testing at colleges has been conducted online, including at Chapman. Professors like Steiner have increasingly relied on lockdown browsers to minimize their students’ reliance on generative AI for testing, although he expressed concerns that students may be able to work around those measures.
Steiner also explained that he assigns more in-class writing, which he believes lets students test their own writing and understanding of the material rather than giving them the opportunity to rely on generative AI to do the work for them.
“It's more spontaneous. They only have five minutes or so in class to do it, and I use it to get a reality check on what (their) writing looks like when I'm asking (them) to do it spur of the moment,” said Steiner.
Despite this, Steiner still believes it is important for students to gain a greater understanding of generative AI for the sake of their careers, and there is evidence to support that belief.
A National University study found that 30% of current jobs in the U.S. could be automated and replaced by AI by 2030. Entry-level jobs are especially vulnerable; the study found nearly 50 million of them are at risk. And even in the jobs that remain, workers should expect salaries to decline as AI integrates further into the workforce.
Steiner stressed the importance of adapting to workforce expectations by learning how to use generative AI efficiently, comparing it to many of the general education standards at Chapman.
“To me, it is maybe the 21st century version of being able to read and write with competency,” said Steiner. “It's just a part of the skill set that you have to have. So we have to be open to that. To deny it (and) to tell students they can't use it would be a disservice to the students.”
At the same time, he emphasized that students shouldn’t shortchange their own education by abusing generative AI to do all their coursework.
“On the other hand, you can't learn critical thinking if you don't exercise those muscles, right? If you had a robot around the house that did all of your heavy lifting for you, yeah, that would be easy, convenient and efficient, but your muscles would atrophy,” said Steiner.
Only time will tell where generative AI is headed, but as far as students and professors alike are concerned, what it has already shown will have a lasting impact on education at Chapman.