Northwestern community members discuss use of AI program ChatGPT to write academic papers


Illustration by Iliana Garner

ChatGPT’s ability to efficiently write detailed responses to directed prompts with human-like prose threatens academic integrity and development, according to some Northwestern community members.

Julian Andreone, Reporter

Essay bots may be the next industry to be automated away, thanks to AI program ChatGPT. 

Artificial intelligence can be used to perform a variety of human-like tasks, including turning information from the internet into detailed written responses, such as essays.

AI program ChatGPT, launched by OpenAI last November and already one of the most popular chatbot models, has sparked debate regarding the potential impact of AI writing programs on academic integrity. The bot can generate a seemingly endless stream of detailed, evidence-style responses to directed prompts.

History Prof. Michael Allen said AI writing programs do not stimulate intellectual activity or offer unique interpretations.

“To me, the whole point of learning (is) to know yourself, to understand the problems you’re confronted with in real time,” Allen said. “It’s not about tests. It’s not about grades. It’s not about degrees. Those things come naturally as part of the process. But the real value is not even practical at all. It’s psychological, emotional (and) cognitive.” 

However, some students believe AI can enhance their writing. 

Weinberg senior Avery Keare, head of the Education Committee for the Responsible AI Student Organization, which examines the ethics of AI, said students can use AI programs to brainstorm ideas or proofread writing, uses that could have a positive impact on students' learning.

“I think that these tools can be used as collaborators where, ultimately, the student is still in charge, and is still making the executive, creative and critical thinking decisions,” Keare said.

She said AI's rising influence in academia and these programs' ability to efficiently aggregate information call conventional educational methods into question.

She added that universities may need to reassess how courses help students develop critical thinking skills.

“ChatGPT should be a wake-up call for both students and professors to really examine what they’re trying to teach and what we’re trying to learn,” Keare said.

English Prof. Daisy Hernández said when she taught at Miami University, students in her nonfiction writing class attended a workshop on AI writing programs. 

Hernández said the AI program her class used demonstrated knowledge of conventional storytelling methods.

“It was, on one hand, really pleasurable to just have that experience of having this interaction with the AI system, but then it also raised a lot of questions for us about what assumptions (the) algorithms make in terms of how we tell stories,” Hernández said.

She suggested that her experience with the technology was “fantastic all around,” but that, because AI writing programs rely on basic conventions, they do not produce particularly creative work. 

While ChatGPT can transform patterns drawn from vast amounts of internet text into a generated essay on nearly any topic, professors are concerned the technology cannot offer unique perspectives.

Allen said using AI programs can misrepresent a student’s individuality.

“You’re merging your name, yourself, your identity, your standing and whatever school or institution you might be part of, with this autonomous technology that’s not your own,” Allen said. “And it strikes me as, in that way, really alienating from yourself.”

By using a computer program instead of individual analysis, students may have trouble applying critical thinking skills after college, he added.

He compared students using the program to adolescents presenting an image of themselves on social media that is inconsistent with reality. 

“I get some of the appeal, especially for young people who don’t really know who they are,” Allen said. “But, then you’re never going to learn. You’re never going to actually become who you want to be. You’re just going to be kind of purchasing a fake version of yourself.”

Email: [email protected]

Twitter: @JulianAndreone

Related Stories: 

New student group spreads awareness of artificial intelligence ethics, big data injustices

Letter to the Editor: Open letter by Northwestern faculty in support of academic integrity

Assistant dean reprimands students in faculty email for threefold academic integrity violations