The rapid rise of generative artificial intelligence (AI) platforms over the past two years has shifted how some students approach coursework.
From brainstorming essay ideas to reviewing complex patient case studies, AI tools are becoming embedded in academic routines. But as usage grows, so do questions about ethics, authorship and academic integrity.
“I use ChatGPT to help organize my thoughts before writing a paper,” said Truman Roth, a sophomore supply and value chain management major. “Sometimes I have a lot of ideas, but don’t know how to structure them, so I’ll ask it to create an outline or suggest ways to make my argument clearer before I start writing.”
For some students, AI functions as a digital tutor. Blake Burt, a sophomore criminal justice major, said he uses AI to explain difficult assignments in simpler terms.
“It’s like having someone walk you through the assignment step by step,” Burt said. “If a prompt is long or confusing, I’ll have it rewritten in simpler terms so I can clearly see what the essay or research paper is asking. Once I understand the expectations better, it’s easier to organize my ideas and start writing.”
Carly Watson, a sophomore nursing major, said she uses an AI tool to quiz herself on her own notes and generate practice tests before major exams in subjects such as pharmacology and patient care procedures. The technology turns dense material into mock multiple-choice questions and short-answer prompts, helping her identify weak spots and review key concepts in a more interactive way.
“It doesn’t replace my studying,” Watson said. “I still read the chapters and go through all my notes first. But having it create practice quizzes from my notes helps me see what I know and what I need to go back and review.”
Inconsistencies in classroom AI policies
TCU’s academic integrity policy does not ban AI, but students are expected to submit original work rather than work generated by AI.
Nationally, universities are grappling with similar concerns. Some institutions have integrated AI literacy into coursework, teaching students how to use tools responsibly rather than banning them.
At TCU, some professors encourage students to experiment with AI tools, while others prohibit AI use entirely, creating confusion about what is allowed and what constitutes academic misconduct.
Keith Whitworth, a sociology professor at TCU’s AddRan College of Liberal Arts, said he does not view AI use as a major concern. Instead, he believes that the responsibility lies with professors to rethink how assignments are designed.
“It is a new technology, and we are all learning how and when to use it,” Whitworth said.
Rather than banning AI tools, Whitworth said professors should design assignments that incorporate them in ways that still develop students’ critical thinking and writing skills.
“Submitting a paper that was created with a prompt and taking credit for writing it is academic dishonesty,” Whitworth said. “We should not have assignments that provide the opportunity for this to happen. The assignments have to be designed to allow for AI in a controlled manner.”
Whitworth believes universities should focus on helping students become “AI literate” rather than discouraging them from using technologies that are now so common in everyday life.
Sophomore interior design major Gabriela Burgess said she chose not to use AI tools because of her personal beliefs about academic integrity.
“I prefer not to use AI tools because relying on them goes against my personal values and my commitment to producing my own original work,” Burgess said.
Burgess said she has seen different approaches to AI policies in her classes. In one course, she said students were given the option to either incorporate AI tools into a project or complete an alternative assignment without using AI if they felt uncomfortable with it.
What’s next for AI at TCU
TCU is working to provide clearer guidance for both students and faculty, said Provost and Vice Chancellor of Academic Affairs Floyd Wormley Jr.
“Currently, we are establishing a TCU AI Teaching & Learning Committee that will review current and develop new policies to address the use of AI in our teaching and learning,” Wormley wrote in an email. “This will include clarifying student use of AI in the classroom to recommending training, webinars and workshops for instruction on the effective and ethical use of AI in teaching and learning.”
Wormley said the university is also considering guidelines for the use of AI in research and scholarship, as well as larger institutional-level policies governing AI.
Transparency is key, said Leslie Browning-Samoni, a professor of fashion merchandising.
“If students are using AI, I want them to disclose how,” Browning-Samoni said. “That kind of transparency promotes honesty, accountability and ethical decision-making, which are essential both in the classroom and in their future professions.”
Voss Finkelstein, a first-year finance major, said preparing for a future shaped by AI is part of his motivation.
“If employers are using these tools, we should know how to use them ethically,” he said. “I don’t think ignoring AI is realistic. It’s better to learn how to use it responsibly now, so we’re prepared for the expectations we’ll face in the workplace.”
As AI continues to evolve, the conversation at TCU reflects a broader shift in higher education: balancing innovation with integrity. For now, students and professors alike are navigating new academic territory one prompt at a time.