With increasing AI integration at Pitt, professors across different disciplines have adapted their classroom policies to address the use of AI tools.
As AI use increases across higher education institutions, Pitt has expanded its own AI resources on campus with its October deal with Anthropic and, most recently, the public launch of Pitt’s AI assistant Claude for Education. Many professors explained how they are adapting to the new pedagogical landscape, despite skepticism about the environmental impacts and learning challenges AI presents.
According to Annette Vee, a professor of English literature and the faculty liaison for AI enablement with Pitt Digital, AI is not going away.
“This is something that we need to grapple with in education,” Vee said. “I don’t think it’s going to portend some sort of golden age of education. But, I also don’t think that it’s going to be the end of us.”
Scott Andrew, professor of studio arts, said he is concerned about the legal and environmental impacts of AI. However, he also noted that such technology, through imaging and Photoshop programs, can be useful to students.
“I have integrated [AI] into a few of my classes, but in a very small way,” Andrew said. “I’ve found that actually art students, specifically, are very much not interested in using it. They want to be doing their own creative output, and they view [AI] as a sort of attack on that.”
Vee said she believes that Pitt, along with many other universities, must think carefully about how to prepare students for intellectual work in the current technological era.
“AI is part of that landscape, and it’s finding its way into lots of different things,” Vee said. “Students are using it in creative and interesting ways that can augment their learning, but they’re also using it to shortcut some things.”
Andrew said AI can be a “very interesting” application that can help artists with their creative processes, but he noted there is also a lot of fear that AI will steal students’ work and data.
“If we could be at a place where things were accessible, but we knew they were ethically sourced, I think people would be more interested in engaging with these technologies,” Andrew said.
Vee emphasized responsibility and transparency for students’ AI use in her classroom policies.
“I think that the core emphasis on responsibility is really important,” Vee said. “I encourage students to think very carefully about their uses of AI and to just consider whether this is a beneficial use for their learning and growth.”
Danielle Spitzer, professor of biology, said she views AI very skeptically because the consequences of AI use are still not fully understood. She likened increasing AI use to the 18th-century growth of factories, before people were able to address the impacts of pollution from industrial manufacturing.
“I see [AI’s] potential, but I feel like we have not caught up with understanding about responsible use and regulation in when or how it should be used,” Spitzer said. “I’m worried that many of the ways AI is being used go against the educational mission of the university.”
The process of completing an assignment and engaging with the material is central to learning, according to Spitzer. She said educational value is lost when students complete assignments with AI.
“People go to the gym and lift weights to make themselves stronger. Completing assignments will make [students’] brains stronger by forcing them to make new connections to have learning moments,” Spitzer said. “If you skip over the process, just to get the product, then it’s kind of like going to the gym and using a forklift to move weights up and down.”
Spitzer has a more restrictive AI policy in her classrooms, where students generally cannot use AI for any assignments. She said she does not necessarily want to ban AI in the classroom to make work harder, but AI can interfere with her job — teaching students to think like scientists.
“I’m here to help people learn science, and I do think that if you’re going to take shortcuts, you will lose out on the educational opportunity,” Spitzer said. “It will catch up at some point because you haven’t learned a thing that you’re supposed to learn.”
Spitzer said she sees the potential for AI to be a very powerful and useful technology with positive impacts. However, she said she would rather wait until the risks of AI and how to mitigate them are better understood.
“I really want to emphasize that this is still a new technology,” Spitzer said. “I think that time and regulation will likely change my view on it.”
Ula Lechtenberg, learning design coordinator for the University Library System, said the fast rate at which universities are integrating AI seems to be causing lots of confusion for both faculty and students. She said many faculty are at a loss on how to use new AI resources from Pitt, including Claude for Education and PittGPT.
“I feel like even though we know [Pitt is] providing these resources, we don’t know yet what we can do with them,” Lechtenberg said. “In an educational context, we’re still worried about whether this is helping or hindering [students’] learning.”
Lechtenberg organizes AI prompt events at Hillman to help students and faculty write effective AI prompts. She said there is a responsibility for the University to teach students information literacy and AI literacy skills.
“Once you’ve decided you’re going to use an AI tool to help solve a problem, how do you effectively do that? By writing a prompt, which is a complex idea,” Lechtenberg said. “It really represents how varied our access to information and the information we receive can be.”
Lechtenberg said she hopes that through more events and education regarding AI literacies, students and faculty can understand how to effectively use these tools moving forward.
“Our understanding of the tool and that technology affects our society,” Lechtenberg said. “I think that only time will really tell the impact it’ll have.”