Pitt is setting a national precedent for artificial intelligence in higher education through a new partnership that offers a specialized AI assistant to students, faculty and staff.
In October, Pitt announced a partnership with Anthropic and Amazon Web Services. This partnership provides Pitt students, staff and faculty with access to Claude for Education, an artificial intelligence assistant developed by Anthropic and designed specifically for higher education institutions.
Claude for Education uses Socratic questioning to help students reach solutions on their own, and beyond assisting with research, writing and document analysis, it’s intended to help “prepare students to utilize AI effectively in their careers.”
Several higher education institutions in America and Europe also made Claude for Education accessible institution-wide, though Pitt is the first university to embed the platform into its own AWS cloud infrastructure. This integration enables the University to build custom, secure applications like PittGPT at an institutional scale.
University spokesperson Jared Stonesifer said Claude for Education is a general-purpose tool that can support work across disciplines — from humanities research to data analysis to health sciences.
“If used appropriately in the service of learning goals, GenAI tools like this can help students to explore complex concepts from different angles,” Stonesifer said. “A philosophy student and an engineering student could both use Claude for Education to explore ethical implications of emerging technologies from their respective disciplinary lenses, for instance.”
Stonesifer also noted how Claude for Education can serve faculty who choose to use generative AI.
“Tools like Claude for Education can potentially support teaching, research and administrative work — whether exploring course design ideas, analyzing scholarly materials, drafting communications or organizing multi-step projects. These tools are being made available because we believe some faculty will find uses that complement their expertise, judgment and creativity,” Stonesifer said.
Michael Colaresi, associate vice provost for data science, highlighted a primary advantage of integrating Claude for Education.
“Through this trial period, students, staff and faculty can continue and deepen a shared dialogue about how to best reach our individual and collective aspirations in a world where GenAI and other AI tools are increasingly a part of economics, politics, society and science,” Colaresi said.
Colaresi also pointed to the partnership’s potential to open more lines of communication about how best to prepare students for the future.
“We need to protect individual choice and agency — including choices not to use GenAI where it would be unhelpful or counter-productive — while simultaneously cultivating conversations that lead to shared understandings of how we as an organization can serve students to the best of our ability,” Colaresi said.
Angus Nicholson, a senior mechanical engineering major, said he does not foresee himself using Claude for Education unless his professors encourage him to.
“I cannot see what differentiates Claude for Education from ChatGPT or Gemini, and as with the other forms of artificial intelligence, we cannot know whether it is truly reliable,” Nicholson said. “Beyond reliability, I believe this could negatively impact students’ critical thinking abilities, as increased use may lead them to think for themselves less.”
Nicholson said, despite his general disapproval of Claude for Education, he does see some potential benefits to Pitt’s partnership with Anthropic.
“If the overarching goal is to reduce plagiarism and to have students use this to genuinely enhance learning, giving professors the ability to monitor how the AI is being used would be the only benefit from my perspective,” Nicholson said.
Nicholson said professors might encourage using Claude for Education because they know students will use AI regardless, but he believes they should consider the consequences that encouragement could have for learning.
“At what point does AI replace the need for professors, and at what point is it replacing the need for learning? There are questions that must be considered and lines that must be drawn,” Nicholson said. “I do not understand encouraging Claude for Education but discouraging ChatGPT and other popular forms of AI.”
Annette Vee, associate professor in the Department of English, said she generally thinks Pitt’s partnership with Anthropic is a good idea. Under the enterprise agreement, student data will not be used to train the models, which Vee said is another benefit in terms of privacy.
“Many students are already using AI platforms, and the platforms they’re using overwhelmingly are taking their input and training their models based on this input, so whatever students feed [AI platforms] is not private,” Vee said. “Given the fact that this is already the landscape, I believe this partnership is a good thing because it makes access to top-notch large language models equitable across the student body.”
Vee said some faculty have raised concerns about the potential consequences of making the program accessible to students.
“Faculty are worried that this will encourage more students to use AI, or that it may encourage them to use it in all their classes,” Vee said. “However, I think the access to Claude for Education is not going to move the needle at all on students’ use of AI, because it is already pretty saturated.”