By SUSAN JONES and SHANNON WELLS

Claude for Education officially landed at Pitt this week, at least for faculty and staff, according to an announcement from Provost Joe McCarthy and Mark Henderson, Pitt’s chief information officer.

There have been several questions raised since the partnership with Anthropic and Amazon Web Services (AWS) to bring an “AI-enabled Campus of the Future” was announced last month. Concerns include: Will faculty be forced to use Claude? What’s Pitt’s overall AI strategy? How will the teaching center be involved in implementation? What happens to PittGPT? And why is it called Claude?

Let’s start with the easy one first. An AI-assisted answer on the internet said that Claude is “in honor of Claude Shannon (1916-2001), who is considered the ‘father of information theory’ for his work on how information can be encoded, transmitted and processed.”

Anthropic, the company behind Claude for Education, was created by a brother and sister team, Dario and Daniela Amodei, who previously worked for OpenAI, the creator of ChatGPT. “They came to not agree with the direction that (Sam) Altman (OpenAI CEO) and others were taking in the development of ChatGPT, and so they decided to create their own,” Henderson said.

“They think a lot about security, about guardrails,” he said. “They do very well in the academic environment for these reasons.”

Henderson said he also is impressed with the candor of Dario Amodei, who has appeared on media outlets like Axios and CBS’ “60 Minutes” talking about the benefits of AI — GDP growth, balanced budgets, higher productivity — but also the downsides, such as possible high unemployment and losing control of AI models.

“60 Minutes” reported that at Anthropic, about 60 research teams are working to identify threats, build safeguards to mitigate them, and study the potential economic impacts of the technology. 

Another reason Pitt entered this partnership is its long relationship with AWS, which is a major investor in Anthropic, Henderson said.

“That’s what’s unique about our partnership,” he said. “We’re bringing not only Anthropic into our midst, with all of the benefits that will accrue to us, but they’re powered, in large measure, by and partnered with AWS.”

In addition, PittGPT, which debuted earlier this year, will still be available. This system operates in isolation from the internet and is accessible only to those logging in with their Pitt ID. Pitt Digital also will continue to support Google Gemini, Copilot, Google NotebookLM, and other AI tools. “As these tools continue to evolve, we’ll evaluate them and look at their usefulness within our environment,” Henderson said.

Implementing Claude at Pitt

Henderson said the goal at the University is “to graduate critical thinking students. So how do we use AI to help with that end?”

Claude uses a kind of Socratic method — “where students can use it and it can help by asking probing questions on particular subjects to help students think about answers towards, ideally, mastery of that particular subject,” he said.

It also will help prepare students for professional AI tools they might encounter when they enter the workforce, as well as accommodate the educational and administrative needs of the University.

“I have used it to get me jump started,” Henderson said. “I can pose several prompts on something I’m thinking about, something will come back, and then that provides at least a running start. … I didn’t have to sit there, staring at the wall until creativity struck me.”

He said a colleague in Pitt Digital has used AI to automate aspects of her job “that now allows her to do three or four times more of what she was tasked. So instead of feeling overwhelmed, she’s utilized this technology to enhance her contributions to Pitt Digital and to the University.”

All the members of the Pitt Digital AI Trend, Translation and Enablement Team have been beta testers with Claude for Education, along with members of PASTA (Pitt AI Scholars and Teachers Alliance). This group brings together about 80 faculty members “from just about every discipline that Pitt provides, whose work is in AI or they’re thinking about AI,” Henderson said. “We come together on a monthly basis and talk about things that are top of mind, and some of these things culminate in initiatives that we’ve worked with members of PASTA on.”

Concerns

One of the questions raised over the past few weeks is whether faculty will be required to use this new tool.

Both the provost and Henderson have tried to reassure faculty that this is not the case.

“Faculty retain full authority to determine how AI tools may or may not be used in their courses,” the message from the two Pitt leaders said this week. “We encourage you to communicate expectations clearly in your syllabi and assignments in ways that best support your pedagogical goals. Use of Claude is voluntary and intended to serve as an optional resource.”

Henderson also told the Senate Computing and Information Technology Committee at its Dec. 1 meeting: “The thing that I want to make very clear is … we try to provide various technologies for the betterment of the community. What I would offer is that there is no expectation to pursue this. If there are faculty members who kind of oppose the use of some of these tools, that’s OK. I would like to hear from faculty members as to what the concerns are, but it’s not something that we’re foisting upon anyone.”

Privacy issues: Responding to questions at the committee meeting regarding privacy in AI use — including students using AI platforms as “personal therapists” — Henderson said, “That’s just one of many things in the AI realm that keeps me up at night.”

“For every action, there’s an equal and opposite reaction, so we pay very close attention to that as well,” he said. “We look at it from a 360-degree perspective … to help to advise us on things that we need to be paying attention to, because we’re in large measure in some uncharted territory right now with the advent of AI.”

The announcement from the provost and Henderson reiterated that all work on AI at Pitt is private: “Your work will not be used to train AI models and is private. While system administrators may review aggregate usage statistics to improve services, they do not have access to individual conversation content, chat history or uploaded files, except in rare circumstances, and only as permitted under applicable University policies and federal, state and local laws. This framework is designed to give you the freedom to explore ideas and use the program productively.”

Henderson said that all of the AI resources that Pitt Digital supports have gone through security reviews. “There’s contractual language on what you can and can’t do with our data. They exist in secure environments for our use. Our data does not get uploaded to the mothership, for lack of a better description, to train corporate large language models by the various vendors who are building those things.”

Shared governance: Senate President Kristin Kanthak said at this week’s Faculty Assembly meeting that several Senate committees will be looking at AI.

“The AI space is moving fast, and if Pitt is slow, it will not be in the game,” she said. “This creates a special set of difficulties for shared governance. We don’t want to be stuck deliberating while malign actors are taking over what ought to rightfully be Pitt’s leadership position.”

Kanthak said she recently met with Henderson to discuss “his vision of AI and the role of shared governance. Moving forward, I emphasized to him how strongly faculty felt that those who are experts in teaching and learning must be at the center of integrating AI in the classroom. He assured me that he understood the important role for those who know pedagogy, and that he had no interest in issuing dictates to faculty about how they use or choose not to use AI as part of their teaching.”

Environment: Henderson said that everyone needs to worry about the impacts of AI on the climate, “because they have a voracious appetite for energy, which in some instances around the country have caused brownouts for normal folks in their homes. It has caused rates to increase.”

But he said the work being done at Pitt with AI is not the huge energy consumer. “I would offer that where the real energy is consumed is in the training of the large language models. We don’t do much of that, not on the scale of Google or Meta or Anthropic.”

He also noted that there are researchers at Pitt who are working on issues surrounding AI energy use, looking at how to use renewable sources or how to create the next generation of processors that don’t consume as much energy.

Claude’s future at Pitt: Regarding the longer-term viability of Claude, Henderson said Pitt has surveyed the marketplace with the “understanding that there is tremendous change in the market at rates of speed heretofore unseen.”

“Our agreement with Anthropic is one year, so we’re not locked in,” he noted. “We hope to have some tremendous benefit, and at the end of our one-year agreement, we will assess them and their viability going forward, particularly in the context of developments in the broader marketplace.”

Overall AI strategy?

Asked whether Pitt needs to do more work to create an overall AI strategy, Henderson said, “The simple answer is yes, and I would suggest that the provost is thinking very hard about this. There are aspects of it that I can help with, and there are some things that we can drive, but ultimately, particularly as we look at its adoption, or not, in the academic realm, that lies with the provost leadership, and then we help realize his vision and strategy.”

Provost Joe McCarthy said at the October town hall that a University-wide leadership group is being formed “to coordinate efforts in AI, across academics, research, operations, health sciences, etc., and make sure that we have the tools and infrastructure needed to support this ongoing, important work.”

This committee would follow up on the work done by the Ad Hoc Committee on Generative AI in Research and Education, which released a 2024 report, “Considerations for Responsible Use and Recommendations for Generative AI in Research and Education.”

Training

Pitt Digital will begin offering training sessions next week for faculty and staff. Students will receive access over winter recess. There also will be sessions to train the trainers, Henderson said, “so that we can arm folks who are in the midst of those who they support, who become knowledgeable and can support the technology.”

The other part of the agreement is that Anthropic “has committed to helping us with plans around AI literacy, not only within the University, … but actually to the broader community through our Community Engagement Centers.”

Training opportunities include:

To get started, visit claude.ai, enter your Pitt email address, and select “Continue with SSO.”

If you already have a personal Claude account with your Pitt email, you’ll see both accounts listed when you log in. Select the account labeled “Pitt Enterprise plan” to access the enhanced features and privacy protections required for University work.

Susan Jones is editor of the University Times. Reach her at suejones@pitt.edu or 724-244-4042. Shannon Wells is a staff writer for the University Times. Reach him at shannonw@pitt.edu.

 
