Artificial intelligence models are increasingly writing their own code, leading to speculation about shifts in the computer science job market.

Some engineers at Anthropic and OpenAI recently said they are using AI to write 100% of their code. Pitt computer science professors are weighing the effects of this shift, adapting their curricula and emphasizing the value of a computer science education.

Adriana Kovashka, associate professor and chair of the department of computer science in the School of Computing and Information, said the department is working on creating an official AI policy as well as new courses that highlight professional uses of AI.

“We will be introducing courses that involve topics such as data science or programming plus a little bit of AI,” she said. “You can think of this as being like a vibe coding course where you don’t have to understand as much about programming, but just kind of feel your way through the AI outputs or have the AI model produce it for you based on some specification of a problem.”

The course CS 1699: Special Topics in Computer Science allows the department to pilot a course like the one Kovashka described before deciding whether to add it permanently to the curriculum.

“We’re trying to keep these courses agile as much as possible,” Kovashka said. “But it is tricky, because course design takes a while.”

Kovashka said she believes students sometimes think they can “offload the thinking part to AI,” but it’s necessary to understand the AI output and be able to fact-check it. 

“I think it would be cool if we could decouple teaching students to program and teaching students to think. For example, we can teach a course that uses no programming but still uses logic,” Kovashka said. “One of the values of the CS education is that you obviously need to know how something works, how something should work, to be able to see when it doesn’t work well.”

Morgan Frank, an assistant professor in the Department of Informatics and Networked Systems in SCI, said he believes teaching the fundamentals of coding is still a valuable practice.

“The effective use of these tools requires that you have some good contextual knowledge,” Frank said. “So when the AI does a poor job, you can catch it and you can be very critical of the results that it gives you and ask what happened to help identify how to do better.”

According to Frank, the effects of AI tools like Claude Code are yet to be seen in research.

“But I think there’s a lot of room for people who are good thinkers and creative and asking important questions to be made more powerful by these tools,” Frank said.

Frank said he believes AI can complete the more tedious aspects of scientific research, like cleaning data sets, running statistical analyses and summarizing results.

“It’s really bad at coming up with interesting questions or ideas, and it’s really bad at taking results and using them to update the hypothesis,” Frank said.

Patrick Skeba, a teaching assistant professor in the CS department in SCI, is teaching his students how to use AI creatively and effectively. 

“What I tell a lot of students is that you need to start thinking about what it is you want to get into,” Skeba said. “Do you want to study environmental science? Do you want to study public health? Do you want to study computers themselves? Then think about how machine learning can help you with that.”

Skeba said there are many applications for AI, but CS students need to know when and how to use them.

“There’s more of a need for people to come up with creative applications and experiments than pioneer the algorithms themselves, although that’s still taking place,” Skeba said.

Todd Underwood, a former lead at Anthropic and OpenAI and former co-site leader at Google’s AI headquarters in Pittsburgh, sees computer science jobs changing, but not necessarily being replaced. Underwood explained that people with computing backgrounds will need to oversee the AI’s code from start to finish, prompting and fact-checking its work, which requires a basic understanding of coding.

“The general principles of computing … are relevant no matter what happens,” Underwood said. “But there will be more managers of AI agents than they will be themselves the authors of software.”

Underwood said he believes universities should focus more on promoting broad principles of computer science rather than placing emphasis solely on coding, since it’s difficult to know what will be relevant in the technology field a year from now.

“[Universities should] de-emphasize mastery of personal skill in writing a lot of software, increase the amount of principles based in broad education and principles of computer science, principles of software and systems architecture,” Underwood said. “I think that would set students up for more success.”

Katie Whiteford, a junior computer science major, said her professors discourage the use of AI when completing coding assignments. But for longer assignments, Whiteford said AI is a time-saver.

“The hardest thing is, I don’t want to use AI because I want to learn. That’s my goal,” Whiteford said. “It makes me feel bad [to use AI], but it is what it is. I need to save time.”

Whiteford said her computer science education was crucial to understanding most of her co-op work.

“The consensus is, yes, AI is smart, but it’s not perfect, so we do need the skills,” she said. “Also — if you’re totally relying on AI — you’re not really learning, and that’s going to mess you up in the future.”