By SHANNON O. WELLS

A new piece of artificial intelligence-oriented software, to be implemented at Pitt in March, examines images and figures in submitted research papers to make sure there is no duplication across the literature.

Designed to assist with quality assurance for publishers and researchers, ProofigAI pinpoints image manipulation, duplication, plagiarism and AI-generated images to preserve “accuracy, trust and research credibility,” according to its website.

Bill Yates, vice chancellor for research protections, delivered presentations on ProofigAI at both the Senate Research committee meeting on Feb. 18 and the Senate Computing and Information Technology Committee meeting on Feb. 23.

As Yates explained to the latter committee, Proofig is essentially used to detect image manipulation.

“Why would there be image manipulation if you’re putting together a paper, and why would you worry about it?” he asked at the computing committee presentation. “What happens, not irregularly, is people will take one of these images and say, ‘This experiment had this result,’ and then they’ll put the same image three or four figures down the line in their paper, saying, ‘It was a different experiment.’”

This creates a concern about research misconduct, which Yates’ office handles, “because you’re saying the same image was the result of two different experiments.

“It’s just because the images look so similar when people are putting together their paper that they just don’t see that,” he explained.

Proofig will help detect duplicated images within a researcher’s own work. “It will look across the published literature, (and) will allow you to come up with an image library so you can put in your own unpublished papers.” Researchers can then check across their unpublished papers “to see if you duplicate it across those resources.”

Yates explained that this kind of duplication is seen “very frequently” in terms of research misconduct concerns. “Because how can the same image be two different experiments?” he said, noting that this will help investigators determine, in advance, if there was just a mistake involved.
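As a rough illustration of the idea behind automated image-duplication checks — this is a toy sketch, not Proofig’s actual algorithm — a simple “average hash” can flag two images as near-duplicates even when one has been slightly brightened or re-exported:

```python
# Toy illustration (NOT Proofig's actual method): detect near-duplicate
# images with a simple "average hash". Each image is a grid of grayscale
# pixel values; pixels brighter than the image's mean become 1-bits.

def average_hash(pixels):
    """Return a tuple of bits: 1 where a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def looks_duplicated(img_a, img_b, threshold=2):
    """Flag two images as likely duplicates if their hashes nearly match."""
    return hamming_distance(average_hash(img_a), average_hash(img_b)) <= threshold

# A 4x4 "gel image" and a slightly re-exposed copy of it
original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 35, 225],
            [11, 215, 28, 235]]
brightened = [[p + 5 for p in row] for row in original]  # same content, shifted levels
different = [[250, 10, 240, 20],
             [245, 15, 235, 25],
             [240, 12, 230, 30],
             [235, 18, 225, 35]]

print(looks_duplicated(original, brightened))  # True: same underlying image
print(looks_duplicated(original, different))   # False: genuinely different image
```

Real tools use far more sophisticated analysis (and can also spot splices and partial overlaps), but the core principle is the same: reduce each image to a compact fingerprint and compare fingerprints across every figure in the paper.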

It also will pinpoint manipulations in an image where, if two things were spliced together, “a trainee may give you a document and they innocently thought, ‘Oh, there’s some distortion in the middle here. I’ll just Photoshop it out,’” he explained. “Proofig will immediately tell the mentor looking at the paper, there was a distortion. There was something done there.

“Then the mentor can go back to the student and say, ‘That was inappropriate. You shouldn’t have done that. You’ve manipulated your image.’”

The software, Yates clarified, is particularly useful for those in life sciences research, somewhat less so in other disciplines.

“If you’re in the social sciences, the humanities, Proofig isn’t going to help you very much in terms of image manipulation, because you’re not generating images that it would find problems with,” he said. “But for our folks and many other folks in the medical school, biological sciences, other areas that do blots, gels, etc., I think it’s going to be very valuable to prevent errors from occurring.”

One aspect of Proofig that will be helpful for nearly everyone in academia is its integration with iThenticate plagiarism-detection software.

“We do have iThenticate already, where you can actually look for plagiarism in your own paper,” Yates noted. “Proofig is going to integrate iThenticate, so you (can put) your paper in and look for image manipulation and plagiarism all at the same time in one report. I think that will be very helpful for people that need to do both.”

Later this year, Proofig is expected to integrate reference-checking software designed to ensure the accuracy of all references. “That would be something even in the humanities (or) social sciences that may be useful.”

Yates said concerns that the tools can delay publication of papers are unfounded.

“We’re not making anybody do anything,” he said. “This is a tool like iThenticate. We’re not making you turn things into our office to check them for you. In fact, we’re not going to do that. This is for a mentor (or) investigator to use to self-check their own work, or their trainees’ work before they actually send it to a journal (and) inadvertently may be having image duplication that they just didn’t pick up on.”

Noting that Proofig’s current interface on its website looks “crude,” Yates said the software is available through the My Pitt portal.

“They are generating a much more elegant interface for this in my.pitt.edu, so you can get in that way as well,” he said. “But again, it’s here.”

Responding to a question about reference-checking features, Yates said it makes sure you’re not “putting down a reference to a paper that doesn’t exist.

“It’s going to make sure that you’re giving a complete reference and that you know all the details of the reference are correct,” he said. “Publishers are routinely doing this right now too and doing reference checks. This will save a step in the fact that you don’t have to correct it down the line. It will flag the error up front.”

Yates said the personnel at Proofig have been “very cooperative. … If you have a problem, they jump in. They help you solve your problem,” he said, noting that the group led by Angie Zack, knowledge integration coordinator in the Health Sciences Library, is “tooled up to help support investigators deal with Proofig.”

“There is some savvy here in terms of using the product,” he said. “There’s a lot of help text, help tutorials on the Proofig website.”

In a similar presentation to Senate Research, Yates said adoption of Proofig is “basically to increase research integrity.”

“I think it will help us a lot in terms of making sure that there are no research integrity problems that are inadvertent. And it’s a neat system in the fact that you can actually upload your own papers into the system, and it will image check across your own papers between them,” he said.

Research training options

Shifting to the topic of security training, Yates shared an alternative path to fulfill the CHIPS Act requirement that research-security training be completed within the 12 months prior to any new or competing project application.

“If it’s going to go through peer review, you have to have done research-security training within 12 months before then, and for most faculty, that means you’ve got to do it annually,” he explained, noting many have already gone through the National Science Foundation (NSF)-approved modules, “which take over an hour.”

Now, the government has said equivalent training can be substituted.

“We have put together equivalent training. … I sent it to some trusted people to look at … and that will sort of be our retraining module for research security,” he said, noting he thinks “everyone should do” the government-endorsed NSF training. “Then for annual retraining, they can do our module.”

The Pitt module, Yates noted, will be “more conventional” than NSF’s. “It’s going to be a PDF file that you go through and then you take a quiz on CITI,” he said of CITI Program web training resources.

“It goes through all the major components that we did that are required by the government in research-security training. Hopefully this is better for people.”

Researchers will be able to download the document, “which I think is also very helpful, because the problem with the NSF training is you couldn’t really go back and refer to it if you had questions,” Yates noted.

“This is something that you can refer to over and over again.”

Shannon O. Wells is a writer for the University Times. Reach him at shannonw@pitt.edu.

