Tim could not believe the headline when he read it.
Even more so because it was his own work.
The former Australian Community Media (ACM) journalist, whose name has been changed to protect his employment prospects, said an internal generative artificial intelligence (AI) model had produced the headline for his news article, which was going to be published in the next day’s printed newspaper.
The headline, according to Tim, was legally problematic.
“It had generated something false from the story,” he said.
“With my knowledge about the story, I knew that could have potentially defamed someone who could have been wrongly identified from what was generated.”
He caught the error just in time, but wondered what might have happened had he not spotted it.
“It made me feel frustrated and a little bit anxious because I wondered what else could have been possibly published in print that had gone unchecked.”
ACM publishes newspapers around Australia. (ABC Riverina: Gary-Jon Lysaght)
ACM reporter Terri, whose name has been changed because she feared speaking out would jeopardise her job, said the AI model had given her legal advice about a news story.
She said the advice was troubling.
The legal guidance, she said, had “logic and thinking” but greatly overstated the legal risks the story might pose.
“It was not right,” she said.
“The AI returned a lot of information saying that the story posed a defamation risk [and] going through what it returned, I don’t think it was correct.”
Terri said she did not follow the artificial intelligence’s legal advice but said it set a worrying precedent.
“It’s sort of analogous to Googling your symptoms instead of going to a doctor,” Terri said.
“There’s a reason why you have paid professionals who have to do a lot of schooling to get the qualifications to make those calls.”
Generative AI ‘experiments and testing’
In a leaked email to staff on October 3, seen by the ABC, ACM management said “AI experiments and testing” were happening in its newsrooms in three ways: story editing and coaching, headline writing, and generating story ideas.
But the ABC understands the technology is also being used to analyse news stories’ legal risk.
It is understood the generative AI model being used is Google’s Gemini, which has been adapted for the news company so that ACM data on the platform is not shared with Google.
ACM runs mastheads across large swathes of Australia’s east coast. (ABC News: Lucas Forbes)
Media, Entertainment and Arts Alliance (MEAA) director Cassie Derrick said union members within ACM claimed, at some newspapers, there had been a directive to use Gemini for “all aspects of reporting”.
“In those instances, Gemini is making things up, misattributing some pretty important facts like charges in court, and generally looking to replace journalists and the ethical practice of those journalists with a more unethical software,” she claimed.
Ms Derrick said a member was instructed to write an article based on court proceedings using ACM’s generative AI model.
“Gemini has attributed charges to the wrong person,” she said.
“That journalist caught it, by doing the fact checking, but had they not, it obviously would have been a disaster.
“Not only for the journalist, but also for the person who had been wrongly accused.”
Fears of job losses
ACM employee Sam, whose name has also been changed to protect his identity, feared the technology could be used as a justification for job cuts.
“Some people will lose jobs and the ones who are left behind will be left picking up the pieces,” he said.
“AI won’t completely fill the hole that’s been left behind by the people who have left.”
Some ACM reporters told the ABC they refused to use the technology.
ACM cut 35 jobs last year, blaming Facebook’s parent company, Meta, for withdrawing funding for regional journalism it had previously provided under the News Media Bargaining Code.
Sam said the ACM workforce was stretched.
“In our newsroom, because we’re using AI pretty lightly, I don’t think that’s really had an impact on our journalism, yet,” he said.
“There is that fear [that] if we were to become increasingly reliant on it [AI], are we going to be churning out stuff that has mistakes?”
The ABC understands AI testing for headlines has occurred for more than 12 months. (ABC News: Sharon Gordon)
AI ‘not a replacement for journalists’
The ABC has found no evidence that any factual errors or legally problematic information generated by AI have been published in print or online by ACM.
In a statement responding to questions from the ABC, an ACM spokesperson said the assertions around how generative AI was used in its newsrooms were “flawed”.
“We do not use Gemini to write stories or rely on it for legal advice,” the spokesperson said.
“Humans make the decisions on every word we publish. ACM is cautiously, carefully and openly exploring tools that can help us to better serve our communities.
“AI is not a replacement for journalists, editors or lawyers. Integrity and accuracy are not negotiable.
“We will keep listening to our teams, providing information and training, and driving responsible innovation that supports our journalism.”
AI use ‘widespread’ in journalism
RMIT University AI and media expert TJ Thomson said the technology was increasingly being used in newsrooms for “all sorts of things”.
“I wouldn’t say that these issues or these fears are unique to ACM outlets, I think they’re very widespread,” Dr Thomson said.
“From behind-the-scenes uses to more public-facing uses, it is becoming more and more prevalent.”
But he said seeking legal advice from AI could be particularly fraught, given the geographic bias of the models’ training data.
“These models that they’re turning to have been trained primarily with information scraped from the World Wide Web, a lot of it from North America, which is a very different legal context,” he said.
“It’s a tricky balance that editors and journalists are having to strike with this technology.”
TJ Thomson also pointed to News Corp’s introduction of an internal model, called NewsGPT. (Supplied: Anthony Wheate)
The ABC has introduced an internal generative AI tool, called ABC Assist, which helps journalists search for ABC archival information, summarise old ABC content and draft interview questions, among other things.
“In the news division, all uses of AI tools in the production of audience-facing content must be referred to an editorial manager,” the ABC’s editorial guidance note states.
The New York Times has also recently updated information on its website about how its journalists use AI to analyse data and test headlines.
“We don’t use AI to write articles, and journalists are ultimately responsible for everything that we publish,” the New York Times said on its website.
In Australia, News Corp has been hiring AI engineers to join an editorial team.
Google declined to comment.