The Albanese government this week revealed its National AI Plan, after years of consulting and a notable pivot to favour what amounts to self-regulation for a global industry worth trillions of dollars.
The new, “opportunity first” approach aims to capitalise on the productivity and innovation that AI is said to deliver. The plan notably rejects the mandatory protective guardrails that were included in a standalone artificial intelligence act proposed by former industry minister Ed Husic.
Critics are wary of the extensive lobbying behind the creation of a “whole of government framework” that seeks to accelerate investment in energy-intensive data centres and training initiatives, without a central regulatory authority.
Industry and Innovation Minister Tim Ayres rejects that framing, emphasising the establishment of a $30 million AI Safety Institute and a ministerial promise of further “refinement”.
“There are clear accountabilities for existing regulators and existing portfolio agencies to do their work, and the AI Safety Institute is there to support that work.”
Of the original plan, Ayres says, “the alternative approach risks not just duplication and being flat-footed in a fast-paced, technological, changing world but also risks undermining the clarity of accountability and the responsibility of parts of government to do the work that is their job to do”.
Ayres calls the plan an “agile” yet “pragmatic” Australian approach to AI.
“There is so much in strategic terms, in economic terms, in terms of our health, in terms of our energy system, and science and research and development that artificial intelligence offers. And I want to make sure that Australia secures that benefit,” Ayres tells The Saturday Paper.
Independent Senator David Pocock is among those accusing both the minister and Assistant Minister for Science, Technology and the Digital Economy Andrew Charlton of ceding too much power to tech companies, including ChatGPT’s owner OpenAI, Meta and Google.
“There’s been huge lobbying from OpenAI [and] others, big movement between ministers’ offices and groups that are clearly involved,” Pocock tells The Saturday Paper.
“They’ve leant on the Trump administration, where they can basically do whatever they want, and I think that pressure is being felt here in Australia, not wanting to rock the boat and stand up to the big AI companies.”
The Tech Council of Australia says it has been “deeply engaged” with the government throughout the development of the plan, and predicts AI will deliver 200,000 new jobs and $115 billion a year to the Australian economy by 2030.
The peak body’s founding chief executive, Kate Pounder, who is now Australian policy liaison for OpenAI, created the tech consulting firm AlphaBeta Advisors with Andrew Charlton before it was sold to Accenture in 2020. Pounder’s husband, Andrew Dempster, recently left the Prime Minister’s Office after six years as a senior adviser to Anthony Albanese.
The Saturday Paper sought comment from Pounder, but she was unavailable.
The Greens say Australians want the government to push back on big tech.
“It’s no secret who’s lobbying the government: Meta, Google, Twitter, the tech bros who are dominating the investment, are coming in here and dominating the Labor government,” Senator David Shoebridge of the Greens told reporters on Tuesday.
“They had a meeting here just barely a month ago to come and push, it turns out, push the Labor government to put no handrails around AI to allow them to maximise their profits by driving people down extremism … those profits go offshore and we’re left with the mess here in Australia.”
The minister rejects the assertion he is too close to the tech industry.
“I just don’t think the binary that is presented by some of the commentators is a real thing. This is about lifting government capability at the heart of government,” Ayres says. “Of course we’re engaged with industry, but we’re engaged with civil society, trade unions here too, and focused on the national interest.
“And what is our purpose here? It’s about securing investment and economic opportunity in Australia, spreading the benefits.”
The plan seeks to focus on Australian opportunities over the influence of global tech titans, with multiple references to “sovereign” AI – a term that’s yet to be fully defined.
Data centres globally are expected to attract as much as $7 trillion in investment by 2030, according to management consultancy McKinsey & Company. They are central to the national plan, alongside a focus on “sovereign compute capability” and making sure Australian workplaces are “fully AI capable”.
The plan also folds in an already announced sovereign AI framework for the public service, along with $460 million in existing AI grant and investment funding and an extra $1 billion for critical technologies under the National Reconstruction Fund.
Labor Senator Michelle Ananda-Rajah, a former AI startup founder who has been recognised for her work applying the technology in medicine, is focused on the need for Australia to compete in terms of innovation.
Ananda-Rajah says local AI startups with talent and capability need help to “squeeze through the shoulders of tech titans”.
“I’m fine with us becoming a data centre capital of the world … But I would also like to see capability within that data infrastructure carved out for our own SMEs [small and medium-sized enterprises] and startups to use.
“When we give Australians options, they will flock to our own companies. We have to do this in a way that we keep at bay the US gorillas.”
The Coalition’s objections so far have focused on the energy required to power AI, and the government’s provision for the union movement to have a “strong voice” to consult on and co-design AI systems in workplaces.
“It lays the groundwork for more union interference in Australian workplaces,” Ayres’s shadow counterpart, Alex Hawke, said in a statement.
Union concerns about AI relate to workplace surveillance, privacy, job losses and skills training.
“Everybody’s got an interest here, and there’s much more that is to be shared and worked on together than the sort of cartoon of the debate that is sometimes drawn in the national press,” says Ayres, who is a former official with the Australian Manufacturing Workers’ Union.
“Waves of technology have reshaped Australian work and Australian workplaces,” he says, referring to tools such as algorithms to manage workflow. “Artificial intelligence is an add-on that enables some of those pre-existing trends.”
Among the most concerning new tools are the AI agents that are overtaking chatbots, undertaking increasingly complex tasks and making sometimes rogue decisions.
“We are teaching them agency on our behalf,” AI ethicist Bec Johnson tells The Saturday Paper.
“That’s moving AI up 10 times. So instead of just having a chatbot to deal with, and maybe some of the negative implications that could come out of that, maybe racism or whatnot coming from our chatbot, now we’re going to put AI agents in the world. Think about that in a military application. Think about it in a cybersecurity application. We’ve just massively increased our risk.
“We really need to plan for that.”
She says the development and deployment of AI cannot be divorced from politics.
“It is deeply, deeply entwined,” says the Sydney University academic. “Australia wants to jump aboard and seize all these opportunities and build all these data centres to keep us at the economic forefront. Okay, that’s great, but that’s also a political decision, as is the change to the plan.
“Last year, we had one set of ministers in. Now we’ve got another set of ministers. So, we’ve seen our response to AI change in relation to how our politics change.”
Johnson wants the language around AI to be clear: AI is created, modelled, designed and directed by humans. “If we fail to put [in] proper guardrails and really consider our safety and our risk before we deploy AI, that’s not AI’s fault. That’s our fault,” she says.
OpenAI is dropping guardrails around ChatGPT to allow a more flexible, customisable “personality” that is less “sycophantic”. Founder and chief executive Sam Altman announced in October that the platform would soon allow for erotica for age-verified users as part of its “treat adult users like adults” principle.
The restrictions that are being relaxed assisted a “very small percentage of users in mentally fragile states”, Altman said, but made the chatbot “less useful/enjoyable to many users who had no mental health problems”.
He said there are now “enhanced tools” to protect users with mental health issues.
Asked if the Australian government should step into the space vacated by tech companies and offer greater protections, Ayres says, “we’ll be watching all of these developments very closely”.
“The National AI Centre released a set of voluntary guardrails for Australian businesses that are – I don’t want to mingle the language here – but guidelines as well as guardrails,” the minister says.
“I want to see how those are adopted, work with colleagues to monitor that very closely. Our approach is going to evolve, and it will be supported by the AI Safety Institute.”
The government moved this week to issue guidance to AI developers to “watermark” content so Australians can distinguish AI-generated content. There is no legal requirement to identify such content, which could be used to mislead or even exploit and blackmail people, with potentially devastating consequences.
“We’re getting to the point now where it is getting incredibly hard to tell the difference between a deepfake and a real video or photo,” says Senator Pocock.
“The government’s not doing anything on labelling, on legislating against being able to deepfake someone without their consent in the agentic space. They’re just letting it rip. No safeguards in place to protect young people, to protect vulnerable Australians. I’m really concerned by the lack of political will around this.”
Communications Minister Anika Wells, fresh from promoting Australia’s world-first social media ban for under 16s, is moving to take action against apps and technologies that offer features “used solely to abuse, humiliate and harm people, especially our children”. These include tools for “nudification” and undetectable online stalking.
The industry minister says the tech sector has “enormous responsibilities”.
“If they want to see trust in the products and platforms that they are adopting, transparency, openness, engaging with government, that’s going to be the key,” Ayres says.
He emphasises the vigilance of the new institute. “Here, the AI Safety Institute is focused on Australia’s domestic settings, but it’s internationally focused and cooperating with our partners overseas as well. These safety issues are fundamental to our like-minded partners around the world. This is an evolving set of issues.”
The biggest challenge when it comes to AI is one of enforcement of Australian law, says Ed Santow, a former human rights commissioner and co-director of University of Technology Sydney’s Human Technology Institute.
“What we need to do, more than anything else, is to support regulators to be able to be really effective,” Santow tells The Saturday Paper. “With AI, we’ve not been effective enough in enforcing the law.”
Santow says a sense of urgency is needed.
“I think this gives us a clear chance to hold the federal government to account … We, at this point, we haven’t done much beyond kind of big statements. The government believes that its approach and strategy here will not require some economy-wide piece of legislation. Okay, well, let’s see whether that’s true.
“Let’s have a really rigorous kind of assessment framework that determines whether or not this kind of statute by statute reform will actually deliver the protections that Australians need.”
In its proposal to position Australia as a major international data centre, the plan addresses the extraordinary amounts of energy and water needed to run such infrastructure as issues to be explored with the states and territories.
The national plan notes: “Data centre operators have demonstrated interest in investing in Australia in ways that manage these impacts.
“For example, conventional data centre cooling systems can consume tens of millions of litres annually, but Australian operators are adopting innovative solutions such as highly efficient liquid cooling to significantly reduce water consumption. Many operators are already contributing to additional renewable energy generation and storage as part of their projects.”
The Australian Energy Market Operator expects data centres to consume 6 per cent of the energy going into the national grid by 2030. Growth is forecast to slow over the following two decades, but data centres are still expected to absorb 12 per cent of national grid energy by 2050.
The minister says he will have “more to say” on guidelines for data centre investment early next year.
Tech giants will be pushed to build their own energy sources and cooling systems that don’t require too much water.
“I want to see data centre investments that underwrite and pay for more electricity sector generation,” Ayres says. “The recent Microsoft investment in Australia has underpinned a 300-megawatt solar power station just north of Albury at Walla Walla. We want to see more examples like that.”
Less energy will be required for computing power as time goes on, says Ananda-Rajah.
“The software is constantly evolving, but the hardware is also evolving.”
This article was first published in the print edition of The Saturday Paper on December 6, 2025 as “Sense of self-regulation”.