Australia’s eSafety commissioner says none of the big technology companies are doing enough to stop images of “the most heinous abuse to children” from being shared online.

The criticism comes as the commissioner registers six new industry codes designed to better protect children from “lawful but awful” age-inappropriate content, including the “clear and present” danger posed by AI-driven companion chatbots.

Julie Inman Grant told ABC’s 7.30 that about 100,000 Australians a month are accessing an app that allows users to upload images of other people, including school students, to receive a depiction of what they would look like naked.

Inman Grant said she had not heard a major technology company express regret or shame for their role in enabling the sharing of child exploitation images.

“That’s what makes it all the more disturbing, having worked in the technology sector for 22 years,” Inman Grant said.


“I know what they are capable of, and not a single one of them is doing everything they can to stop the most heinous of abuse to children, being tortured and raped, and this imagery being perpetuated online.”

The commissioner said her agency first became concerned about the use of AI companion chatbots late last year, when it was told that children as young as 10 and 11 were spending several hours a day with bots that “were instructing them and engaging in specific sexual acts”.

The chatbots are engineered to simulate human-like interactions through adaptive learning and personalised responses.

Inman Grant said she told industry in April that the draft codes it had developed did not do enough to protect children from the harm posed by chatbots.

“I wanted under-18s to be prevented from using the chatbots, particularly instructing them on suicidal ideation, self-harm, disordered eating and sexualised acts,” Inman Grant told the ABC.

“This will be the first comprehensive law in the world that will require the companies, before they deploy them, to embed the safeguards and use the age assurance.”

The new codes will be implemented at the same time as the federal government introduces a ban on under-16s using social media in March next year. They will apply to app stores, interactive games, chatbots, pornography websites, generative AI tools and software manufacturers and suppliers.

Any service that hosts or facilitates access to content such as pornography, self-harm material, simulated gambling, or very violent material unsuitable for children will need to ensure children are not able to access that content.

“We know there has been a recent proliferation of these apps online and that many of them are free, accessible to children, and advertised on mainstream services, so it’s important these codes include measures to protect children from them,” Inman Grant said in a statement.

“I do not want Australian children and young people serving as casualties of powerful technologies thrust on to the market without guardrails and without regard for their safety and wellbeing.”

These new measures add to earlier codes requiring search engines to apply age assurance measures to all accounts; where an account holder is determined to be aged under 18, the search engine must switch on safe search features to filter content such as pornography out of search results.

One of the industry bodies that worked with the eSafety commissioner to develop the codes, Digi, said they would “introduce targeted and proportionate safeguards concerning access to pornography and material rated as unsuitable for minors under 18, such as very violent materials or those advocating or [giving instructions for] suicide, eating disorders or self-harm”.

Companies that do not comply with the codes face fines similar to those under the social media ban, up to $49.5m per breach. For continued non-compliance, eSafety can also take other measures, such as requesting that sites be delisted from search results.

In Australia, the crisis support service Lifeline is 13 11 14. Children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800. In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. Other international helplines can be found at befrienders.org