Jim Gamble is the chief executive of the INEQE Safeguarding Group.
He said the creation and distribution of AI-generated images is a problem that knows no “geographical bounds”.
He described the ordeal as “traumatising” not just for the victims, but also for their families, and for the young people who created the images.
Gamble said that the accessibility of AI apps, and the fact that many are not age restricted, means perpetrators may believe that “because they were able to go online and search for a nudification app and find one, then it may not possibly be unlawful”.
That, he said, is not the case, and “the unfortunate thing is that these young people are committing the exact same criminal offence as creating an indecent pornographic image”.
Gamble has called for calm, and urged anyone who may have created an image, or been the victim of one, to come forward to their school and to the police.
“Everybody needs to pause and reflect that we are dealing with children and do the right thing so everybody walks away from this with lessons learned and lives intact.”