A number of charities, including the Molly Rose Foundation, have said Meta’s announcement amounts to an acknowledgment that more could be done to protect children on Instagram.
Ged Flynn, chief executive of the charity Papyrus Prevention of Young Suicide, said that while it welcomed Instagram’s announcement, Meta was “neglecting the real issue that children and young people continue to be sucked into a dark and dangerous online world”.
“Parents contact us every day to say how worried they are about their children online,” he told the BBC.
“They don’t want to be warned after their children search for harmful content, they don’t want it to be spoon-fed to them by unthinking algorithms.”
Meanwhile Leanda Barrington-Leach, executive director at children’s charity 5Rights, said “if Meta is to take child safety seriously, it needs to return to the drawing board and make its systems age-appropriate by design and default”.
Burrows also cited prior research by the foundation which found Instagram still “actively” recommends harmful content about depression, suicide and self-harm to “vulnerable young people”.
“The onus should be on addressing these risks rather than making yet another cynically timed announcement that passes the buck to parents,” he added.
Meta disputed the organisation’s findings published last September, saying it “misrepresents our efforts to empower parents and protect teens”.