Parents using Instagram’s child supervision tools will soon receive alerts if their teen repeatedly searches for suicide- or self-harm-related terms on the platform.
It is the first time parent company Meta will proactively alert parents to their child searching for harmful material on Instagram, rather than simply blocking those searches and directing users to external help.
Parents and teens enrolled in Instagram’s Teen Accounts experience in the UK, US, Australia and Canada will be notified about the alerts from next week, with the rest of the world to follow later.
But the suicide prevention charity the Molly Rose Foundation has strongly criticised the measures, warning they “could do more harm than good”.
“This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good,” said its chief executive Andy Burrows.
The organisation was established by the family of Molly Russell, who took her own life in 2017 at the age of 14 after viewing self-harm and suicide content on platforms including Instagram.
Burrows said “every parent would want to know if their child is struggling, but these flimsy notifications will leave parents panicked and ill-prepared to have the sensitive and difficult conversations that will follow”.
Meta says that when a teen searches for suicide and self-harm material repeatedly within a short space of time, the alert sent to parents will be accompanied by expert resources to help them navigate difficult conversations.
However, Molly Russell’s father Ian, who set up the Molly Rose Foundation in her honour, remains sceptical about the alerts.
“Imagine being a parent of a teenager and getting a message at work saying ‘your child is thinking of ending their life’… I don’t know how I’d react,” he told the BBC.
“And even if Meta say they’re going to supply support to that parent, in that moment of panic when you hear that about your child, I don’t think that’s a very sensible way of doing things.”