LONDON, Feb 26 – Parents using Instagram’s child supervision tools will soon receive alerts if their teen repeatedly searches for suicide or self-harm related terms on the platform.

It is the first time parent company Meta will proactively alert parents to searches by their child on Instagram for harmful material, rather than blocking searches and directing users to external help.

Parents and teens enrolled in Instagram’s Teen Accounts experience in the UK, US, Australia and Canada will be notified about the alerts from next week, with the rest of the world to follow later.

But suicide prevention charity the Molly Rose Foundation has strongly criticised the measures, warning they “could do more harm than good”.

“This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good,” said its chief executive, Andy Burrows.

The organisation was established by the family of Molly Russell, who took her own life in 2017 at the age of 14 after viewing self-harm and suicide content on platforms including Instagram.

Burrows said “every parent would want to know if their child is struggling, but these flimsy notifications will leave parents panicked and ill-prepared to have the sensitive and difficult conversations that will follow”.

Meta says alerts to parents about their child searching for suicide and self-harm material within a short space of time on Instagram will also be accompanied by expert resources to help them navigate difficult conversations.

However, Molly Russell’s father Ian, who set up the Molly Rose Foundation in her honour, remains sceptical about the alerts.

“Imagine being a parent of a teenager and getting a message at work saying ‘your child is thinking of ending their life’… I don’t know how I’d react,” he told the BBC.

“And even if Meta say they’re going to supply support to that parent, in that moment of panic when you hear that about your child, I don’t think that’s a very sensible way of doing things.”

‘Neglecting the real issue’

A number of charities, including the Molly Rose Foundation, have said Meta’s announcement amounts to an acknowledgment that more could be done to protect children on Instagram.

Ged Flynn, chief executive of the charity Papyrus Prevention of Young Suicide, said that while it welcomed Instagram’s announcement, Meta was “neglecting the real issue that children and young people continue to be sucked into a dark and dangerous online world”.

“Parents contact us every day to say how worried they are about their children online,” he told the BBC.

“They don’t want to be warned after their children search for harmful content, they don’t want it to be spoon-fed to them by unthinking algorithms.”

Meanwhile, Leanda Barrington-Leach, executive director at children’s charity 5Rights, said “if Meta is to take child safety seriously, it needs to return to the drawing board and make its systems age-appropriate by design and default”.

Burrows also cited prior research by the Foundation indicating that Instagram still “actively” recommends harmful content about depression, suicide and self-harm to “vulnerable young people”.

“The onus should be on addressing these risks rather than making yet another cynically timed announcement that passes the buck to parents,” he added.

Meta disputed the organisation’s findings, published last September, saying the report “misrepresents our efforts to empower parents and protect teens”.