30 security tools on Instagram completely failed

Sun, Oct 2025, 02:18:12



A new study has found that the safety tools Instagram introduced to protect teenagers are largely ineffective. The research, conducted jointly by several child-protection organizations including Cybersecurity for Democracy, revealed that content encouraging suicide and self-harm can still easily be found on teenagers' accounts.

Instagram launched 'Teen Accounts' in 2024 with the aim of making the platform safer for teenagers through parental supervision. Of the 47 safety tools the researchers tested, however, 30 were found to be ineffective or no longer available.

The study also accused Instagram's recommendation algorithm of encouraging children under the age of 13 to engage in sexualized and risky behaviour in pursuit of 'likes' and 'views'.

The Molly Rose Foundation, a child-protection charity, sharply criticized the findings as the result of a corporate culture at Meta that prioritizes engagement and profit over user safety. The foundation was established in memory of Molly Russell, a British teenager who took her own life in 2017 after exposure to harmful online content.
