In late 2024, Meta introduced Instagram Teen accounts, a safety feature aimed at protecting young minds from sensitive content and ensuring safe online interactions, backed by age-detection technology. Teen accounts are automatically set to private, with hidden words enabled and messages from strangers blocked.
According to an investigation by youth-focused non-profits Design It For Us and Accountable Tech, Instagram's teen guardrails do not deliver on their promise. Over two weeks, five teen test accounts were run, and all of them were shown sexual content despite Meta's promises.
A barrage of sexual content
All test accounts were served inappropriate content despite having the sensitive content filter enabled in the app. "4 out of 5 of the test Teen accounts were algorithmically recommended sexual and disturbing content," the report says.
Moreover, 80% of the participants reported feeling distressed while using Instagram Teen accounts. Notably, only one of the five test accounts was shown educational images and videos.
"[Approximately] 80% of the content in my feed was related to crude sexual innuendo or jokes. The content stopped short of being fully explicit or showing direct graphic imagery, but it also left very little to the imagination," one of the testers was quoted as saying.
According to the 26-page report, 55% of the flagged content featured sexual acts, sexual behavior, and sexual imagery. These videos accumulated hundreds of thousands of likes, with one exceeding 3.3 million likes.
The Instagram algorithm also pushed content reinforcing harmful ideas such as "ideal" body types, body shaming, and disordered eating habits. Another topic of concern was videos promoting alcohol consumption and videos encouraging users to take steroids and supplements to achieve a particular masculine physique.
A complete package of bad media
Despite Meta's claims of filtering problematic content, especially for teenagers, the test accounts were also shown racist, homophobic, and misogynistic content. Again, these clips gathered millions of likes. Videos depicting gun violence and domestic abuse were also pushed to the teen accounts.

"Some of our test Teen accounts did not receive Meta's default protections. One account did not receive sensitive content controls, while some did not receive protection from offensive comments," the report adds.
This is not the first time Instagram (and Meta's other social media platforms, in general) has been found serving harmful content. In 2021, leaked documents revealed that Meta was aware of Instagram's harmful effects, particularly on young girls struggling with mental health and body image issues.
In a statement to The Washington Post, Meta claimed the report's findings are flawed and mischaracterize its safety tools. A little more than a month ago, the company also extended its teen protections to Facebook and Messenger.
"This manufactured report does not change the fact that tens of millions of teens now have a safer experience thanks to Instagram Teen accounts," a Meta spokesperson was quoted as saying. However, they added that the company was looking into the report's recommendations on problematic content.