Six years before Meta introduced the feature that blurs nude images in direct messages (DMs) on its social networks, doubts about this type of content had already arisen internally, and the company now faces questions over the delay in implementing it.
In April 2024, Instagram introduced a series of tools aimed at protecting users against ‘sextortion’ and ‘grooming’ – the sexual harassment of a minor by an adult – through the sending of intimate images in DMs. These included a feature that blurred images in which nudity was detected, so that users could choose whether or not to view them.
Concern about the sending of such photographs through direct messages is much older. According to a statement recently revealed in a federal lawsuit, a 2018 email chain involving Guy Rosen, Meta's Vice President and Chief Information Security Officer, already warned that "horrible things" could happen.
"I think it is quite clear that problematic content can be sent in any messaging application, be it Instagram or any other," the head of Instagram, Adam Mosseri, recently declared in the context of the lawsuit against social networks over their potentially addictive and harmful design, as reported by TechCrunch.
Mosseri, questioned by prosecutors who wanted to know why the company had taken so long to implement basic protection measures in direct messages, said the company was trying to strike a balance between its interest in user privacy and its interest in safety.
His statement also revealed data from a survey of users between 13 and 15 years old: 19.2 percent acknowledged having seen nude or sexual images on Instagram that they did not want to see, and 8.4 percent had seen someone hurt themselves or threaten to do so within the last seven days of using the application.
This statement is part of the lawsuit accusing Meta, among other companies in the sector, of deliberately designing Instagram and Facebook features to be addictive for users, especially minors, with harmful consequences for their mental health.
Recently, Meta's CEO, Mark Zuckerberg, testified before a jury in Los Angeles (United States), where he was questioned about the features designed for its social networks and its strategy for attracting users.

The executive maintained that these features are designed to make Instagram "useful," not to deliberately increase users' time on the platform, attract minors, or cause damage to mental health.