The Chinese social network TikTok confirmed this Wednesday to the American outlet The Verge that it will globally prevent users under 18 from using effects that alter their appearance, namely beauty filters, a measure intended to protect their mental health.
TikTok said yesterday in a statement that the measure will be rolled out “in the coming weeks” and that it responds to a study it commissioned from the British NGO Internet Matters, published yesterday, which raises concerns about the impact of these filters on minors’ sense of identity.
The social network, which held a safety forum in Dublin this week, said there is a “clear distinction” between filters designed to be “obvious and fun,” such as those that add animal features, and those that “alter your appearance” in a way that is almost undetectable to viewers.
In its statement on the forum, TikTok promised to restrict the use of “some appearance effects” by those under 18, and its head of Public Safety and Wellbeing Policies in Europe, Nikki Soo, confirmed that the measure will be applied worldwide, The Verge reports.
The platform maintains that it already “proactively” tells users when certain effects have been used in the content they see, but says it will now give “more information about how an effect can change their appearance” and help users understand the “undesired results” these effects can have.
Last month, fourteen US state attorneys general sued TikTok for harming children’s mental health, accusing it of using an addictive content system to profit from younger users; the suit directly targets the use of this type of filter.
Specifically, they alleged that “beauty filters” can lower self-esteem, especially among young girls, and cited studies according to which 50% feel they do not look good without editing their faces and 77% say they try to change or hide some part of their body with that tool.
TikTok said in its statement that it has 175 million monthly users in Europe and that it removes 6 million accounts created by children under 13 (its minimum age) every month; to strengthen those barriers it is working with NGOs, legislators and regulators, and is considering the use of machine-learning technology.
It also announced that in the coming weeks it will offer users in 13 European countries local helplines staffed by experts when they “report content in the app about suicide, self-harm and harassment,” content the platform already reviews and removes if it finds that its policies have been violated.