TikTok is introducing new restrictions on beauty filters for users under 18, banning effects that alter features like eye size, lip volume, and skin tone.
The move aims to address concerns over rising anxiety and low self-esteem among teens. Filters such as "Bold Glamour" will be restricted, but those adding features like bunny ears or dog noses will remain unchanged, reports The Guardian.
The effectiveness of these measures relies on accurate age reporting, which isn't always the case. The changes were announced at a safety forum in Dublin, following concerns about the negative emotional impact of beauty filters on teens, especially girls.
TikTok revealed it will tighten its measures to block users under 13, which could lead to thousands of British children being removed from the platform. By the end of the year, it will trial new automated systems using machine learning to detect users evading age restrictions.
The moves come with tougher regulation of underage social media use in the UK looming in the new year, under the Online Safety Act. The platform already removes 20m accounts every quarter worldwide for being underage.
Chloe Setter, TikTok’s lead on child safety public policy, said: “We’re hoping that this will give us the ability to detect and remove more and more quickly.”
People wrongly blocked will be able to appeal. “It can obviously be annoying for some young people,” said Setter, but she added that the platform will take a “safety-first approach”.
Ofcom said in a report last December that from June 2022 to March 2023 about 1% of TikTok’s total UK monthly active user base were removed for being underage.
The regulator has previously warned the effectiveness of TikTok’s age restriction enforcement is “yet to be established”. It is due to start strictly enforcing over-13 age limits for social media users next summer, requiring “highly effective” age checks.
The new “guardrails” around beauty filters and age verification are part of a wave of adjustments to online safety being announced by social media platforms before tougher regulations are enforced in the coming months, with potential heavy fines for breaches of online safety rules.
Last week Roblox, the gaming platform with 90 million daily users, announced it would restrict its youngest users from accessing the more violent, crude and scary content on the platform after warnings about child grooming, exploitation and the sharing of indecent images.
Instagram, which is run by Meta, launched “teen accounts” for under-18s to give parents greater control over children’s activities, including the ability to block children from using the app at night.
Andy Burrows, the chief executive of the Molly Rose Foundation, which was set up to focus on suicide prevention, said: “It will not escape anyone’s attention that these shifts are being announced largely to comply with EU and UK regulation. This makes the case for more ambitious regulation, not less.”
He called for TikTok to be fully transparent about how its age assurance measures will work and their effectiveness at reducing the number of under-13s on the platform.
Burrows added: “TikTok should act quickly to fix the systemic weaknesses in its design that allow a torrent of harmful content to be algorithmically recommended to young people aged 13 or over.”
Bd-pratidin English/ Afia