TikTok to Block Teenagers From Beauty Filters Over Mental Health Concerns

Teenagers are facing wide-ranging new restrictions on the use of beauty filters on TikTok amid concern over rising anxiety and falling self-esteem.

Under-18s will, in the coming weeks, be blocked from artificially making their eyes bigger, plumping their lips and smoothing or changing their skin tone.

The restrictions will apply to filters – such as “Bold Glamour” – that change children’s features in a way that makeup cannot. Comic filters that add bunny ears or dog noses will be unaffected. The billion-user social media company announced the changes during a safety forum at its European headquarters in Dublin.

The effectiveness of the restrictions will depend on users having registered with their real age, which is not always the case.

There has been widespread concern that the beauty filters – some provided by TikTok, others created by users – put pressure on teenagers, particularly girls, to adopt a polished physical appearance, with negative emotional repercussions. Some young people have described finding their real faces ugly after using the filters.

TikTok also announced it was tightening its systems to block users under 13, which could mean thousands of British children being turfed off the platform. Before the end of the year, it will begin trialling new automated systems that use machine learning to detect people cheating its age restrictions.

The moves come with tougher regulation of underage social media use in the UK looming in the new year, under the Online Safety Act. The platform already removes 20 million accounts every quarter worldwide for being underage.

“We’re hoping that this will give us the ability to detect and remove more and more quickly,” Chloe Setter, TikTok’s lead on child safety public policy, said.

People wrongly blocked will be able to appeal. “It can obviously be annoying for some young people,” said Setter, but she added that the platform will take a “safety-first approach.”

Ofcom said in a report last December that about 1 percent of TikTok’s total UK monthly active user base was removed for being underage between June 2022 and March 2023.

The regulator has previously warned that the effectiveness of TikTok’s age restriction enforcement is “yet to be established.” It is due to start strictly enforcing the over-13 age limit for social media users next summer, requiring “highly effective” age checks.

The new “guardrails” around beauty filters and age verification are part of a wave of online safety adjustments being announced by social media platforms before tougher regulations, which carry potentially heavy fines for breaches, come into force in the coming months.

Last week Roblox, the gaming platform with 90 million daily users, announced it would restrict its youngest users from accessing the more violent, crude and scary content on the platform after warnings about child grooming, exploitation and the sharing of indecent images.

Instagram, which is run by Meta, has launched “teen accounts” for under-18s, giving parents greater control over children’s activities, including the ability to block children from using the app at night.

“It will not escape anyone’s attention that these shifts are being announced largely to comply with EU and UK regulation,” Andy Burrows, the chief executive of the Molly Rose Foundation, which was set up to focus on suicide prevention, said. “This makes the case for more ambitious regulation, not less.”

He called for TikTok to be fully transparent about how its age assurance measures will work and their effectiveness at reducing the number of under-13s on the platform.

“TikTok should act quickly to fix the systemic weaknesses in its design that allows a torrent of harmful content to be algorithmically recommended to young people aged 13 or over,” Burrows added.

The NSPCC described the age protection move as “encouraging” but “just the tip of the iceberg.”

“Other social media sites must step up and find effective ways to assess the ages of their users,” said Richard Collard, the charity’s associate head of policy for child safety online. “Ofcom and the government also have an important role to play in compelling tech bosses to deliver age-appropriate experiences for all their users.”