TikTok Announces Changes for Under-18s Using Beauty Filters
Is the needle moving on child online safety?
Last week, TikTok announced that teenagers worldwide will face wide-ranging new restrictions on the use of beauty filters, amid concerns about rising anxiety and falling self-esteem.
Many teens report that after using filters like ‘Bold Glamour’ they immediately feel less attractive, yet they feel pressured to keep using the filters to conform with everyone else online.
“When you’ve got that filter up all the time … you almost disassociate from that image in the mirror because you have this expectation that you should look like that. Then when you don’t, the self-destructive thoughts start. It’s quite vile the way that you then perceive yourself.”
In a 2016 study, girls aged 14 to 18 were shown 10 ‘original’ or ‘filtered’ selfies and asked questions about their body image immediately afterwards. Exposure to the manipulated photos directly led to lower body image, especially for girls with higher self-comparison tendencies.1
Research has also shown that labelling filtered images as edited makes little difference to viewers unless they are also shown the person’s real, unedited appearance.
Under-18s will, “in the coming weeks,” be blocked on TikTok from artificially enlarging their eyes, plumping their lips, and smoothing or changing their skin tone, effects that cannot be recreated by makeup alone. Face-altering filters remain available on Instagram and Snapchat.
An obvious question for me is how TikTok will age-gate these changes, a question that comes up whenever platforms roll out these apparent ‘safety changes’. TikTok announced it is ‘experimenting’ with age-verification technology, and that before the end of the year it will launch a trial of new automated systems that use machine learning to detect people cheating its age restrictions. However, this will apply to under-13s only. So if a user is 14, what’s stopping them from changing their birth date to say they are over 18 and continuing to use filters like ‘Bold Glamour’? Nothing. The same question hangs over Instagram’s new “Teen Accounts,” which sound great on the surface, but there is still no plan for how age will be verified (so the changes may turn out to be useless).
On the surface, these announcements are promising. Even if the changes are small, they’re something, and they will likely help some kids. But it’s also clear that many of these announcements are smoke and mirrors until the platforms close the loop on how they will verify age. It feels like they want the positive publicity upfront, then hope the follow-through will be forgotten.
I remain steadfast that the most impactful changes happen inside the home. Parents can’t control the safety measures platforms put in place or how they’re enforced (or whether they’re enforced at all). But they can control, to the best of their ability, how early their kids gain access to these platforms and the conversations they have around digital safety. The more parents who believe TikTok and the like aren’t necessary (or healthy) for their child’s development, and who come together in choosing to delay, the less their kids will need it, and the less they will compare themselves to the unattainable beauty standards these platforms try to shove down their throats.
1. ‘Picture Perfect: The Direct Effect of Manipulated Instagram Photos on Body Image in Adolescent Girls’, https://www.researchgate.net/publication/311716652_Picture_Perfect_The_Direct_Effect_of_Manipulated_Instagram_Photos_on_Body_Image_in_Adolescent_Girls