YouTube moves to protect teen mental health by limiting repeated recommendations of content promoting body image issues

  • Google’s video-sharing platform has identified categories of content to limit, such as videos idealising certain body types or displaying social aggression
  • Action taken in the US a week after several states accused Facebook and Instagram owner Meta of damaging children’s well-being
Agence France-Presse

YouTube has tweaked its recommendation system in the United States to prevent teens from bingeing on videos idealising certain body types. Photo: AFP

YouTube said it has tweaked its recommendation system in the United States to prevent teens from bingeing on videos that idealise certain body types. The change comes a week after several US states accused Facebook and Instagram owner Meta of profiting “from children’s pain”, damaging their mental health and misleading people about the safety of its platforms.

YouTube’s video recommendation engine has been targeted by critics who contend it can lead young viewers to dark or disturbing content.

Google-run YouTube has responded by ramping up safety measures and parental controls on the globally popular platform.

Google-run YouTube has ramped up safety measures and parental controls in the hope of protecting teens. Photo: AFP

Working with an advisory committee, YouTube identified “categories of content that may be innocuous as a single video, but could be problematic for some teens if viewed in repetition”, YouTube director of youth and kids product James Beser said in a blog post.

Categories noted included “content that compares physical features and idealises some types over others, idealises specific fitness levels or body weights, or displays social aggression in the form of non-contact fights and intimidation”.

YouTube now limits repeated recommendations of such videos to teens in the United States and will extend the change to other countries over the coming year, according to Beser.

“Teens are more likely than adults to form negative beliefs about themselves when seeing repeated messages about ideal standards in content they consume online,” Beser said.

“These insights led us to develop additional safeguards for content recommendations for teens, while still allowing them to explore the topics they love.”

YouTube’s community guidelines already ban content involving eating disorders, hate speech and harassment.

YouTube believes teens are more likely than adults to form negative beliefs about themselves while consuming certain kinds of online content. Photo: Shutterstock Images

“A higher frequency of content that idealises unhealthy standards or behaviours can emphasise potentially problematic messages – and those messages can impact how some teens see themselves,” Youth and Family Advisory Committee member Allison Briscoe-Smith, a clinician, said in the blog post.

“Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and size up how they want to show up in the world.”

YouTube use has been growing, as has its advertising revenue, according to earnings reports from Google parent Alphabet.

US Surgeon General Vivek Murthy earlier this year urged action to make sure social media environments are not hurting young users.

“We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis – one that we must urgently address,” Murthy said in an advisory.

A few states have passed laws barring social media platforms from allowing minors to use them without parental permission.

Meta said last week that it was “disappointed” by the suit filed against it, and that the states should instead work with companies across the social media industry to create age-appropriate standards.
