
When my son Conall took his life at age 17, it was devastating to me and our entire family. Conall was a caring, empathetic, and bright young man, but he had also struggled for many years with anxiety and depression. Like many kids his age, much of his life revolved around social media.

While I understand that suicide is complex, with many different factors contributing to someone’s risk for it, I know that social media intensified and fueled Conall’s insecurities and ultimately had a significant impact on his decision to die by suicide.


As a pediatrician for more than 30 years, I’ve seen this happen more and more often. Throughout the course of my work, I’ve seen the evolution of the internet and its impact on children from long before the days when everyone had a smartphone. I’ve witnessed tremendous changes in the kids I’ve taken care of — there’s no question that children in the U.S. are growing up in a fundamentally different world than the one their parents knew.

Today, kids live in a fully digital ecosystem. Few adults stop to think about the fact that the internet and social media platforms were never designed with kids in mind. I’ve been seeing more and more children with anxiety, depression, social isolation, insomnia, and other mental health challenges that were rare among youngsters when I began my career. While these cannot be wholly attributed to social media, it’s hard to ignore the impact these platforms have had on kids and teens.

Most of the parents who come into my office are doing their best to protect their kids. And while parents and pediatricians know that something needs to change, we also know we can’t do it alone. Social media platforms are designed to keep kids engaged — pushing them toward in-app purchases, targeting them with a mix of images and videos to keep them scrolling, serving up content they are keen to see but that is inappropriate for their development. Congress could help parents and pediatricians keep children safe online, now and in the future. But so far it hasn’t.


Lawmakers on Capitol Hill are now closer than ever to taking meaningful action to make social media safer for kids. For the first time in years, there is real momentum behind a bill, the Kids Online Safety Act (KOSA), that would establish common-sense safeguards to protect kids from digital features that push harmful content to minors. This includes requiring online platforms to enable the strongest privacy and safety settings for minors by default, giving parents new controls, and setting a high standard to hold platforms accountable by creating a duty of care to keep kids safe from clearly defined harms.

The bill already has overwhelming bipartisan support in the Senate — in fact, enough senators support it that it can’t be blocked from a vote — and it has just been introduced in the House of Representatives. As another sign of momentum, it’s about to be considered in a congressional hearing on Wednesday alongside important legislation that would update the Children’s Online Privacy Protection Act, putting an end to data practices that prey on young people. This is the most traction I have seen for legislation of this magnitude and this level of importance to kids’ digital health.

The internet is, and should continue to be, a part of kids’ lives. There are many benefits to growing up in a digital age, when it’s possible to connect with the world at the click of a button. For many kids, the internet has meant finding supportive community spaces, discovering new ways to connect with their friends, and gaining perspective on the world as they find their place in it.

But the very real negative effects of what kids experience online every day can no longer be ignored. Right now, too many kids spend a lot of time online in spaces programmed to feed them highly engaging and customized content that takes advantage of their developing brains for commercial gain. Many of these platforms are intentionally built to keep kids’ eyes on the screens for as long as possible, leading to excessive use and preventing them from developing healthy relationships and self-control. What’s more, these algorithms continuously amplify harmful content — leading children and teens down dangerous rabbit holes of videos and images that promote eating disorders, glorify self-harm, and more.

A study from the Center for Countering Digital Hate, for instance, found that within eight minutes of account creation, TikTok was recommending eating disorder content in the feeds of 13-year-olds who had briefly paused on videos about body image and mental health.

That shouldn’t be acceptable.

There are millions of parents like me who have watched our children come of age in an unrestricted digital universe devoid of guardrails that can keep them safe. It’s time to take action to protect children online.

I know I can’t bring Conall back, or take away the pain of losing him. But there are many kids like him out there who need help now. On behalf of pediatricians and parents across America, I urge Congress to do the right thing and help us help kids by passing the Kids Online Safety Act.

Molly O’Shea, M.D., is a pediatrician in Beverly Hills, Michigan.


If you are thinking about suicide, or concerned about a loved one, contact the 988 Suicide and Crisis Lifeline by calling or texting 9-8-8 or visiting www.988lifeline.org.
