The Washington Post

Utah governor signs bill to curb children’s social media use

The sweeping restrictions aim to curtail kid and teen use of social media apps such as Instagram and TikTok

Updated March 24, 2023 at 12:14 a.m. EDT|Published March 23, 2023 at 7:13 p.m. EDT

Utah Gov. Spencer Cox (R) signed two bills into law Thursday that impose sweeping restrictions on kid and teen use of social media apps such as Instagram and TikTok — a move proponents say will protect youth from the detrimental effects of internet platforms.

One law aims to force social media companies to verify that users who are Utah residents are over the age of 18. The bill also requires platforms to obtain parental consent before letting minors use their services, to give guardians access to their child’s account, and to set a default curfew on minors’ accounts.

The Utah regulations are some of the most aggressive laws passed by any state to curb the use of social media by young people, at a time when experts have been raising alarm bells about worsening mental health among American adolescents. Congress has struggled to pass stricter bills on online child safety despite bipartisan concern about the effects social media has on kids.

Both bills had previously passed the Utah state legislature.


“We’re no longer willing to let social media companies continue to harm the mental health of our youth,” Cox tweeted Thursday. “Utah’s leading the way in holding social media companies accountable — and we’re not slowing down anytime soon.”

The bill’s passage coincided with TikTok CEO Shou Zi Chew’s first appearance before Congress, during which he faced extensive grilling by lawmakers who say they are worried that the extraordinarily popular video app is hurting the welfare of children. They also said the company represented a national security threat because it is owned by Beijing-based ByteDance.

Tech companies have been facing increasing scrutiny by lawmakers and advocates over the effect of their services on adolescents. Last year, California state lawmakers passed the California Age-Appropriate Design Code Act, which requires digital platforms to vet whether new products may pose harm to minors and to offer privacy guardrails to younger users by default. But the tech industry group NetChoice sued to block the law, arguing that it violates the First Amendment and that tech companies have the right under the Constitution to make “editorial decisions” about what content they publish or remove.

Efforts to bolster federal rules governing how tech companies handle minors’ data and protect their mental and physical safety have stalled. Late last year, Senate lawmakers attempted to urge Congress to pass new online privacy and safety protections for children as part of an omnibus spending package.

Under the new Utah measures, tech companies must block children’s access to social media apps between 10:30 p.m. and 6:30 a.m., although parents would be allowed to adjust those limits. The platforms also must prohibit direct messaging by anyone the child hasn’t followed or friended, and they must block underage accounts from search results.

The Utah restrictions additionally bar companies from collecting children’s data and targeting their accounts with advertising. The effort also attempts to prohibit tech companies from designing features in their services that would lead to social media addiction among kids.

Privacy advocates say the bills go too far, and could put LGBTQ children or kids living in abusive homes at risk.

“These bills radically undermine the constitutional and human rights of young people in Utah, but they also just don’t really make any sense,” said Evan Greer, director of digital advocacy group Fight for the Future. “I’m not sure anyone has actually thought about how any of this will work in practice. How will a tech company determine whether someone is someone else’s parent or legal guardian? What about in situations where there is a custody battle or allegations of abuse, and an abusive parent is attempting to obtain access to a child’s social media messages?”

Common Sense Media, a media advocacy group for families, had a mixed reaction to Thursday’s news. In a statement on its site, the group said it supports one of the laws Utah passed, HB 311, which requires design changes to protect minors. The group does not support the second law, SB 152, which gives parents monitoring capabilities and requires parental consent to create social media accounts.

“Unfortunately, Governor Cox also signed SB 152 into law, which would give parents access to their minor children’s posts and all the messages they send and receive. This would deprive kids of the online privacy protections we advocate for.”

Industry groups have signaled that they have First Amendment concerns about the rules. NetChoice vice president and general counsel Carl Szabo said the group was evaluating next steps on the Utah law and was talking to other allies in the tech industry.

“This law violates the First Amendment by infringing on adults’ lawful access to constitutionally protected speech while mandating massive data collection and tracking of all Utahns,” Szabo said. In the past, NetChoice has teamed up with industry groups to challenge social media laws in Florida and Texas.

Social media platforms have increasingly faced scrutiny for exposing young people to toxic content and dangerous predators. Earlier this year, the Centers for Disease Control and Prevention found that nearly 1 in 3 high school girls reported in 2021 that they seriously considered suicide — up nearly 60 percent from a decade ago. And some experts and schools argue that social media is contributing to a mental health crisis among young people.

It’s unclear how tech companies would be able to enforce the age restrictions on their apps. The social media companies already bar children under the age of 13 from using most of their services, but advocates, parents and experts say kids can easily bypass those rules by lying about their age or using an older person’s account.

Tech companies such as Meta, TikTok and Snapchat have also increasingly been tailoring their services to offer more parental control and moderation for minors.

Meta global head of safety Antigone Davis said in a statement that the company has already invested in “age verification technology” to ensure “teens have age-appropriate experiences” on its social networks. On Instagram, the company automatically sets teens’ accounts to private when they join and sends notifications encouraging them to take regular breaks.

“We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us,” Davis said. “We’ll continue to work closely with experts, policymakers and parents on these important issues.”

Snap declined to comment.