‘Lot of Momentum’

Senate Commerce Poised for Potential Kids Privacy Markup

The Senate Commerce Committee is poised to mark up legislation that would establish a duty of care requiring social media platforms to protect children’s online privacy (see 2202160055), bill supporters told us.

The Kids Online Safety Act, from Senate Consumer Protection Subcommittee Chairman Richard Blumenthal, D-Conn., and ranking member Marsha Blackburn, R-Tenn., has the bipartisan support it needs to advance through full committee, observers said.

Chair Maria Cantwell, D-Wash., has been eyeing a markup of Big Tech legislation since late May (see 2205250066). Blumenthal’s office directed scheduling questions to her office. Blackburn’s office didn’t comment. Four Democrats and four Republicans signed on to S-3663.

“I’ve heard that they do want to get to it, so we’ll see if we’re actually able to,” said American Principles Project Policy Director Jon Schweppe. His organization supports the bill, along with Common Sense Media, the American Psychological Association, the 5Rights Foundation, American Compass, the Internet Accountability Project and the Digital Progress Institute. “I imagine it wouldn’t be that controversial in committee,” said Schweppe. “I got the impression you might get one or two stray objectors, but Blumenthal seems to have the Democrats on this, too.”

“That’s the scuttlebutt -- that it’s going to come to a markup,” said Digital Progress Institute President Joel Thayer. “I think there’s a lot of momentum mainly because it has bipartisan support.” Protecting kids' privacy is a “popular privacy objective,” and this bill provides a narrowly tailored approach without getting caught up in broader privacy issues, he said.

The bill would help ensure platforms are correcting products that are harmful by design, said Common Sense Media Policy Counsel Irene Ly. Platforms are aware there’s content that promotes eating disorders and suicidal ideation, she said. Facebook whistleblower Frances Haugen told the Senate Consumer Protection Subcommittee that when she testified (see 2110050062). Blumenthal and Blackburn have been pursuing legislation since the hearing. “It’s really easy for someone to look at one post or one video that’s just promoting healthy eating or like a fitness routine and then you end up down a rabbit hole of progressively more problematic stuff, going from healthy eating to disordered eating,” said Ly. “This happens a lot, and they’re aware of this.”

Meta, Snap and TikTok are facing lawsuits filed earlier this year over the suicide of an 11-year-old Connecticut girl and her social media addiction. The platforms are “unreasonably dangerous” because they’re designed to hook children and subject them to predators and sexual exploitation, Social Media Victims Law Center founder and attorney Matthew Bergman told us. He filed three similar cases and is preparing additional complaints dealing with self-harm and suicide. According to filings, Selena Rodriguez was on social media at “all hours of the day and night” communicating with some 2,500 users. This included conversations with adults who “bullied” the 11-year-old into trading sexually explicit images. She posted a six-second suicide video on Snapchat in July 2021, in which she ingested a fatal dose of Wellbutrin.

A Meta spokesperson highlighted mental health resources for parents and children on the company’s platforms. That includes parental controls on Instagram like time limits for usage. The company cited tools for preventing suicide and self-harm, including crisis hotlines and decreased visibility of objectionable content. The platforms also have controls for removing "like" counts on posts, Meta said. The company declined to comment about ongoing litigation. Snap and TikTok didn’t comment.

Research indicates social media could be altering brain development for young users, said American Psychological Association Chief Scientist Mitch Prinstein. Pending research shows deactivation of the prefrontal cortex, which controls inhibition, for young social media users, he said. It corresponds with heightened activation of areas of the brain that make kids focus on self-motivated rewards and dopamine hits. “That is very much a model that is consistent with addiction -- a hypersensitivity and craving for rewards and a decreased inhibition to avoid pursuing those rewards,” he said.