‘Earned’ S. 230 Immunity

Instagram Head Wants Industry Best Practices on Kid Safety

The tech industry should create a regulatory body to set best practices for protecting children, and Communications Decency Act Section 230 immunity should be earned through adhering to those protections, Instagram Head Adam Mosseri told the Senate Consumer Protection Subcommittee at a Wednesday hearing. That regulatory body should gather input from civil society and regulators about universal protections, including age verification, age-appropriate design and parental controls, Mosseri said. TikTok Public Policy Head Michael Beckerman backed standardized age verification in November (see 2111090076).

Chairman Richard Blumenthal, D-Conn., told reporters the subcommittee plans more hearings on child safety, potentially with former Facebook employees. “Legislation is coming,” said Blumenthal during the hearing. “We’re here to do more than shake fists. We really are seeking solutions.” He welcomed input from the tech industry but criticized Instagram for announcing new parental controls the night before the hearing, which he considered a public relations stunt. The parental controls should have been announced years ago, he said.

Congress must pass a new federal privacy law and kid-specific legislation to keep minors safe online, said ranking member Marsha Blackburn, R-Tenn. She said her staff created a mock Instagram account for a 15-year-old user. Though Instagram is supposed to default minors’ accounts to private, the account was immediately set to public, she said. Mosseri called it an oversight that will be fixed.

Instagram’s new parental controls allow parents to set a three-hour daily time limit for young users, noted Sen. Amy Klobuchar, D-Minn., questioning whether that’s an appropriate amount of time. Mosseri said he understands the concerns about screen time, but the limit is ultimately at the parent’s discretion.

Mosseri told Sen. Ed Markey, D-Mass., he supports restricting advertising targeting for young users: “It’s valuable for ads to be relevant, but I do believe that some measures need to be taken to keep children safe.” Markey is seeking to ban all targeted ads for teens and children through legislation. Mosseri wouldn’t commit to such a ban, or to scrapping plans to build an Instagram for children. The project remains “paused,” said Mosseri.

Sen. John Thune, R-S.D., questioned Mosseri about the need for algorithmic transparency and user choice over content, citing his Pact Act and Filter Bubble Transparency Act. The Pact Act (S.797), which he reintroduced with Sen. Brian Schatz, D-Hawaii, picked up two more sponsors this week: Sens. Shelley Moore Capito, R-W.Va., and John Hickenlooper, D-Colo.

Blumenthal asked Mosseri whether he would support a legal requirement for platforms to allow researchers access to study algorithms. Mosseri said he supports transparency requirements and standards for algorithms.

House Communications Subcommittee member Rep. Peter Welch, D-Vt., said he believes there’s an opening for Congress to act on his proposal for a new national digital authority to rein in Big Tech (see 2110080041), citing the continued lack of lawmaker consensus on how to revamp Section 230. There’s clearly an “appetite for reform” on 230 and other matters involving Big Tech, but the lack of consensus on display at legislative hearings like the one House Communications held last week (see 2112010058) and the sheer scope of issues involved show it’s going to be difficult to address them all in one bill, Welch said during a Practicing Law Institute event. Congress would also inevitably find itself playing “legislative Whac-A-Mole” to address emerging social media issues, he said.

An independent federal agency is needed now because the U.S. is facing the same type of reckoning with social media as it did at “previous points in our history” involving “the emergence of a new technology” or transportation system that was resulting in fundamental changes to society, Welch told PLI (see 2112080036). An agency could be “empowered” with sufficient staff resources, rulemaking and investigatory authority that would allow it to engage in “a more comprehensive approach” on social media and other Big Tech entities, he said. He acknowledged House Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., is among those who strongly back creation of a privacy-focused FTC bureau to tackle some Big Tech issues, but believes “we need a dedicated” entity just focused on the tech sector to fully address the situation.