Trade Law Daily is a service of Warren Communications News.
Age-Verification Issues

House Lawmakers Urged to Maintain Focus on Privacy in Online Safety Bills

Industry and consumer advocates have weighed in on nearly 20 kids' privacy and safety bills set for a subcommittee hearing Tuesday in the House Commerce Committee (see 2511250080). In written testimony posted over the weekend, some witnesses additionally warned the lawmakers against inadvertently weakening privacy protections in an effort to promote online safety.

Among the many bills that will be heard is the App Store Accountability Act, an age-verification proposal similar to laws in Texas and Utah (see 2510070020 and 2503050052).

Joel Thayer, president of the Digital Progress Institute, said in his testimony that the app store bill -- along with the Screen Act and Kids Online Safety Act (KOSA) -- is “poised to resolve many of the challenges parents are facing in today’s digital age with respect to child safety.”

“Despite what the tech companies trumpet in their press releases, parents are left with almost no resources to combat their encroachments” on children’s safety, and those companies are even “perpetuating the problem,” Thayer said. “Big Tech’s form of child exploitation pays well,” as “our children are not only Big Tech’s users but are also their product.” Kids are “feeding Big Tech’s algorithms to sell to advertisers” and being “used to inform their respective AI programs.”

Though tech lobbyists are “instilling fears of consumers forfeiting privacy and the stifling of speech … this is all a farce,” Thayer added. “Courts, regulators, and consumers have found every one of these companies to have violated their users’ personal privacy,” so “these companies are hardly an authority on proper privacy hygienics.”

Testimony from Kate Ruane, director of the Free Expression Project at the Center for Democracy & Technology, raised five “critical points” in protecting kids online. They included calls for Congress to address the “root causes of online harms including ... privacy”; to remember that “age assurance and verification create significant privacy risks that should be mitigated in legislation if the government requires or incentivizes its use”; and not to “unduly restrict states’ ability to act." In particular, Congress "must reject any false deal that conditions kids' online safety on preempting states' ability to regulate AI.”

She also argued that the dependency on advertising sales “is harmful because it is privacy invasive and increases the risks of data breaches and inappropriate government access to people’s private thoughts,” so data minimization principles should be enacted.

But the current versions of COPPA 2.0 and KOSA raise “new concerns around weakening existing privacy protections for minors,” she added. COPPA 2.0’s preemption standard would likely lead to harms by preventing the enforcement of state privacy laws, Ruane said. Instead, the wording should be changed so only “conflicting” state laws are preempted.

The age-verification requirements in KOSA and the Reset Act (HR-5837) are problematic as well, as currently available methods “have not eliminated the privacy and efficacy concerns raised by their use,” and age verification places an undue legal burden on access to speech, Ruane said. However, she noted her group's support for the Don’t Sell Kids’ Data Act (HR-6292).

Paul Lekas, executive vice president of global public policy for the Software & Information Industry Association (SIIA), wrote in his testimony that he appreciated the subcommittee “recognizing that there is no single solution to the concerns that have been raised involving youth privacy and safety."

He noted that SIIA is “encouraged” that many of the bills on the hearing agenda have “taken to heart the fundamental tension between free expression and the interests of youth online privacy and safety.” He encouraged legislators considering proposed bills to lean on court decisions such as the U.S. Supreme Court ruling in Free Speech Coalition v. Paxton (see 2506270041), the 9th U.S. Circuit Court of Appeals’ opinion in NetChoice v. Bonta (see 2408160015) and the recent 11th U.S. Circuit Court of Appeals ruling in CCIA & NetChoice v. Uthmeier (see 2511260042).

Age verification “requires robust data collection,” which “creates the opportunity for bad actors to attack and access that information,” whether it be “mass data breaches” or “the creation of centralized identity dossiers,” Lekas said. Methods like age estimation are “less certain than verification and prone to bias” but do “not create the same privacy risks as using hard identity documents.”

Though “age verification requirements are appropriate for websites with a significant amount of sexually explicit material," Lekas said that after the ruling in Free Speech Coalition, “Congress should look to different methods to incentivize the creation of more age-appropriate experiences and protect young people online.” That could include “creating tools that empower parents and youth with control over their data and their online experience” and “using signals to estimate ages and trigger additional guardrails based on company policies.”

Lekas said he appreciated the efforts “to bring COPPA into the 21st century,” such as aligning the law with “industry best practices of data minimization and ensuring that platforms only gather data essential to preserving child safety,” but more can be done.

For example, SIIA believes “targeted advertising -- as opposed to contextual advertising -- should not be allowed for youth,” he said. COPPA 2.0 should also “clarify that educational data may only be used for educational purposes when technology is used at the direction of the school,” which would resolve a conflict with the Family Educational Rights and Privacy Act.

Testimony will also be heard Tuesday on 16 other bills, including Sammy’s Law (HR-2657), which would require large social media platforms to make programming interfaces accessible to third-party safety software providers registered with the FTC. Those providers would alert parents when dangerous or unsafe content is shared on their child’s social media account.

The current process of manually reviewing children’s devices and social media accounts is “incredibly invasive” and can reveal “private conversations and content not connected to any harm or risk,” Marc Berkman, CEO of the Organization for Social Media Safety, said in submitted testimony. In contrast, third-party safety software is “more targeted, less invasive, and proven to work.”

“Sammy’s Law also strictly limits what safety software may share with parents,” Berkman said. “Data may only be disclosed if it relates to a current or imminent risk of harm,” ensuring that “children’s privacy is strengthened while enabling parents to act against real threats.”