‘Personal Responsibility’

Wyden: Section 230 Doesn’t Protect AI-Generated Content

Generative-AI content shouldn’t be protected by Communications Decency Act Section 230, Sen. Ron Wyden, D-Ore., told a New America event Tuesday.

Legislators on Capitol Hill have been discussing how to regulate AI products and address the harms created by AI-generated content. Some have drawn attention to AI chatbots like ChatGPT spreading misinformation that creates real-world harm. Section 230 isn’t about protecting generated content, Wyden said, noting he’s on the record saying ChatGPT shouldn’t enjoy the statute’s liability protections. Wyden co-authored Section 230 with then-Rep. Chris Cox, R-Calif., who has also said he opposes applying the liability protection to AI models.

Sens. Josh Hawley, R-Mo., and Richard Blumenthal, D-Conn., introduced legislation (see 2306150059) earlier this month seeking to clarify that Section 230 doesn’t apply to claims based on generative AI activity. It would allow individuals to sue AI companies in federal and state court. Wyden said Tuesday he doesn’t support shielding generative AI content from liability, but Congress should wait and consider the implications. Generative AI first needs to be defined, he said, noting the term can be interpreted to include search engines. “I don’t see a rush to move here,” he said, arguing privacy is the bigger concern on the tech front.

Section 230 protects consumers’ right to make choices online and take “personal responsibility” for those choices, said Wyden: Because of the statute, platforms can moderate content without fear of lawsuits. He opposed efforts to chip away at the statute, citing the Stop Enabling Sex Traffickers Act (SESTA) and the Fight Online Sex Trafficking Act (FOSTA) as an example of where things can go wrong (see 2305040062). SESTA-FOSTA has done nothing to protect victims or bring sex traffickers to justice, he said: It has driven sex work onto the dark web and enabled violence against sex workers. It’s a preview of a world without Section 230, in which online conversations are silenced, he said. Repealing the statute would put immense pressure on websites to take down content that offends “people with power” and anything else outside the mainstream, said Wyden.

Several panelists cited the importance of Section 230 to websites like Wikipedia and to library services offering online resources. Section 230 makes it possible for Wikipedia to host content without having to interfere with the volunteer work that makes the website possible, said Wikimedia Foundation Vice President-Global Advocacy Rebecca MacKinnon. Digital media strategist Andrew Lih noted 99% of Wikipedia’s content is created and curated by volunteers, which has allowed it to become the No. 1 reference site in almost every language in the world. That wouldn’t be possible without Section 230, he said.

Because libraries are regarded as interactive computer services under Section 230, they rely on the statute’s liability protections to foster intellectual and political discourse, said Katherine Klosek, Association of Research Libraries information policy and federal relations director. She said ARL has asked congressional staff to give libraries a seat at the table in discussions about changes to Section 230.