
Tech Industry, Academia Slam Trump’s Possible FCC-FTC Content Proposal

The White House’s potential draft executive order directing the FCC and the FTC to police social media content moderation blatantly violates the First Amendment and is likely to face years of legal battles, representatives from the tech industry and academia told us. Days before President Donald Trump’s proposal emerged, senators from both parties said the tech industry’s content liability shield, which the draft order seeks to modify (see 1908090053), needs re-examination.



“I don’t think government ought to be censoring what people can see and not see and read and not read,” Sen. John Kennedy, R-La., said just before the August recess. Kennedy does believe Section 230 of the Communications Decency Act can be improved, given the lack of transparency surrounding the industry’s moderation algorithms: “They say trust us, but their track record does not merit the trust.”

It’s time for platforms to “open their books” to external audits, said Sen. Josh Hawley, R-Mo. He favors anything that “shines light on social media’s evasive practices.” Hawley introduced legislation in June (see 1906190047) to remove Section 230 immunity for big tech companies unless they prove to the FTC every two years that content removal decisions are politically neutral. On Hawley’s bill, Kennedy said, “I’m not ready to go there yet,” though he cited anecdotal evidence of Facebook purging conservative content and platform-critical stories. The company denies it's biased against any particular group.

Sen. Richard Blumenthal, D-Conn., told us he’s considering what Section 230 “changes might be appropriate.” Sens. Brian Schatz, D-Hawaii, and Mark Warner, D-Va., previously said they’re doing their own liability shield reviews. Senate Judiciary Committee Chairman Lindsey Graham, R-S.C., floated a legislative proposal in July (see 1907090062), saying tech companies should be required to follow industry-written best practices for protecting children’s online safety to “earn” the liability protections.

For months, Sen. Joe Manchin, D-W.Va., has threatened the tech industry over illegal opioid sales (see 1811270032). Asked whether he will push his long-discussed bill to amend Section 230, modeled on recently passed anti-sex-trafficking legislation, Manchin said, “Absolutely, whatever it takes if they don’t come to their senses.” A Manchin aide told us legislation remains on the table, but there’s currently no text. Manchin could introduce something quickly, but there’s been no formal movement yet, the aide said. “Absolutely, they should be held liable,” Manchin said. If “they’re pushing a product that’s killing West Virginians, I’m going to hold them liable.”

Ex-Rep. Chris Cox, R-Calif., who co-wrote Section 230 with Sen. Ron Wyden, D-Ore., warned Congress against weakening incentives for content moderation, which could make the internet “more of a jungle” filled with objectionable content. The Good Samaritan provision, which allows platforms to block or remove such content, is one important element of the statute Cox cited during an interview.

It’s possible for Congress to improve Section 230, but it’s essential not to undermine the positive incentives for moderation, and that’s a risk as any bill moves through the legislative process, Cox said. Lawmakers can start with a good bill, but by the time it makes its way through committees, markups, floor amendments and conference reports, you might not recognize the final product, he said.

Moderating content to ensure political neutrality is “not an area of specialization” for the FTC, but the FCC has history with the fairness doctrine, Cox continued. The difference with the since-rescinded doctrine is that the FCC oversaw a relatively small number of broadcasters, and therefore a manageable number of content providers, in contrast to the billions of content providers on the internet today, he said. But the FTC has authority over unfair and deceptive practices if platforms claim political neutrality, he noted, calling that a tried-and-true method for addressing businesses that fail to fulfill advertising promises.

Trump’s reported draft EO contradicts judicial interpretations of Section 230, would run into a “buzz saw” of First Amendment challenges and is “flatly inconsistent” with how Republicans view the fairness doctrine, said University of Minnesota Law School professor William McGeveran. Section 230 isn’t precise about whether platforms are closer to media outlets or to utilities, which have much less editorial discretion, McGeveran said. If they’re more like a newspaper, the prospect of directing the FCC and the FTC to control how platforms moderate content is “dead on arrival,” he said. The best-case scenario for the draft order is that it’s tied up in court for years, he said.

Section 230, though, isn’t “sacrosanct,” McGeveran said in support of a “thoughtful re-examination” from Congress. No one anticipated the current shape of the internet when the statute was written, and it could use some fine-tuning, he said. There are valid criticisms about privacy, hate speech and violence, he said, citing recent scrutiny of unmoderated platforms like 8chan (see 1908050052).

"The president announced at the social media summit that we were going to address" content moderation issues, a White House spokesperson emailed Friday. "The administration is exploring all policy solutions." Trump offered anecdotal evidence of social media bias and alleged throttling of his Twitter followers and activity at his Presidential Social Media Summit in July (see 1907110066).

Others are skeptical a congressional review and overhaul of the statute would produce a positive result. Subjecting fundamental internet laws to a “partisan maelstrom leading up to an election is of concern,” said Computer & Communications Industry Association CEO Ed Black. The liability shield provides the proper balance for platforms to handle objectionable content with varying moderation methods, he said.

Censoring content by proxy isn't in the FCC’s purview, Black said of an EO, and it would be difficult to translate the document's premise into the FTC’s review process. “It doesn’t seem a lot of the things the order wants would be easily undertaken at the agencies.” He added there’s little doubt there would be significant legal challenges. The FCC and FTC didn't comment.