Trade Law Daily is a Warren News publication.
Pay to Push Misinformation?

Before Hearing, Facebook Bans Some Deepfake Videos, Democrats Lash Out

Facebook wants uniform tech platform standards for deepfakes, Vice President-Global Policy Management Monika Bickert plans to tell the House Consumer Protection Subcommittee Wednesday (see 1912310003). Her testimony was released the same day Facebook announced it’s banning manipulated content like deepfakes, which fabricate a person’s words. Democrats immediately criticized the change, saying the announcement initially indicated political candidates would be allowed to include manipulated media in ads. The platform issued a correction.

House Antitrust Subcommittee Chairman David Cicilline, D-R.I., called the new policy “insane,” claiming the policy change initially allowed politicians and candidates to pay to push misinformation through ads. Sen. Mark Warner, D-Va., also criticized the new policy, saying it ignores “low-tech synthetic media.”

Facebook spokesperson Andy Stone said he issued “inaccurate information” in the initial announcement. He clarified that “whether posted by a politician or anyone else, we do NOT permit in ads manipulated media content.”

Taking effect immediately, the change targets content edited “in ways that aren’t apparent to an average person and would likely mislead someone,” plus related content driven by artificial intelligence, wrote Bickert. The policy exempts parody, satire and video edited “solely to omit or change the order of words,” Bickert said. “The policy is designed to prohibit the most sophisticated attempts to mislead people,” she wrote in her remarks.

Like radio towers with “amplification power,” platforms have “public interest obligations” regarding false content, Harvard Research Director-Technology and Social Change Project Joan Donovan will testify. Doctored videos like the one of House Speaker Nancy Pelosi, D-Calif., (see 1906130048) pose societal threats, the academic says. “Malicious actors jeopardize how we make informed decisions about who to vote for and what causes we support.”

Regulation may be warranted to address dark patterns, University of Nebraska assistant law professor Gus Hurwitz will testify. Dark patterns are manipulative online tactics used to get internet users to carry out actions they normally wouldn’t, he says: for instance, a website might make it much harder to opt out of a service than to opt in. Still, the area is well-suited to industry self-regulation, Hurwitz wrote. FTC authority could be bolstered so the agency can report to Congress about problematic practices, he says.