Facebook Acknowledges Hate Speech Moderation Technology Lagging
Facebook's content moderation technology for hate speech is lagging behind its systems for flagging adult and violent content, the company said Tuesday. Facebook took down 21 million pieces of adult content in Q1, took down or applied warnings to about 3.5 million pieces of violent content and removed 2.5 million pieces of hate speech. Only about 38 percent of hate speech was flagged by Facebook technology, the platform said. Its technology identified about 96 percent of adult content before it was reported and 86 percent of violent content.