Lines Blur on Censorship, CDT's Llanso Says; State Official Agrees Some Go Too Far
Some governments are using Internet companies' terms of service -- which typically restrict speech on a platform more than the law does -- to take down content that governments don't like, "blurring" the lines between actions taken by the public and private sectors, said Emma Llansó, head of the Center for Democracy and Technology's free expression project. This lets governments circumvent laws and legal processes to get what they want in their effort to curb terrorist use of online platforms for propaganda, recruitment, financing and planning activities, Llansó said during a George Washington University discussion Wednesday. That day, a House committee passed on a party-line vote legislation targeting terrorist content online (see 1603230056).
An official UK government program allows London's Metropolitan Police to use companies' content-flagging systems like any other user to flag speech or other material and have the company remove it (see 1602090020), Llansó said. She said companies such as Facebook, Google and Twitter have terms of service policies for good reasons: They're a way to create a certain kind of community and to deal with problems like online harassment. "When governments start using systems that are designed to enforce those terms, we're seeing government action that is going well beyond what they would actually be able to seek removal of from the Web if they went through due process," said Llansó. If the public begins to accept a government's skirting of due process rights, court action or independent review, "there's no stopping this spread of this sort of approach ... by a number of other governments and to cover a number of different kinds of content," she said.
Jason Pielemeier, who heads a State Department team focusing on Internet freedom, business and human rights, said what the UK is doing goes beyond what the U.S. has done, though he said he didn't think the UK was violating any human rights commitments. But he said the U.S. worries such efforts could be used by other governments to justify more restrictive content blocking or censorship. He also said some parts of the U.S. government tasked with tracking terrorists are wary of taking information off the Internet because they want to identify how terrorists communicate and then counter that messaging or take lawful law enforcement action. "We also recognize suppression of expression can be counterproductive, can raise the profile of offensive speech, and it can also cause it to fester in dangerous ways and move to deeper and darker places," said Pielemeier.
Google engages with government and with many different organizations and individuals to help them better use online platforms to "amplify their messages," said Alex Walden, public policy and government relations counsel. She said the company doesn't view that as working with government to get its message out, but helping all users strengthen their messages -- and put more good content out there than harmful information. She said someone who looks for the term "jihad" may find imams who are talking about positive values.
Walden said Google tries to keep YouTube "free from dangerous content" and has community guidelines that spell out what is and isn't allowed. She said the company doesn't allow extremist propaganda, incitement or glorification of violent acts, or videos that contain threats or promote hatred, terrorist recruitment, camps and beheadings. When a member of the community flags a video, the company's reviewers around the world examine it and decide whether to take it down. She said the Google team acts on 100,000 videos per day, each in under an hour. There are exceptions: Walden said some may find videos offensive due to violence, but Google may deem they have value in documenting atrocities and human rights violations.
Llansó said the most effective counter-messaging comes from "credible and authentic voices." She said the recent meetings between technology companies and the White House to develop counter-messaging -- dubbed, she said, "the Madison Valley Wood Project," for Madison Avenue, Silicon Valley and Hollywood -- "doesn't strike anybody" as necessarily credible or authentic. She said if government ratchets up pressure on private companies to change policies and be more responsible for third-party content, then both sides need to be more transparent. Otherwise, she said, the "ultimate risk is that we really just undermine people's faith in Internet platforms."