Algorithms' Role a Focus

House Subcommittees to Take Aim at Data Privacy, Content Filtering Bias in Hearing

House Communications and Digital Commerce subcommittees are expected to delve further into their concerns about tech companies' data privacy policies during a Wednesday hearing, we are told. The hearing, on how the use of algorithms affects consumer privacy and choice with online content, follows scrutiny elsewhere on Capitol Hill last month of tech firms' handling of online advertising in Russian-led disinformation campaigns during the 2016 U.S. presidential election (see 1711210025 and 1711020001). The hearing will begin at 10 a.m. in 2123 Rayburn.



House Commerce Committee Republicans appear interested primarily in tech firms' communications with users about their data collection policies, how they collect users' data and how that data informs the companies' decisions about providing content to users, sector lobbyists said. Some in the industry will be closely watching the extent to which lawmakers target specific companies' practices, especially those of Facebook, Google and Twitter. The companies were under scrutiny in the online political ads hearings and recently faced criticism from Capitol Hill on other issues. One lobbyist noted FCC Chairman Ajit Pai's Tuesday R Street Institute-Lincoln Network speech, in which he said edge providers like Twitter are a bigger threat to net neutrality than ISPs (see 1711280024).

A hearing potentially hostile toward tech wouldn't be “unexpected” given the additional scrutiny the industry has received on Capitol Hill in recent months, including on online ad disclosures and online sex trafficking, an industry lobbyist said: “The next logical step” was for lawmakers to move on to data policies. “We'll be watching to see whether leadership indicates they believe there should be additional regulation” on those issues, including whether House Communications Chairman Marsha Blackburn, R-Tenn., decides to take the opportunity to push her Balancing the Rights of Web Surfers Equally and Responsibly Act (HR-2520), the lobbyist said. HR-2520 would make the FTC the privacy regulator for ISPs and edge companies and would require opt-in consent even for web browsing data. Advancement of the bill stalled earlier this year amid widespread Democratic opposition (see 1705190053 and 1706280058).

A House Commerce GOP staff memo highlighted issues with net firms' data policies, including incidents like the Equifax data breach, evolving use of cookies and other tools to track users' online behavior, and companies' privacy protection practices. Understanding those issues is important because of concerns about “how content is shaped over private social media platforms,” the memo said: Content prioritization and filtering via algorithms raised “concerns about intentional or unintentional bias being built into these machine-based decision-makers during their development.” Many other “controversial decisions regarding content moderation are made not by algorithms, but by employees enforcing or developing internal guidelines, which may or may not be public,” the memo said. “In the context of concerns about the diversity of the employees responsible for making these decisions, questions of bias, influence, and control are magnified.”

House Commerce Democrats also will have questions about content filtering bias and privacy concerns, a Democratic House aide said. Ranking member Frank Pallone, D-N.J., sent a letter to Google parent Alphabet, Facebook and Twitter in October in which he raised concerns that content policies may be tilted toward increasing page views and ad clicks (see 1710230061). Democrats feel “the way that content is being handled feels a little opaque and not neutral, so there will be an effort to better understand that,” the aide said.

Advance Testimony

Academic experts set to testify offered policy recommendations in written testimony, including some aimed at mitigating the impact of data breaches.

Congress should give the FTC and other consumer protection agencies “more powerful regulatory tools and stronger enforcement authority,” says Laura Moy, Georgetown University Law Center deputy director-Center for Privacy and Technology. “As Congress considers establishing new privacy and data security protections for consumers’ private information, it should not eliminate existing protections.” Responding to Equifax's hack, Moy says Congress “should enhance the authority of federal agencies to oversee the data security practices of consumer reporting agencies, to promulgate rules governing the data security obligations of financial institutions, and to enforce those obligations with civil penalties.” Congress “should also consider giving consumers better tools for redress when their personal information is compromised in a future breach by streamlining the credit freeze process,” she says.

University of Pennsylvania computer science professor Michael Kearns argues “for a privacy framework that comprehensively covers the diverse range of data being used commercially, and applies consistent privacy requirements.” Policymakers should "not overly focus on specific data types or practices” since they're "likely to become obsolete shortly due to the rapidly changing nature of technology,” Kearns says. “A technology-neutral approach can adapt quickly.”

Algorithms “can be made more accountable, respecting rights of fairness and dignity for which generations have fought,” University of Maryland Carey School of Law professor Frank Pasquale says. “The challenge is not technical, but political, and the first step is law that empowers people to see and challenge what algorithms are saying about us.” Yale Law School Information Society Project resident fellow Kate Klonick warns that any revamp of Communications Decency Act Section 230 “should be considered with extreme caution.” She notes “major social media platforms’ self-regulation has met” the statute's goals, including “removing content that users find normatively unpalatable, while keeping up as much content as possible.”

“Straightforward data usage restrictions impose costs on both firms and consumers,” says MIT Sloan School of Management professor Catherine Tucker. “Identifying an economically optimal approach to data protection is hard because it is difficult to measure what consumers actually want regarding privacy. However, my research suggests that giving consumers a sense of control over how their data is used is welfare-enhancing. Congress should recognize that different types of data have very different types of consequences for consumers.”

There's “no one-size-fits-all solution, no new panacea” on data privacy rules, according to University of Chicago Law School professor Omri Ben-Shahar. Cyber break-ins “are not harmful unless the data is used for fraudulent transactions,” Ben-Shahar says. “A legal scheme insuring consumers against such losses may be necessary to the extent that consumers are not already protected or insured.”