Research Safe Harbor?

Lawmakers Eye Data Access Rules to Aid Misinfo Research

House Science Oversight Subcommittee leaders debated Tuesday whether to institute mandates or offer legal protections to social media platforms in a bid to increase the amount of non-identifiable user data researchers can access as they examine ways to address the online spread of misinformation. Members disagreed over whether giving consumers more direct ownership of the data platforms collect would help. Data ownership, access and portability have been among the thorny issues lawmakers are eyeing in privacy legislation (see 2108170073).

“Social media offers fertile ground for these falsehoods,” said Science Committee Chair Eddie Bernice Johnson, D-Texas. “Navigating the difficulties in extending access to data will not be easy, but failing to do so will have devastating consequences.” She cited the spread of misinformation about COVID-19 vaccines as “drowning out expert voices,” which has left many social media users “dissuaded from getting the shot.”

There are “limitations to full data transparency by social media companies,” including platforms’ argument that “some information should be protected as trade secrets” and users’ entitlement to “privacy, particularly of personally identifiable information,” said Oversight Chairman Bill Foster, D-Ill. “These concerns cannot be broad excuses to shield social media companies from a full outside accounting of how their platforms may be endangering public health and safety.”

Foster backs setting digital ad data portability standards, but “we’re going to have to scale the mandates” so requirements aren’t so “burdensome” for smaller companies that they “squeeze everyone but the big players out.” One way to make the rules “less of a burden” is to let smaller players build software around portability and access standards, he said. Reps. Sean Casten, D-Ill., and Ed Perlmutter, D-Colo., also backed portability.

Science Oversight ranking member Jay Obernolte, R-Calif., said he is “very skeptical” that instituting portability requirements will “allow us to solve” larger misinformation problems. “The problem here is not data,” he said. It’s that social media companies have built their business around a lucrative model “that is based on consumer engagement.” There’s “such a strong commercial incentive” that “no matter what we do, it’s going to be an uphill battle,” Obernolte said.

Platforms need “a seat at the table” to effectively combat misinformation, Obernolte said. “We cannot expect them to go it alone, and we should likewise not expect to stop the spread of harmful misinformation without them.” Obernolte said a proposed “legal safe harbor” to shield researchers using non-identifiable user data from social media should also apply to the platforms themselves.

New York University’s Cybersecurity for Democracy co-Director Laura Edelson believes a researcher safe harbor would protect those engaged in “direct” research that could help stem misinformation’s spread. The safe harbor, which Columbia University’s Knight First Amendment Institute also backs, would give researchers cover for work that handles data “responsibly” and within “ethical boundaries,” Edelson said.

Northeastern University's Khoury College of Computer Sciences interim Dean Alan Mislove backed the Social Media Disclosure and Transparency of Advertisements Act (HR-3451) and the Algorithmic Justice and Online Platform Transparency Act (HR-3611/S-1896). He called them ways to “take meaningful steps towards ensuring researchers continue to have sufficient access.” Edelson and the witness from the University of Illinois Urbana-Champaign also backed varying levels of data transparency.

All three witnesses said the amount of data platforms provide the public is inadequate. Mislove noted his research team ended up spending more than $25,000 on Facebook advertisements to access more detailed algorithmic data than the “coarse-grain” publicly available statistics. The public information covers only active ads, while advertisers can access information on past campaigns and get “finer-grain data,” including audience demographic makeup, Mislove said. It's “odd that researchers are in the position of having to pay the subject of their study in order to gain sufficient access to crucial data,” Foster said.