Privacy Advocates Laud FTC's Big Data Guidance, but One Questions Impact
Privacy advocates said the FTC big data report is significant because it focuses on ensuring that poor and underserved communities aren't discriminated against through the use of automated, data-driven predictive technologies even as those technologies are supposed to help them improve their lives (see 1601060042). The report was right to outline possible harms to such communities with the growing adoption of big data analytics, they said, despite criticism from some industry allies that potential misuses were hypothetical and overstated. But at least one advocate said she was unsure of the report's larger impact on companies that use big data analytics in this area, though most said they believed companies want -- and try -- to do the right thing.
The FTC clearly said it would enforce any violation through existing laws such as the Fair Credit Reporting Act, Equal Credit Opportunity Act and Section 5 of the FTC Act. But the privacy advocates and others interviewed Monday and Tuesday said regulators want companies to think more broadly and conscientiously about how they use big data to help such communities, and the report cautions them about the potential risks. "It's an early laying out of the issues and use of the bully pulpit to talk to people in doing the right thing," said John Simpson, Consumer Watchdog's privacy project director. "In the background, it says if you don't do the right thing we're going after you."
Ali Lange, Center for Democracy and Technology policy analyst, said the report is effectively a policy primer on what is a "new issue in the world for a lot of people." She said the report makes people "proactively think and ask critical questions" -- some philosophical and abstract -- so the technology "is pushed toward the side of the angels." She said the FTC is warning companies that the technology creates a "huge asymmetry" in information that could contain bias and result in disparate impact on, and treatment of, such communities. While there's some new, interesting information in the report, she said, "I also feel that ... a lot of people for whom this will be relevant will not find novel insight."
"It’s a genuinely exciting report," said Seeta Peña Gangadharan, program fellow with New America's Open Technology Institute. She said last week's report, which is mainly based on a September 2014 big data workshop, a seminar and research, signals the continuity and concerns raised by the White House's big data report released in May 2014. That report (see 1405020034) said big data analytics could potentially "eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace."
Gangadharan said the FTC report's four questions are important because they carry on the White House's discussion about how exclusion and unfairness could happen and what can be done about it. "I think there was some ... small tremors that reverberated around various sectors that there are these potential side effects that we haven’t thought of," she said, "and the report from the FTC develops a set of questions that begins to get their heads around" such concerns. But she said the "discovery of harms on a particular population is tricky and it takes time," which means more research is needed on the consumer experience.
"Profiling is obviously a concern," said Marc Rotenberg, president of the Electronic Privacy Information Center, on the report's significance and impact. "That is why EPIC has recommended 'algorithmic transparency.' But the FTC has been reluctant to impose any meaningful transparency obligations." He said an "even bigger problem" with big data is that companies just can't shield the information they collect, which means consumers could suffer identity theft and financial fraud.
Although big data could be used in discriminatory and unfair ways toward underserved consumers, privacy lawyer Christopher Wolf of Hogan Lovells was heartened "to see that the concerns about the misuse of Big Data largely are hypothetical, which is to say that there are exceedingly few real life examples of Big Data being misused." The "vast majority of companies using Big Data understand the obligations to be fair, both to avoid legal issues but also to behave ethically and to maintain their reputations in the marketplace," he emailed.
Wolf, who is also founder and board president of the Future of Privacy Forum, one of the groups that said the FTC report overstated some hypothetical harms, wrote that society gains many benefits in healthcare, urban planning, education, safety and security that "outweigh the costs -- the potential costs -- of Big Data." The FTC is "right to be on guard against such harm, but as the saying goes, it should not 'throw the baby out with the bathwater,'" he added.
Still, Lange said the report also provided "hypothetical benefits that don't exist. They're not wrong to worry about concerns and harms." While there are some bad actors, Gangadharan said, most companies genuinely want to serve consumers and they don't want to be seen as "unintentionally racist and unintentionally elitist."
"This is not a reprimand on companies for not doing their jobs," said Nicol Turner-Lee, vice president of the Multicultural Media, Telecom and Internet Council, who participated in the FTC September 2014 big data workshop that the report is partly based on. "I don’t see anywhere in the report -- outside of third-party data brokers -- the companies are being shamed for not contributing to a healthy ecosystem. This is reminder that these harms can and will exist."
The report is an extension of the FTC's "privacy by design" principle, Turner-Lee said. "It is good public policy to ensure that all consumers are protected even in hypothetical cases [in which] the data can be used to perpetuate both stereotypes and discriminatory practices." Just because there haven't been any "highly detrimental impacts on vulnerable groups yet" doesn't mean there won't be, she said, calling the report a reminder to companies to keep such issues "front and center. This is one of those cases to better be ahead of the curve in terms of mitigating some of the risks and implement best practices before the harm is actually done."
While the report contains some "useful reminders" on best practices, Wolf disagreed with its "undue emphasis on data minimization, since Big Data by its nature relies on discoveries from unexpected correlations from data, and broad-scale data minimization could prevent such discoveries. There are ways to strip out personal data or to anonymize retained data that should satisfy the FTC’s concerns and allow for optimal Big Data usage" (see 1601070044).