Privacy Research Affects FTC Work, Officials Say
New FTC Chief Technologist Lorrie Faith Cranor said there's sometimes a "mismatch" between what policymakers are asking and what researchers are trying to answer. "Stronger dialogues" are needed between the groups "so that our academic research can be more relevant and useful" to government policymakers and corporate decision-makers, she said, speaking Wednesday at a Future of Privacy Forum event on academic privacy research. Hours later, the FTC held an all-day privacy event (see 1601140062 and 1601140029).
FTC Commissioner Julie Brill said privacy research affects her work in different ways. Theoretical work can set the terms of a debate by offering an account of basic privacy issues, while research can confirm or dispel notions, she said. Research can also identify persistent issues in privacy debates, point out sources of the disagreement, and provide recommendations, she said. She said privacy policies, de-identification and encryption were of particular interest to her.
In his paper, Ryan Calo, an assistant professor at the University of Washington School of Law, cited the interdependency between privacy and traditional economics or the market. He said the market needs information to work more efficiently, but is "very hostile" to privacy, which "hides" that information, while privacy supporters are "equally skeptical" of the market because it destroys privacy or turns it into a "commodity."
The opposing sides can help one another, and the conflict has relevance to policymakers in two ways, wrote Calo. By raising privacy as an issue and "placing limits on data promiscuity," the FTC is saving the market from itself, said Calo. The commission "is limiting the amount of information asymmetry that would cause consumers to get very nervous, taken advantage of," he said. "It is, in essence, protecting the market by elevating privacy as a substantive value." He predicted other agencies like "the Consumer Financial Protection Bureau would start to pursue straight up privacy claims as the agency comes to understand what it is that drives consumer vulnerability in the first place."
Another paper said privacy notices need to be more transparent, useful and informative to consumers, especially as new technologies such as wearables and IoT become more prevalent and exacerbate the problem, since it's difficult to display such notices on those devices. Florian Schaub, a postdoctoral fellow at Carnegie Mellon University's computer science school, said such notices are typically written to ensure and assess legal and regulatory compliance. They're too complex, too long, full of jargon and lack meaningful choices, but notices are still important, he said.
"Privacy policies really need to be accompanied by shorter notices that are not hidden away somewhere behind a privacy policy link," said Schaub, who co-wrote the paper with FTC's Cranor, RAND information scientist Rebecca Balebako and Google privacy engineer Adam Durity. Instead, he said, notices should be integrated with and interact with a system, provide information relevant to a user's current activity, and be actionable so users can make real choices. This is already happening, he said, pointing to the permission requests seen on Apple iPhones as a good example. The paper identified real-world examples, best practices, and design dimensions and opportunities.
Privacy law expert Neil Richards said privacy should be viewed in terms of trust, in a paper he co-wrote with Samford University associate professor Woodrow Hartzog. "The importance of trust matters because without trust society begins to unravel, the economy begins to unravel, users share less information with companies, they're skeptical of law enforcement and regulators," Richards said. But most of the discourse on privacy, he said, has been "pessimistic."
To promote trust, companies should protect data and be honest about how it's used; they should be discreet, not sharing data "willy-nilly," and be loyal, acting in the best interests of their consumers, said Richards, a Washington University law professor.
The risk to sensitive data protected by ad hoc de-identification is unknowable, said a paper presented by Princeton University assistant professor Arvind Narayanan, which promoted a "weak version of the precautionary approach." Narayanan wrote it with co-authors Joanna Huey, associate director of Princeton's Center for Information Technology Policy, and Deputy U.S. Chief Technology Officer Edward Felten. Narayanan said de-identification can't be the only tool that companies or organizations use. They should take additional steps such as access control or "provable privacy techniques" to provide "affirmative evidence that your data release can potentially resist adversarial attempts at re-identification," he added.
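The fragility Narayanan describes can be seen in a minimal Python sketch of a linkage attack (the data, names and field choices here are hypothetical, not from the paper): a dataset stripped of names but retaining quasi-identifiers such as ZIP code, birth year and sex can be joined against a public auxiliary dataset, re-identifying individuals.

```python
# Hypothetical "anonymized" records: direct identifiers removed,
# but quasi-identifiers (zip, birth_year, sex) retained.
anonymized = [
    {"zip": "08540", "birth_year": 1975, "sex": "F", "diagnosis": "flu"},
    {"zip": "08540", "birth_year": 1982, "sex": "M", "diagnosis": "asthma"},
    {"zip": "08542", "birth_year": 1975, "sex": "F", "diagnosis": "diabetes"},
]

# Hypothetical public auxiliary data (e.g. a voter roll) with names.
auxiliary = [
    {"name": "Alice", "zip": "08540", "birth_year": 1975, "sex": "F"},
    {"name": "Bob", "zip": "08540", "birth_year": 1982, "sex": "M"},
]

def reidentify(anon_rows, aux_rows):
    """Link records whose quasi-identifiers match uniquely."""
    matches = {}
    for aux in aux_rows:
        hits = [r for r in anon_rows
                if (r["zip"], r["birth_year"], r["sex"]) ==
                   (aux["zip"], aux["birth_year"], aux["sex"])]
        if len(hits) == 1:  # a unique match re-identifies the person
            matches[aux["name"]] = hits[0]["diagnosis"]
    return matches

print(reidentify(anonymized, auxiliary))
# {'Alice': 'flu', 'Bob': 'asthma'}
```

Because the attack depends only on what auxiliary data an adversary happens to hold, the residual risk is hard to bound in advance, which is why the paper argues for supplements such as access controls or provable techniques.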
Georgia Institute of Technology professor Peter Swire disputed the FBI's alarms about criminals and terrorists "going dark" through encrypted communications. Instead, he said, "we're really in a golden age of surveillance." People carry "tracking devices," typically smartphones, which allow police to know individuals' locations from moment to moment, said Swire. Police also know a person's "confederates and co-conspirators" through "highly useful" metadata, and then there are numerous databases available, he said. "The idea that there's nothing to see about suspects when we identify them is just bizarre and counterfactual."