The European Commission’s proposed digital “right to be forgotten” won’t be enforceable by technology alone and must rely on a mix of technical and international legal provisions, the European Network and Information Security Agency (ENISA) said in a report published…
Tuesday. The EC raised the idea in a January proposal for updating EU data protection rules, saying Internet users should be allowed to ask for digitally held personal information to be deleted. ENISA’s report (http://bit.ly/Sbqc6c) focused on how to achieve forgetfulness in information systems.

One problem is that the EC doesn’t precisely define what constitutes personal data, who can request their deletion and how deletion can acceptably take place, ENISA said. Personal data are broadly defined as information that, alone or in combination with other available data, can be linked to a uniquely identified natural person, but it’s unclear whether that includes information that can be used to identify someone with high probability, or that narrows someone down to a small set of individuals such as a family, it said. Another question is how aggregated forms of data might be affected when some of the raw data from which statistics are derived are erased, it said. Developing technical ways to ensure the right to be forgotten requires an exact definition of the data and the circumstances to which the right should apply, whereas EU regulations and laws tend to be deliberately broad and general, it said.

The question of who has the right to demand the deletion of data is also unclear, it said. For example, if Bob incorporates part of a tweet he received from Alice into a longer blog post of his own, and Alice then exercises her right to remove her tweet, must Bob take down his entire blog post, or remove the tweet and rewrite the post? A related question is how to balance the right to be forgotten against the public interest in accountability, journalism, history and scientific inquiry, ENISA said.

Another question is what constitutes “forgetting.” A strict interpretation would mean that all copies of the data would be erased and removed from any derived or aggregated representations.
A slightly weaker and possibly more practical approach would allow encrypted copies of the information to survive as long as they can’t be deciphered by any unauthorized parties. An even weaker interpretation would allow clear-text copies of the data to survive as long as they no longer appear in public indices, database query results or search engine listings, it said.

The main technical challenges surrounding the right to be forgotten are how to allow a person to identify and locate personal data stored about her; how to track all copies of an item and all copies of information derived from it; how to determine whether someone has the right to seek removal of a data item; and how to erase all exact and derived copies, ENISA said. In a completely open system like the Web, it’s generally not possible for anyone to find all personal data items stored about her, or to determine whether she has the right to demand their removal, and no single person has the authority to effect deletion of all copies, it said. “Therefore, enforcing the right to be forgotten is impossible in an open, global system, in general,” it said. It could be technically feasible in closed systems such as corporate networks, but that would mean that users and providers would be “strongly authenticated using a form of electronic identity that can be linked to natural persons,” it said. But regardless of the type of information system, unauthorized copying of information by human observers is ultimately impossible to prevent by technical means, it said. Nor can digital duplication be stopped in open networks.

There’s ongoing research on encrypting personal data with an expiration date; securing privacy-sensitive content posted on social networks by storing it on external, trusted servers; and developing owner-centric architectures that establish dedicated storage locations for all data, to which only the legitimate data owner has access, the report said.
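The expiration-date idea maps onto the “weaker interpretation” of forgetting described above: ciphertext copies may survive anywhere, but destroying the decryption key renders them all unreadable at once. The following Python sketch is illustrative only, not drawn from the ENISA report; the class name, the SHA-256-based keystream and the TTL parameter are all assumptions for demonstration, not a production cipher.

```python
import hashlib
import os
import time


def _keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key via SHA-256 in
    # counter mode. Illustrative only -- not a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def _xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


class EphemeralRecord:
    """Personal data encrypted under a key destroyed at expiry.

    After forget(), any surviving ciphertext copies can no longer
    be deciphered -- the weaker notion of "forgetting" above.
    """

    def __init__(self, plaintext: bytes, ttl_seconds: float):
        self._key = os.urandom(32)          # hypothetical per-record key
        self.expires_at = time.time() + ttl_seconds
        self.ciphertext = _xor_crypt(self._key, plaintext)

    def read(self) -> bytes:
        if self._key is None or time.time() > self.expires_at:
            self.forget()
            raise ValueError("data has been forgotten")
        return _xor_crypt(self._key, self.ciphertext)

    def forget(self) -> None:
        # Destroying the key "forgets" every copy of the ciphertext.
        self._key = None
```

Note that this only defeats readers of the ciphertext; as the report observes, anyone who decrypted and re-copied the clear text before expiry is beyond technical reach.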
But these approaches offer only a partial technical solution, it said. Among other things, ENISA recommended that the EC consider the possible “pragmatic” approach of ordering search engine operators and sharing services within the EU to filter references to forgotten information stored inside and outside the EU region.
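In code terms, the “pragmatic” filtering approach amounts to suppressing references rather than deleting data. The minimal Python sketch below is a hypothetical illustration (the URL set and result format are invented, not from the report): the underlying pages persist, but registered items vanish from listings, matching the weakest interpretation of forgetting.

```python
# Hypothetical registry of URLs subject to a removal request.
FORGOTTEN_URLS = {"https://example.com/old-post"}


def filter_results(results: list[dict]) -> list[dict]:
    """Drop search results whose URL is on the forgotten list.

    The pages themselves may still exist; only the references
    disappear from the operator's listings.
    """
    return [r for r in results if r["url"] not in FORGOTTEN_URLS]
```

A real deployment would face exactly the questions the report raises: who maintains the registry, how derived copies are matched, and whether filtering references stored outside the EU is enforceable at all.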