The General Data Protection Regulation (GDPR) addresses, for the first time in the European Union, Data Protection by Design as a legal obligation for data controllers and processors, making explicit reference to data minimisation and the possible use of pseudonymisation. On top of this, it introduces the obligation of Data Protection by Default, going a step further towards the adoption of privacy-friendly default settings. These principles are codified in Art. 25, through which the European legislator has acknowledged that privacy cannot be ensured by legislative measures alone: it has to be concretely ensured by organisations implementing appropriate technical and organisational measures so that data protection principles such as data minimisation are met.
It is not the first time that the European Union has recognised that legislative measures alone are not enough.
Already in 2014, the European Union Agency for Network and Information Security (ENISA) acknowledged in its report Privacy and Data Protection by Design – from policy to engineering that “Privacy by Design is a multifaceted concept: in legal documents on one hand, it is generally described in very broad terms as a general principle; by computer scientists and engineers on the other hand it is often equated with the use of specific privacy-enhancing technologies (PETs). However, privacy by design is neither a collection of mere general principles nor can it be reduced to the implementation of PETs. In fact, it is a process involving various technological and organisational components, which implement privacy and data protection principles. These principles and requirements are often derived from law, even though they are often underspecified in the legal sources”.
From an engineering standpoint, in the 1990s Ann Cavoukian, Ontario’s Information & Privacy Commissioner, defined Privacy by Design as “the philosophy of embedding privacy proactively into technology itself – making it the default”, arguing that privacy should be taken into account throughout the entire engineering process, from the earliest design stages to the operation of the production system.
The proposal was structured around seven ‘foundational principles’:
- PbD anticipates and prevents privacy-invasive events before they happen (Proactive not Reactive/Preventative not Remedial).
- No action is required on the part of the individual to protect their privacy — it is built into the system, by default (Privacy as the Default Setting).
- Privacy is integral to the system, without diminishing functionality (Privacy Embedded into Design).
- PbD avoids the pretence of false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both (Full Functionality/Positive-Sum, not Zero-Sum).
- PbD ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion (End-to-End Security/Full Lifecycle Protection).
- PbD seeks to assure all stakeholders that whatever the business practice or technology involved, it is in fact, operating according to the stated promises and objectives, subject to independent verification (Visibility and Transparency/Keep it Open).
- Above all, PbD requires system architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options (Respect for User Privacy/Keep it User-Centric).
Nowadays, despite the significant steps taken in Europe and worldwide to address privacy concerns, questions such as how to translate ethical and legal principles into legal and technological solutions that work in practice remain part of a vast multidisciplinary dialogue between the law and engineering communities, which face the issue of how best to strike the right balance between, for instance, security and privacy, or automation and respect for human autonomy.
This article aims to provide a modest contribution towards narrowing the gap between law and computer science experts by illustrating, with an engineering approach, some of the legal principles embedded in the GDPR, related in particular to data protection.
The article is grounded in several years of participation in research projects, and in particular in the most recent analysis carried out within DEFENDER – Defending the European Energy Infrastructures, a research project co-funded by the European Commission in which a multidisciplinary team of law, energy-domain and engineering experts interacts to build a privacy- and security-by-design platform for exchanging information on incidents and countermeasures affecting critical energy infrastructures at the European level, sharing cyber-security and proactive/preventive countermeasures to mitigate physical and cyber risks.
A multidisciplinary approach to Data Protection by Design
The GDPR introduces a set of principles and constraints governing how personal data shall be collected, stored, processed, retained and, finally, shared. From those principles, a range of requirements, constraints and controls can be derived as specifications for a software system managing personal data.
Individual rights: the reform introduced by the GDPR is designed to provide EU citizens with more control over their own personal data. Individuals shall be able to effectively and conveniently exercise their rights to access, rectify, block and erase their personal data. They shall have easier access to their data, enabling them to review what data is stored about them, how it is processed and who it is shared with, along with the ability to migrate that data between service providers without restriction.
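To make this concrete, the following minimal Python sketch shows how the rights of access, rectification, erasure and data portability might translate into operations of a system managing personal data; UserStore and its methods are illustrative assumptions, not a real API:

```python
# A minimal sketch of data-subject-rights operations over a
# hypothetical in-memory store; all names are illustrative.
from dataclasses import dataclass, field
import json


@dataclass
class UserStore:
    records: dict = field(default_factory=dict)  # subject_id -> personal data

    def export_record(self, subject_id: str) -> str:
        """Right of access / portability: return the data in a
        structured, machine-readable format (here: JSON)."""
        return json.dumps(self.records[subject_id])

    def rectify(self, subject_id: str, updates: dict) -> None:
        """Right to rectification: correct inaccurate fields."""
        self.records[subject_id].update(updates)

    def erase(self, subject_id: str) -> None:
        """Right to erasure: remove the subject's data entirely."""
        del self.records[subject_id]


store = UserStore({"u1": {"name": "Ada", "email": "ada@example.org"}})
print(store.export_record("u1"))        # access / portability
store.rectify("u1", {"email": "ada@example.com"})
store.erase("u1")                       # erasure ("right to be forgotten")
```

Exporting in a structured, machine-readable format such as JSON is what makes the portability right practical in the first place.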
Lawfulness, fairness and transparency: this principle implies having legitimate grounds for collecting and using the personal data, including allowing individuals to provide explicit and unambiguous consent to data collection – “consent by default” is no longer valid – as well as not using the data in ways that have unjustified adverse effects on the individuals concerned.
The organisation seeking consent shall also provide clear information on how that data will be used, for how long it will be retained, and how it will be shared with third parties. Individuals can retract consent at any time, without prejudice. Additional permissions have to be requested from the individual if the data is to be used for processing purposes beyond the original consent.
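The sketch below shows one way consent might be recorded so that it is explicit, purpose-bound and retractable at any time; ConsentRegister and the record layout are illustrative assumptions:

```python
# A sketch of purpose-bound consent records supporting withdrawal;
# all names and the keying scheme are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Consent:
    purpose: str                          # the specific purpose consented to
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None


@dataclass
class ConsentRegister:
    consents: dict = field(default_factory=dict)  # (subject, purpose) -> Consent

    def grant(self, subject: str, purpose: str) -> None:
        self.consents[(subject, purpose)] = Consent(purpose, datetime.now(timezone.utc))

    def withdraw(self, subject: str, purpose: str) -> None:
        # Withdrawal must be possible at any time, without prejudice.
        self.consents[(subject, purpose)].withdrawn_at = datetime.now(timezone.utc)

    def is_permitted(self, subject: str, purpose: str) -> bool:
        c = self.consents.get((subject, purpose))
        return c is not None and c.active


reg = ConsentRegister()
reg.grant("u1", "newsletter")
assert reg.is_permitted("u1", "newsletter")
assert not reg.is_permitted("u1", "profiling")   # new purpose needs new consent
reg.withdraw("u1", "newsletter")
assert not reg.is_permitted("u1", "newsletter")
```

Keying consents by (subject, purpose) makes “additional permissions for new purposes” a structural property of the system rather than a policy one has to remember.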
Purpose limitation: the collection of personal data shall be limited to what is directly relevant and necessary to accomplish the system’s specified purpose. The purpose shall be legitimate, and it shall be specified and made explicit before personal data is collected.
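A minimal purpose-limitation guard could refuse any processing whose purpose was not declared before collection; DECLARED_PURPOSES and process are hypothetical names used only for illustration:

```python
# A minimal purpose-limitation guard, assuming purposes are declared
# up front; the purpose set and function are illustrative.
DECLARED_PURPOSES = {"billing", "service-improvement"}  # made explicit before collection


def process(data: dict, purpose: str):
    if purpose not in DECLARED_PURPOSES:
        raise PermissionError(f"processing for undeclared purpose: {purpose!r}")
    ...  # proceed with the legitimate, declared purpose


process({"meter_reading": 42.1}, "billing")        # allowed
# process({"meter_reading": 42.1}, "marketing")    # raises PermissionError
```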
Data minimisation: only the minimum amount of personal data needed to achieve the specific system purpose shall be collected, used and retained. Moreover, the data shall be adequate and relevant in relation to the objectives of the system.
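Minimisation can be enforced at ingestion time by whitelisting, per declared purpose, the fields that may be stored at all; the purpose-to-fields map below is an illustrative assumption:

```python
# A sketch of field-level minimisation: only attributes needed for
# the declared purpose survive ingestion; the map is illustrative.
FIELDS_PER_PURPOSE = {
    "billing": {"customer_id", "meter_reading", "tariff"},
}


def minimise(raw: dict, purpose: str) -> dict:
    allowed = FIELDS_PER_PURPOSE[purpose]
    return {k: v for k, v in raw.items() if k in allowed}


raw = {"customer_id": "c7", "meter_reading": 42.1, "tariff": "T2",
       "birth_date": "1980-01-01"}          # not needed for billing
print(minimise(raw, "billing"))             # birth_date is dropped
```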
Accuracy: data shall have the right value, shall precisely represent that value in a consistent form, and shall be kept up to date. Procedures to keep data up to date should be implemented.
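One simple way to support such procedures is to attach a verification timestamp to each record and flag stale entries for re-confirmation by the data subject; the 365-day window below is an arbitrary, illustrative choice:

```python
# A sketch of an up-to-dateness check: records carry a verification
# timestamp and are flagged once it grows stale; window is illustrative.
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=365)


def needs_reverification(last_verified: datetime) -> bool:
    return datetime.now(timezone.utc) - last_verified > MAX_AGE


record = {"email": "ada@example.com",
          "last_verified": datetime(2020, 1, 1, tzinfo=timezone.utc)}
if needs_reverification(record["last_verified"]):
    print("ask the data subject to confirm or rectify the record")
```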
Storage limitation: personal data shall be retained only for as long as necessary to fulfil the purpose declared in the system objectives. Procedures for erasing or effectively anonymising personal data as soon as it is no longer needed for the given purpose shall be implemented.
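A periodic retention sweep is one way to implement this; the retention periods and record layout below are illustrative assumptions:

```python
# A sketch of a retention sweep: records whose purpose-specific
# retention period has expired are erased (or anonymised).
from datetime import datetime, timedelta, timezone

RETENTION = {"billing": timedelta(days=3650), "support": timedelta(days=90)}


def sweep(records: list) -> list:
    now = datetime.now(timezone.utc)
    kept = []
    for r in records:
        if now - r["collected_at"] <= RETENTION[r["purpose"]]:
            kept.append(r)   # still within the declared retention period
        # else: erase, or replace with an effectively anonymised record
    return kept


records = [{"purpose": "support", "email": "ada@example.com",
            "collected_at": datetime(2020, 1, 1, tzinfo=timezone.utc)}]
print(sweep(records))        # [] - the support record has expired
```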
Integrity and confidentiality: appropriate measures for data security shall be implemented; information security addresses integrity, confidentiality and availability concerns. These are important also from a privacy and data protection perspective, requiring rules that limit access to personal data to authorised people only, and that ensure the data is trustworthy and accurate. Data should therefore be kept secure by applying Privacy-Enhancing Technologies (PETs), preventing accidental disclosure of personal data and securing communications with external stakeholders (for instance, external systems). Examples of PETs are encryption, protocols for anonymous communications, attribute-based credentials and private search of databases.
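As a concrete PET example, the sketch below pseudonymises an identifier with a keyed hash (HMAC-SHA256), so that re-identification is possible only for whoever holds the secret key, which the GDPR requires to be kept separately from the data; key handling here is deliberately simplified:

```python
# A minimal pseudonymisation sketch using a keyed hash (HMAC-SHA256);
# in practice the key would live in a separately managed key vault.
import hashlib
import hmac

SECRET_KEY = b"store-me-separately-in-a-key-vault"   # illustrative only


def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()


event = {"customer_id": pseudonymise("c7"), "meter_reading": 42.1}
print(event)   # the raw customer id never leaves the trusted boundary
```

Unlike a plain hash, the keyed variant cannot be reversed by simply hashing candidate identifiers, which is why pseudonymisation keys must be stored and governed separately.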
Accountability: examples of accountability measures include the tracking of personal data access and of communications with external systems, in order to be able to demonstrate compliance with privacy and data protection principles and legal requirements. From an organisational standpoint, clear responsibilities and internal audits shall be defined prior to the processing of personal data. A further means of demonstrating compliance is the Data Protection Impact Assessment (DPIA), which shall be carried out to understand the exposure to personal data risks. It should be noted that, at the national level, accountability is supported by independent Data Protection Authorities acting as supervisory bodies for monitoring and checking.
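An append-only access log is one simple accountability measure: every access to personal data is recorded with actor, subject, purpose and timestamp, so that compliance can later be demonstrated to auditors or a supervisory authority. The log format below is an illustrative assumption:

```python
# A sketch of an append-only access log for accountability;
# the entry layout and file-based storage are illustrative.
import json
from datetime import datetime, timezone


def log_access(log_path: str, actor: str, subject: str, purpose: str) -> None:
    entry = {"ts": datetime.now(timezone.utc).isoformat(),
             "actor": actor, "subject": subject, "purpose": purpose}
    with open(log_path, "a") as fh:           # append-only by convention
        fh.write(json.dumps(entry) + "\n")


log_access("access.log", actor="support-agent-12", subject="u1",
           purpose="handle-ticket-4711")
```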
Conclusions
Laid down as principles in the Charter of Fundamental Rights and the Treaty on the Functioning of the European Union, privacy and data protection are fundamental rights which need to be protected at all times. Currently, privacy and data protection by design are regarded by engineers and computer scientists as an obligation, while lawyers seem to miss the opportunity of taking advantage of a dialogue with computer scientists to rethink the privacy concept in relation to some digital contexts, such as the Internet of Things. Closer cooperation between engineers and lawyers will definitely aid in protecting those human rights for the benefit of the end-user community.