Facial Recognition: Safety and Privacy Issues


The most critical issues surrounding facial recognition technologies concern the processing of the personal data involved, especially biometric data, i.e., data that allow a person to be uniquely identified and therefore fall within the special categories of data, often described as “super-sensitive,” under the GDPR.

In general, under Article 9 of the GDPR, processing such data is prohibited except under certain specific conditions. Normally, for the processing to be lawful it must be necessary for purposes in the public interest, and European Union or national law must permit it, specifying the purposes pursued and the safeguards protecting the fundamental rights of data subjects. Processing may also take place on the basis of consent, but the requirements of necessity and proportionality must always be met, all the more so when facial recognition technologies are employed.

Risks inherent in the use of facial recognition technologies range from mass surveillance to identity theft to discrimination on ethnic and social grounds.

The use of such technologies for public safety and law enforcement purposes is not exempt from the protective principles laid down in data protection legislation. The Italian Personal Data Protection Authority made this point recently (in February of this year) in response to one Italian municipality’s plan to deploy a facial recognition system for policing purposes.

Relying on EU Directive 2016/680 (the “Police Directive”), implemented in Italy in 2018, which essentially extends the principles of the GDPR to processing carried out for law enforcement purposes, the Authority stated that the system could not be deployed unless a national law existed that authorized it and set out the limitations and conditions for its use while protecting individual rights.

In this case, the Municipality of Como had developed an innovative video surveillance system: it intended to install cameras in a public park to perform facial recognition on passersby, as well as to issue automatic alerts regarding loitering, abandoned or removed objects, and tripwires (i.e., trespassing in off-limits areas).

The Authority explained that the technology used by the municipality amounted to an actual facial recognition system, going beyond mere video surveillance. It further found that the legal basis cited by the municipality for the related personal data processing was not adequate to justify use of the system.

Specifically, in the Data Protection Impact Assessment it conducted, the Municipality of Como cited Law No. 38 of April 23, 2009, Art. 6, pars. 7 and 8 (on municipalities’ right to use video surveillance systems in public places) and EU Directive 2016/680 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection, or prosecution of criminal offenses and the execution of criminal penalties (the Police Directive mentioned above), implemented nationally via Legislative Decree No. 51/2018.

While the Authority correctly found that Legislative Decree No. 51/2018 was applicable to the case in question, it also specified that the decree clearly provides that the processing of special categories of personal data, as per Article 9 of Regulation (EU) 2016/679 and therefore including biometric data, “is authorized only when strictly necessary and accompanied by sufficient guarantees protecting the rights and freedoms of the data subject,” and further, that such processing must be specifically provided for under EU or domestic law or regulations. As of yet, the Authority continued, no such regulatory measures exist; there are only the general indications in Article 5, par. 2 of Legislative Decree 51/2018 on the possibility of establishing, by presidential decree, the time limits, storage arrangements, authorized subjects, and other matters relevant to the processing of personal data (including biometric data).

For this reason, at the end of its inquiry, the Authority held that the Municipality of Como is not authorized to process biometric data in the manner provided by the system and therefore ordered it to bring the processing into compliance by limiting the system’s use to ordinary video surveillance alone and solely for the purposes permitted by law.

The issue of facial recognition and its inherent risks has been debated at length, particularly in recent weeks, as several Big Tech players (specifically IBM, Microsoft, Google, and Amazon) publicly announced that they are putting the brakes on implementing and using facial recognition. Amazon announced a one-year moratorium; IBM stated that it opposes the use of such technologies for mass surveillance and racial profiling; Microsoft announced that it will not provide such technologies to police forces until regulations governing their use are in place; and Google discussed the possibility of banning their use, at least for the moment.

The European Commission, too, in its White Paper on Artificial Intelligence of February 19 of this year, examined the use of biometric technology and called for a Europe-wide debate to identify the specific circumstances that could justify the use of facial recognition technologies.

As the recent flurry of position-taking by players in the technology and artificial intelligence sector demonstrates, such a debate must be conducted systematically and in a coordinated manner in order to ward off abusive uses and the progressive erosion of human dignity and fundamental rights.
