Understanding the Data Protection White Paper Part VII: The ubiquitous use of biometric data raises new issues when it comes to processing this data

This article is Part 7 of a multi-part series explaining the recently issued white paper on data protection in India. The responses to the white paper will help in the formulation of India’s future data protection laws.

Part 1 of this series discussed the definition of ‘sensitive personal data’ or SPD. Another key issue in this regard is identifying the protections applicable to it. Internationally, these take the form of heightened protections, such as the requirement of express consent. The increasingly ubiquitous use and availability of biometric data, however, raise a completely new set of issues. The EU’s General Data Protection Regulation (GDPR) has acknowledged this by including, for the first time, rules specific to the use of biometric data. This is an issue the White Paper needs to examine in more detail while framing rules on SPD.

Increasing use of biometric data

The use of biometric data for various purposes has become increasingly popular. The most common is authentication, be it fingerprints in Aadhaar for eKYC or for UPI, or facial recognition through Apple’s FaceID. Facebook’s photo tagging is another, less conspicuous, collection and use of facial recognition technology. Facial recognition is also being used for surveillance, such as identifying people through CCTV camera footage.

New forms of biometric data

Traditionally, SPD includes biometric data, defined under the IT (Sensitive Personal Data) Rules, 2011 in terms of technologies that measure and analyse human body characteristics. Iris scans, face maps and fingerprints are the obvious inclusions. However, advances in technology are enabling the exploration and analysis of a number of new forms of biometric data. These include, for example, behavioural traits like voice, gait and even handwriting (including signatures).

They also include new physiological characteristics like the outline of a hand, body odour, and even ear canal shapes, used for headphone-based authentication. Reports have also emerged of the heartbeat being used as a password, raising questions about wearable technologies like fitness devices. An interesting new form of biometric data being explored is the ‘fashion fingerprint’, where people can be recognised based on their clothes.

Biometric data in the public domain

The rapidly expanding scope of this definition raises questions about the widespread availability and subsequent collection of biometric data. For example, has a person who gives someone a handwritten letter voluntarily disclosed biometric data? Has a person who accidentally captures a third person while taking a photograph collected that person’s biometric data without consent? In effect, every photo put up on the internet today amounts to placing sensitive biometric data in the public domain.

Under most laws, processing information that an individual has voluntarily put into the public domain does not need separate consent. This calls into question whether bodies like the government have the right to collect and identify biometric data from such material simply because it is in the public domain. The implications for facial recognition on CCTV footage must also be considered, a practice that has been discouraged in some jurisdictions because of the invasion of privacy involved.

It is true that biometric data is unique to an individual, but it remains data which, once disclosed, cannot be replaced with something new. Technically, converting such publicly available data into biometric identifiers without meeting the other requirements of data protection law, such as consent, would be illegal. However, there is nothing to stop unscrupulous actors from taking advantage of this availability of the data.

Is its use for authentication secure?

For that matter, how secure really is the use of facial recognition for authentication? Previously, biometric authentication was commonly used in places requiring higher levels of security, such as nuclear facilities and bank vaults. Its use has now become far more widespread, whether for payment authentication or in Apple’s FaceID. Biometric data is also commonly used in the workplace as a method of authentication and identification.

Thus, the use of biometric data for authentication results in the widespread availability of authentication data in the public domain, which in turn leads to risks like hacking and identity theft. Fingerprints, for example, are left everywhere in the physical world. Their availability in photographs also makes them vulnerable, as shown by the hacker who faked a German minister’s fingerprint using a photograph of her hands.

Moreover, once the data is compromised, people are prevented from reusing it, given the security risk. For example, an individual whose face map details are stolen cannot safely use a technology like Mastercard’s ‘selfie pay’, which uses face data from selfies to authenticate payments.

A villager goes through the process of a fingerprint scanner for the Unique Identification (UID) database system at an enrolment centre. Image: Reuters

‘Anonymising’ biometric data

Another issue arises with the prescription of anonymisation as a solution. For example, the use of SPD for research purposes can be permitted if the data is anonymised. But if anonymised data in general can be de-anonymised with relative ease, this becomes that much easier with biometric data, given its inherent identifiability. Anonymisation of biometric data is thus possible only to a limited extent, such as by separating the data itself, say the DNA, from other information like the name and address. This makes other safeguards, like obtaining prior consent before the data is used for research, all the more important.
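The kind of separation described above is essentially pseudonymisation: the biometric record and the directly identifying fields are held in separate stores, linked only by a random token, so that the research store alone cannot name anyone. A minimal Python sketch of the idea (the `pseudonymise` function and field names are illustrative assumptions, not anything prescribed by the White Paper or the Rules):

```python
import secrets

def pseudonymise(records):
    """Split each record into two stores: direct identifiers (name, address)
    and biometric data (here, DNA), linked only by a random token."""
    identifiers, biometrics = {}, {}
    for record in records:
        token = secrets.token_hex(16)  # random link; reveals nothing on its own
        identifiers[token] = {"name": record["name"], "address": record["address"]}
        biometrics[token] = {"dna": record["dna"]}
    return identifiers, biometrics

# Researchers would receive only the biometrics store; re-identification
# requires access to the separately held identifiers store.
```

As the article notes, this protection is limited: the biometric data itself remains inherently identifying, so separating it from names and addresses reduces, but does not eliminate, the re-identification risk.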

Biometric databanks

Biometric databanks, be it Aadhaar’s Central Identities Data Repository (CIDR) holding fingerprints, or the proposed DNA databanks holding DNA, are new developments. Such databanks necessitate new rules for the collection and storage of this highly sensitive data.

Key questions raised in the White Paper

The White Paper is focussed on the identification of SPD and contemplates heightened protections for it. At present, however, it does not focus specifically on biometric data and the particular risks associated with it. Given the number of risks arising from biometric data, the issue deserves specific attention.

The White Paper has presently sought comments on the following key questions with respect to the processing of SPD:

  • How should processing of SPD be done?
  • What should be included as categories of SPD given India’s specific socio-economic requirements?
  • What additional safeguards are required to prevent the unlawful processing of SPD?
  • Should the law allow sector-specific protections for sensitive data like medical and health information, or financial information?
  • Any other views?

Part I of the series explores the definitions of personal data and sensitive personal data, Part II of the series examines the jurisdiction and territorial scope of data protection laws, Part III of the series explores cross-border data flows and data localisation, Part IV deals with exemptions to data protection law, Part V deals with notice and consent, and Part VI deals with the big data challenge to privacy principles.
