Facial recognition tech is at a tipping point

For society to derive the most benefit from AI, technology must be a means of serving human needs, not the other way around

George Orwell, in his dystopian masterpiece 1984, envisioned a society that had progressed technologically but in which the individual, constantly supervised and with little privacy, had become a cog in the wheel of a larger, albeit advanced, machine. In his novel, Orwell introduced a device known as the “telescreen”, installed in homes and public spaces, which could identify people and monitor and analyse their physiological and facial expressions. The device uncannily mirrors the conception and progression of Artificial Intelligence (AI) today, specifically Facial Recognition Technology (FRT): a biometric technology that identifies people by their unique facial features, comparing each captured reading against an existing database.

The Centre has approved the setting up of a national Automated Facial Recognition System (AFRS) under the National Crime Records Bureau (NCRB) for “criminal identification, verification among police organisations.” The need of the hour, therefore, is to regulate and govern both the collection and the analysis of personal data for FRT on the basis of the rule of law. As of today, there is no Act or regulation in the country that controls such data capture, storage and analysis. This is alarming, considering the size of India’s population and the rapidly expanding use of FRT in sectors such as commerce, airports, identification and crime control by the police. For instance, the Airports Authority of India plans to launch “Digi Yatra”, or facial biometric-based air travel, by the end of December. This disruptive technology raises two major concerns: first, is the data collected to improve the accuracy of FRT secure? Second, is the individual’s right to privacy compromised in a society where wide-scale surveillance feeds the FRT?
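
To make that matching step concrete, the simplified sketch below shows how a captured face reading, reduced to a feature vector, might be compared against an enrolled database. The embedding size, the similarity measure and the decision threshold here are purely illustrative assumptions, not a description of the AFRS or of any particular vendor’s system.

```python
# Illustrative sketch of FRT's matching step; not any real system's code.
# A deployed system would obtain the feature vectors from a trained
# face-recognition model; random vectors stand in for them here.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-feature vectors (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """Compare a captured reading (probe) against the enrolled database (gallery).

    Returns (identity, score) for the best match, or (None, score) if nothing
    clears the threshold; the threshold is the setting that trades false
    matches against missed matches.
    """
    best_id, best_score = None, -1.0
    for person_id, enrolled in gallery.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)


# Hypothetical usage with stand-in vectors.
rng = np.random.default_rng(0)
gallery = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
probe = gallery["person_A"] + 0.05 * rng.normal(size=128)  # a noisy new capture
print(identify(probe, gallery))
```

In a real deployment, the concerns raised above arise precisely at this point: where the enrolled database comes from, how long it is retained and who reviews a borderline score.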

The Supreme Court, in the KS Puttaswamy vs Union of India case, affirmed the Right to Privacy, stated that it extends to public spaces and held that citizens cannot be subjected to unlawful collection or exploitation of their personal data. The court clarified that any infringement of the Right to Privacy, such as collecting personal data for law enforcement purposes, must meet the proportionality standard: any collection and analysis of data must be in pursuit of a legitimate aim and must be both necessary and proportionate. India’s Personal Data Protection Bill, 2019, includes “facial images” within biometric data under Clause 3(7), and such biometric data is categorised as “sensitive personal data” under Clause 3(36). Personal data cannot be processed without the consent of the data principal, as per Clause 11. The Bill addresses some of the privacy concerns and incorporates laudable provisions such as the rights of correction and erasure (Clauses 18 and 58) where biometric data (FRT) has been obtained without proper consent or is incorrect. Further, Clause 9 of the Bill requires data fiduciaries not to retain any personal data beyond the period necessary for the purpose for which it is processed and to delete it at the end of processing.

While formulating home-grown regulations on the security and privacy of FRT data, India can look to the European Union’s General Data Protection Regulation (GDPR), under which the processing of biometric data for FRT requires explicit consent. The European Data Protection Board has issued guidelines under the GDPR on the “processing of personal data through video devices.” The guidelines incorporate Article 13 of the GDPR and make video surveillance specific to a purpose rather than a “roaming exercise impinging on the rights of the data subjects.” Data subjects must be made aware that video surveillance is in operation, for instance via warning signs over monitored premises. In framing its own FRT regulation regime, India, too, can consider such transparent, full-disclosure methods and the establishment of a similar regulatory board.

Globally, countries are becoming increasingly cautious about FRT. The UK’s Information Commissioner has directed the police to “justify” the use of FRT and has called on the Government for a statutory, binding code of practice. In the US, the cities of San Francisco, Berkeley, Portland and Boston have banned the use of FRT by Government authorities and private businesses. The State of Washington, however, has struck a middle ground, passing a law that regulates FRT through “meaningful human review”: AI is subjected to human review so that any discrepancies or errors by the machine can be rectified and the data securely managed.

Because algorithms evolve, the legal framework and regulations need to take the sophistication of the machine into account while formulating policies for its deployment; the task can be compared to building regulatory embankments along a constant digital flow of interconnections. To prevent an intrusive evolution of the technology, a regulation-driven Act supervised by a body similar to the European Data Protection Board would be a good move. These regulations must leave room for constant review and upgradation, keeping at their core the protection of an individual’s data and the legitimate processing of such data as per the terms of consent. AI is at a tipping point. For society and individuals to derive the most benefit from it, the technology must remain a means of serving human needs, not a way of turning citizens and consumers into means to an end.

 (The writer is a lawyer)
