Deepak Jha

Facial Recognition, a Violation of Privacy!



From unlocking an iPhone to auto-tagging Facebook photos to employers tracking productivity and police forces surveilling protests, facial recognition technology (FRT) is becoming increasingly embedded in everyday life. Yet even beyond the security benefits the technology is intended to provide, FRT can be considered a violation of people’s privacy and a source of discrimination. Although facial recognition has many legitimate uses, it can also enable far more invasive intrusions into privacy.


In one survey, respondents were asked whether there should be more oversight of how facial recognition technology is used to improve security systems in the public and private sectors: more than 71 percent supported additional oversight, 19 percent said no more oversight was needed, and 9.5 percent said that maybe there should be more control over its use.

Privacy Concerns in Facial Recognition Technology

  • Lack of consent: A basic principle of any data privacy law is that organizations must tell users what biometric data is being collected and obtain their consent to collect it. The most significant privacy implication of FRT is the use of the technology to identify individuals without their consent, including through applications such as real-time public surveillance or the aggregation of databases that were not lawfully assembled.

  • Lack of transparency: Using FRT to identify individuals without their knowledge or consent raises privacy concerns, especially since biometrics are unique to an individual. Furthermore, it poses additional concerns because, unlike other biometrics (e.g., fingerprints), facial scans can be captured easily, remotely, and secretly.

  • Unencrypted faces: Faces are becoming easier to capture at a distance and cheaper to collect and store. Unlike a password, the face itself is always on public display and cannot be encrypted or kept secret, and unlike passwords and credit card numbers, it cannot easily be changed. Data breaches involving facial recognition data therefore increase the potential for identity theft, stalking, and harassment.

  • Inaccuracy: Misidentification is another common critique of FRT. A facial scan that misidentifies someone can have long-term consequences. Moreover, accuracy varies by demographic, with false positive rates tending to be highest for women and people of color, which can lead to unjust arrests in criminal contexts (a short sketch after this list shows how such per-group error rates are measured).

  • Technical vulnerabilities: FRT systems can be spoofed (i.e., an attacker masquerades as a victim) through presentation attacks, using physical artifacts such as printed photos or three-dimensional masks created from imagery of the victim, or digital spoofs such as deepfakes.
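
To make the accuracy concern concrete, the minimal sketch below shows how a false positive rate can be broken down by demographic group over a set of non-matching ("impostor") comparisons. The group labels, similarity scores, and threshold are illustrative assumptions, not measurements from any real system.

```python
from collections import defaultdict

# Hypothetical impostor comparisons: pairs of different people scored by a face
# matcher, tagged with an (assumed) demographic group label. All values are
# illustrative, not data from a real FRT evaluation.
impostor_comparisons = [
    {"group": "group_a", "score": 0.62},
    {"group": "group_a", "score": 0.41},
    {"group": "group_b", "score": 0.83},
    {"group": "group_b", "score": 0.77},
    {"group": "group_b", "score": 0.35},
]

MATCH_THRESHOLD = 0.75  # assumed operating threshold of the matcher


def false_positive_rates(comparisons, threshold):
    """Fraction of impostor comparisons wrongly accepted as matches, per group."""
    totals = defaultdict(int)
    false_positives = defaultdict(int)
    for c in comparisons:
        totals[c["group"]] += 1
        if c["score"] >= threshold:  # an impostor scoring above threshold is a false match
            false_positives[c["group"]] += 1
    return {group: false_positives[group] / totals[group] for group in totals}


if __name__ == "__main__":
    for group, fpr in false_positive_rates(impostor_comparisons, MATCH_THRESHOLD).items():
        print(f"{group}: false positive rate = {fpr:.2f}")
```

Large-scale demographic audits of face matchers measure exactly this kind of disparity, only across millions of comparisons rather than a toy list.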

Privacy Principles for FRT

Organizations are working to streamline the use of FRT in their commercial applications for security, safety, authentication, access control, identification, data storage and management, and controlled access to personally identifiable information (PII). Privacy principles for FRT have been designed to drive responsible data use by the enterprises and online platforms that develop and use FRT in commercial settings. These principles include:

  • Consent: Enterprises should obtain express, affirmative consent when enrolling an individual in a program that uses FRT for verification or identification, or when identifying an individual to third parties that would not otherwise have known that individual’s identity.

  • Use: Enterprises should commit to collecting, using and sharing facial recognition data in ways that are compatible with reasonable consumer expectations for the context in which the data was collected.

  • Transparency: Enterprises should provide consumers with meaningful notice about how the facial recognition software templates are created and how such data will be used, stored, shared, maintained, and destroyed.

  • Privacy by design: Enterprises should seek to implement technological controls that support or enforce compliance with these principles in addition to policy, legal and administrative measures.

  • Accountability: Enterprises should take reasonable steps to ensure that their use of FRT and facial recognition data, including use in partnership with third-party service providers or business partners, adheres to these principles.

  • Data security: Enterprises should maintain a comprehensive data security program, reasonably designed to protect the privacy, security, confidentiality, and integrity of personal information against risk, using technological, administrative, and physical safeguards appropriate to the sensitivity of the information (see the sketch after this list).

  • Integrity and access: Enterprises should implement reasonable measures to maintain the accuracy of facial recognition data, offer individuals reasonable access to review and request correction of inaccurate identity labeling, and provide the ability to request deletion of facial recognition data.
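
As a rough illustration of how the consent, data security, and integrity-and-access principles might look in code, the sketch below gates enrollment on an explicit consent flag, encrypts the facial template before storing it, and honors deletion requests. The enroll and delete_template functions, the in-memory store, and the use of the cryptography package's Fernet recipe are assumptions made for illustration, not a prescribed implementation.

```python
from cryptography.fernet import Fernet  # symmetric encryption recipe from the 'cryptography' package

# In production the key would live in a key management service, never in source code.
STORAGE_KEY = Fernet.generate_key()
fernet = Fernet(STORAGE_KEY)

# Illustrative in-memory store; a real system would use a hardened, access-controlled database.
template_store = {}


def enroll(user_id: str, facial_template: bytes, consent_given: bool) -> bool:
    """Enroll a facial template only with express consent, and store it encrypted."""
    if not consent_given:
        # Consent principle: no enrollment without an affirmative opt-in.
        return False
    # Data security principle: never persist the raw biometric template.
    template_store[user_id] = fernet.encrypt(facial_template)
    return True


def delete_template(user_id: str) -> None:
    """Integrity and access principle: honor requests to delete facial recognition data."""
    template_store.pop(user_id, None)


if __name__ == "__main__":
    enrolled = enroll("user-123", b"example-embedding-bytes", consent_given=True)
    print("enrolled:", enrolled, "| stored encrypted:", "user-123" in template_store)
    delete_template("user-123")
    print("after deletion request, still stored:", "user-123" in template_store)
```

The same pattern extends to the other principles: transparency and accountability are about documenting and auditing what such code does, while privacy by design means these checks are built in rather than bolted on.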

The development of FRT has become a prominent global issue. The regulatory road forward in this fast-growing area is likely to focus on ensuring that adequate safeguards are in place to prevent abuse of FRT and to protect privacy.
