Facial recognition technology in surveillance

DEFINITION: Computerized biometric identification technology that matches scans of facial features against images of faces stored in databases to identify individuals.

SIGNIFICANCE: Facial recognition technology is an advanced tool used by law-enforcement agencies and security specialists to verify the identities of individuals and to conduct surveillance. Along with other biometric identification technologies, such as fingerprint- and eye-scanning systems, facial recognition systems are becoming increasingly important components of security measures in a number of environments.

Facial recognition technology (FRT), also known as face biometrics, is employed as a security measure in many public venues, particularly in airports, sporting arenas, and high-crime areas. Several countries—including Australia, the United Kingdom, and the United States—use digital cameras and closed-circuit television (CCTV) surveillance in strategically chosen public venues to capture images of faces, which are then reduced to digital codes and compared with images in databases of wanted persons. In addition to identifying and tracking known criminals, such surveillance along with FRT may act as a general deterrent to would-be criminals. Security specialists also use FRT in various business and government organizations to limit access to sensitive projects, technologies, and products.


After an FRT device captures an image of a face using a digital camera or CCTV, the image is scanned and facial recognition software converts it to a face code, also known as a face print. Facial recognition software applies geometric and mathematical calculations based on a number of points identified on the facial image; these calculations vary depending on the software, the mathematical program implemented, and the number of points entered into the calculations. Generally, the face print is based on the angles of a face and the distances between the eyes, nose, and mouth. Some calculations are more resistant than others to errors caused by changes in angle, lighting, or expression or by changes in appearance, such as the addition of a beard or glasses.
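The face-print idea described above can be sketched in simplified form. The landmark coordinates, the choice of distances, and the normalization scheme below are illustrative assumptions, not any vendor's actual algorithm; real systems locate many more points with a landmark-detection model.

```python
import math

# Hypothetical landmark coordinates (x, y) in pixels; a real system would
# locate these with a landmark-detection model, not hard-code them.
landmarks = {
    "left_eye": (110.0, 120.0),
    "right_eye": (190.0, 118.0),
    "nose_tip": (150.0, 170.0),
    "mouth_center": (150.0, 215.0),
}

def distance(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def face_print(pts):
    """Reduce landmarks to a small feature vector.

    Each distance is divided by the inter-eye distance, so the same face
    yields a similar print regardless of how far it is from the camera.
    """
    eye_span = distance(pts["left_eye"], pts["right_eye"])
    return [
        distance(pts["left_eye"], pts["nose_tip"]) / eye_span,
        distance(pts["right_eye"], pts["nose_tip"]) / eye_span,
        distance(pts["nose_tip"], pts["mouth_center"]) / eye_span,
    ]

print(face_print(landmarks))
```

Because the features are ratios, doubling every coordinate (a closer camera) leaves the print unchanged, which illustrates why such calculations tolerate changes in camera distance better than changes in head angle or expression.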

After a face print is established, the software uses artificial intelligence (AI) to interface with a database of stored facial images established by the Federal Bureau of Investigation (FBI) or with another database created by a security firm or by a state or local law-enforcement agency. For the FRT to identify an individual, a matching face print must exist in the database for comparison. Increasingly, software is used that enables internet-based searches for facial matches. Such databases contain billions of images gathered from social media and other online sources, meaning any individual with a photo online can potentially be pulled into a criminal investigation. When FRT is used for surveillance, after a face print is matched to a face in the database, the system alerts police or security personnel so that they can detain or track the person until his or her identity can be established.
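The matching step can be sketched as a nearest-neighbor search with a distance threshold. The enrolled prints, the threshold value, and the use of plain Euclidean distance are assumptions for illustration; production databases with billions of entries rely on approximate-nearest-neighbor indexes rather than a linear scan.

```python
import math

# Hypothetical enrolled face prints keyed by identity.
database = {
    "subject_A": [0.80, 0.81, 0.56],
    "subject_B": [0.95, 0.94, 0.61],
}

# Assumed tolerance: raising it risks false alarms, lowering it risks misses.
MATCH_THRESHOLD = 0.05

def best_match(probe):
    """Return (identity, distance) of the closest enrolled print,
    or (None, distance) when nothing falls within the threshold."""
    best_id, best_dist = None, float("inf")
    for identity, enrolled in database.items():
        d = math.dist(probe, enrolled)  # Euclidean distance between prints
        if d < best_dist:
            best_id, best_dist = identity, d
    if best_dist > MATCH_THRESHOLD:
        return None, best_dist  # no matching face print in the database
    return best_id, best_dist

print(best_match([0.80, 0.81, 0.56]))
print(best_match([0.50, 0.50, 0.50]))
```

The `None` branch reflects the limitation noted above: if no matching face print exists in the database, the system cannot identify the individual no matter how good the captured image is.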

One criticism of FRT is that all systems developed thus far have consistent error rates because of differences between the face prints in the databases and the face prints calculated based on the images captured by the systems’ cameras. These differences are often caused by changes in the angles of people’s faces as they pass surveillance points, by changes in facial expressions, and by facial disguises. The consequence of an error can be a false alarm or a miss. A false alarm identifies an individual as wanted when he or she is not, and a miss allows a person who is wanted (or banned or otherwise excluded) to avoid detection. In addition, facial recognition has different accuracy rates for different races; Black Americans in particular have been shown to experience higher error rates than their White counterparts. The 2018 project Gender Shades, which tested the accuracy of several classification algorithms, showed that the algorithms performed with error rates up to 34 percent higher when recognizing darker-skinned females than when recognizing lighter-skinned males. Critics argue that this and other reports prove a bias within FRT that, when paired with discriminatory law-enforcement practices, could prove detrimental to already marginalized populations. In the 2020s, several incidents of Black people wrongfully arrested on the basis of FRT matches came to light. In 2023, at least five lawsuits for wrongful arrests based on the use of facial recognition technology were filed against police departments in various cities across the US, three of which were filed in Detroit, Michigan. A 2024 Washington Post investigation caused further controversy by revealing that law-enforcement agencies often do not disclose their use of facial recognition software to people who have been arrested, raising questions of fairness, especially given the technology's unreliability and history of false arrests. Some states have passed legislation that requires police to disclose the use of FRT; however, such legislation is often not enforced.
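The two error types described above can be quantified from the outcomes of a screening trial. The counts below are invented for illustration; real audits such as Gender Shades report these rates broken down by demographic group, which is how the disparities cited above were measured.

```python
# Hypothetical outcome counts from a watchlist-screening trial.
trials = {
    "wanted_flagged": 88,     # wanted person correctly flagged (hit)
    "wanted_missed": 12,      # wanted person passed undetected (miss)
    "innocent_flagged": 30,   # bystander wrongly flagged (false alarm)
    "innocent_passed": 970,   # bystander correctly ignored
}

def miss_rate(t):
    """Share of wanted persons the system failed to flag."""
    return t["wanted_missed"] / (t["wanted_missed"] + t["wanted_flagged"])

def false_alarm_rate(t):
    """Share of innocent passers-by wrongly flagged."""
    return t["innocent_flagged"] / (t["innocent_flagged"] + t["innocent_passed"])

print(f"miss rate: {miss_rate(trials):.1%}")
print(f"false alarm rate: {false_alarm_rate(trials):.1%}")
```

Note that the two rates have different denominators: a system screening large crowds can produce many false alarms in absolute terms even when its false-alarm rate looks small, which is central to the fairness concerns raised by critics.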

Supporters of the use of facial recognition systems note that FRT provides a relatively nonintrusive way to verify and authenticate a person’s identity and assert that it is a valuable tool against terrorist attacks and other criminal activity. Opponents of the use of FRT point to high error rates and general ineffectiveness; they suggest that the technology in its current state is unacceptable, given its potential for abuse and the threat that its use poses to individuals’ privacy.

Bibliography

"Black Plaintiffs File Lawsuits for Wrongful Arrests or Jailing Due to Facial Recognition Technology." NBC News, 25 Sept. 2023, www.nbcnews.com/news/nbcblk/black-plaintiffs-file-lawsuits-wrongful-arrests-jailing-due-facial-rec-rcna117168. Accessed 21 Nov. 2023.

Harnois, Meena N. Facial Recognition Technology. Nova Science Publishers, 2013.

Hill, Kashmir. "Eight Months Pregnant and Arrested After False Facial Recognition Match." The New York Times, 6 Aug. 2023, www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html. Accessed 21 Nov. 2023.

MacMillan, Douglas, et al. "Police Seldom Disclose Use of Facial Recognition Despite False Arrests." The Washington Post, 6 Oct. 2024, www.washingtonpost.com/business/2024/10/06/police-facial-recognition-secret-false-arrest/. Accessed 13 Nov. 2024.

Markowitz, Eric. “A Face Only Big Brother Could Love.” Newsweek Global, 29 Apr. 2016, pp. 20–23. Canadian Reference Centre, search.ebscohost.com/login.aspx?direct=true&db=rch&AN=114614724&site=eds-live. Accessed 30 Dec. 2016.

Najibi, Alex. "Racial Discrimination in Face Recognition Technology." Harvard University, 24 Oct. 2020, sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/. Accessed 31 Oct. 2022.

Torr, James D., ed. Homeland Security: Opposing Viewpoints. Thomson Gale, 2004.

Vacca, John R. Biometric Technologies and Verification Systems. Elsevier, 2007.

Woodward, John D., Jr., et al. Biometrics: A Look at Facial Recognition. RAND, 2003.