Facial recognition is a biometric security system capable of identifying or verifying a person from a digital image or a video frame from a video source. There are multiple methods in which facial recognition systems work, but in general, they work by comparing selected facial features from a given image with faces within a database.[1][2][3]
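In practice the specific features and matching rules vary widely between systems, but the core database-comparison step can be sketched as follows. The snippet below is only a minimal illustration in Python: the feature vectors are random stand-ins for whatever descriptors a real system extracts, and the acceptance threshold is an arbitrary value chosen for the example.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, database, threshold=0.8):
    """Return the best-matching enrolled identity, or None if no match clears the threshold."""
    best_name, best_score = None, -1.0
    for name, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Toy usage: random vectors stand in for extracted facial features.
rng = np.random.default_rng(0)
db = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
probe = db["alice"] + rng.normal(scale=0.05, size=128)  # a noisy re-capture of "alice"
print(identify(probe, db))  # -> alice
```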
History of facial recognition technology
During 1964 and 1965, Bledsoe, along with Helen Chan and Charles Bisson, worked on using the computer to recognize human faces. He was proud of this work, but because the funding was provided by an unnamed intelligence agency that did not allow much publicity, little of the work was published.[4] Based on the available references, it was revealed that Bledsoe's initial approach involved the manual marking of various landmarks on the face such as the eye centers, mouth, etc., and these were mathematically rotated by computer to compensate for pose variation.[4] The distances between landmarks were also automatically computed and compared between images to determine identity.[4]
Given a large database of images and a photograph, the problem was to select from the database a small set of records such that one of the image records matched the photograph. The success of the method could be measured in terms of the ratio of the size of the answer list to the number of records in the database. Bledsoe (1966a) described the following difficulties:
“This recognition problem is made difficult by the great variability in head rotation and tilt, lighting intensity and angle, facial expression, aging, etc. Some other attempts at face recognition by machine have allowed for little or no variability in these quantities. Yet the method of correlation (or pattern matching) of unprocessed optical data, which is often used by some researchers, is certain to fail in cases where the variability is great. In particular, the correlation is very low between two pictures of the same person with two different head rotations.”
— Woody Bledsoe, 1966
This project was labeled man-machine because the human extracted the coordinates of a set of features from the photographs, which were then used by the computer for recognition. Using a graphics tablet, the operator would extract the coordinates of features such as the center of the pupils, the inside corner of the eyes, the outside corner of the eyes, the point of the widow's peak, and so on. From these coordinates, a list of 20 distances, such as the width of the mouth and the width of the eyes, pupil to pupil, was computed. These operators could process about 40 pictures an hour. When building the database, the name of the person in the photograph was associated with the list of computed distances and stored in the computer. In the recognition phase, the set of distances was compared with the corresponding distance for each photograph, yielding a distance between the photograph and the database record. The closest records were returned.
Because it is unlikely that any two pictures would match in head rotation, lean, tilt, and scale (distance from the camera), each set of distances is normalized to represent the face in a frontal orientation. To accomplish this normalization, the program first tries to determine the tilt, the lean, and the rotation. Then, using these angles, the computer undoes the effect of these transformations on the computed distances. To compute these angles, the computer must know the three-dimensional geometry of the head. Because the actual heads were unavailable, Bledsoe (1964) used a standard head derived from measurements on seven heads.
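A minimal sketch of the matching step just described, assuming the pose normalization has already been applied: each database record holds the 20 hand-measured distances as a vector, and the photograph is matched to the records with the smallest vector distance (Euclidean distance is used here for illustration; the exact measure Bledsoe used is not specified in this account).

```python
import math

def euclidean(a, b):
    """Distance between two equal-length lists of facial measurements."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_records(photo_distances, database, k=3):
    """Return the k enrolled records whose distance vectors best match the photograph.

    `database` is a list of (name, distances) pairs; the distances are assumed
    to have already been normalized to a frontal orientation as described above.
    """
    ranked = sorted(database, key=lambda record: euclidean(photo_distances, record[1]))
    return ranked[:k]
```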
After Bledsoe left PRI in 1966, this work was continued at the Stanford Research Institute, primarily by Peter Hart. In experiments performed on a database of over 2000 photographs, the computer consistently outperformed humans when presented with the same recognition tasks (Bledsoe 1968). Peter Hart (1996) enthusiastically recalled the project with the exclamation, "It really worked!"
By about 1997, the system developed by Christoph von der Malsburg and graduate students of the University of Bochum in Germany and the University of Southern California in the United States outperformed most systems, with those of the Massachusetts Institute of Technology and the University of Maryland rated next. The Bochum system was developed through funding by the United States Army Research Laboratory. The software was sold as ZN-Face and used by customers such as Deutsche Bank and operators of airports and other busy locations. The software was "robust enough to make identifications from less-than-perfect face views. It can also often see through such impediments to identification as mustaches, beards, changed hairstyles and glasses—even sunglasses".[5]
In 2006, the performance of the latest face recognition algorithms was evaluated in the Face Recognition Grand Challenge (FRGC). High-resolution face images, 3-D face scans, and iris images were used in the tests. The results indicated that the new algorithms are 10 times more accurate than the face recognition algorithms of 2002 and 100 times more accurate than those of 1995. Some of the algorithms were able to outperform human participants in recognizing faces and could uniquely identify identical twins.[6][7]
U.S. Government-sponsored evaluations and challenge problems[8] have helped spur improvements of over two orders of magnitude in face-recognition system performance. Since 1993, the error rate of automatic face-recognition systems has decreased by a factor of 272. The reduction applies to systems that match people with face images captured in studio or mugshot environments. In Moore's law terms, the error rate decreased by one-half every two years.[9]
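The two figures are consistent with each other. As a rough back-of-the-envelope check (the exact span of years is an assumption here, since the source only says "since 1993"), a 272-fold reduction corresponds to about eight halvings, which over roughly 16 to 18 years works out to approximately one halving every two years:

```python
import math

reduction = 272                      # reported error-rate reduction since 1993
halvings = math.log2(reduction)      # about 8.1 halvings in total
for years in (16, 17, 18):           # assumed span; the source only says "since 1993"
    print(years, "years ->", round(years / halvings, 2), "years per halving")
# prints values close to 2, i.e. the error rate halves roughly every two years
```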
Low-resolution images of faces can be enhanced using face hallucination.
Face ID
- Main article: Face ID
Apple introduced Face ID on the iPhone X as a biometric authentication successor to Touch ID, a fingerprint-based system. Face ID has a facial recognition sensor that consists of two parts: a "Romeo" module that projects more than 30,000 infrared dots onto the user's face, and a "Juliet" module that reads the pattern.[10] The pattern is sent to a local "Secure Enclave" in the device's central processing unit (CPU) to confirm a match with the phone owner's face.[11] The facial pattern is not accessible by Apple. The system will not work with eyes closed, in an effort to prevent unauthorized access.[11]
The technology learns from changes in a user's appearance, and therefore works with hats, scarves, glasses, many sunglasses, beards, and makeup.[12]
It also works in the dark. This is done by using a "Flood Illuminator", which is a dedicated infrared flash that throws out invisible infrared light onto the user's face to properly read the 30,000 facial points.[13]
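Apple has not published the details of the matching step, so the following is only a conceptual sketch in Python (hypothetical function names, arbitrary threshold) of the kind of on-device check described above: a freshly captured infrared depth pattern is compared against the enrolled template, and the device unlocks only if the similarity clears a threshold.

```python
import numpy as np

UNLOCK_THRESHOLD = 0.95  # arbitrary illustrative value, not Apple's actual criterion

def capture_depth_map(size=(64, 64)):
    """Stand-in for the IR dot projector and camera; returns a depth-map array."""
    return np.random.default_rng().random(size)

def similarity(captured, enrolled):
    """Toy similarity score in [0, 1] based on mean absolute depth difference."""
    return 1.0 - float(np.mean(np.abs(captured - enrolled)))

def attempt_unlock(enrolled_template):
    """Compare a fresh capture with the enrolled template (the role the text
    attributes to the Secure Enclave) and unlock only on a strong match."""
    captured = capture_depth_map(enrolled_template.shape)
    return similarity(captured, enrolled_template) >= UNLOCK_THRESHOLD
```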
Advantages and disadvantages
Compared to other biometric systems
One key advantage of a facial recognition system is that it is able to perform mass identification, as it does not require the cooperation of the test subject to work. Properly designed systems installed in airports, multiplexes, and other public places can identify individuals among the crowd, without passers-by even being aware of the system.[14]
However, compared to other biometric techniques, face recognition may not be the most reliable and efficient. Quality measures are very important in facial recognition systems, as large degrees of variation are possible in face images. Factors such as illumination, expression, pose and noise during face capture can affect the performance of facial recognition systems.[14] Among all biometric systems, facial recognition has the highest false acceptance and rejection rates,[14] and questions have therefore been raised about the effectiveness of face recognition software in cases of railway and airport security.[citation needed]
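The false acceptance and false rejection rates mentioned above have standard definitions that a short sketch can make concrete (the scores and threshold below are purely illustrative): the false acceptance rate is the fraction of impostor attempts the system accepts, and the false rejection rate is the fraction of genuine attempts it rejects.

```python
def far_frr(attempts, threshold):
    """Compute false acceptance and false rejection rates.

    `attempts` is a list of (score, is_genuine) pairs, where `score` is the
    match score returned by the recognizer and `is_genuine` says whether the
    probe really was the enrolled person. A score >= threshold counts as accept.
    """
    impostor_scores = [s for s, genuine in attempts if not genuine]
    genuine_scores = [s for s, genuine in attempts if genuine]
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Illustrative scores only: genuine attempts tend to score higher than impostors.
data = [(0.91, True), (0.85, True), (0.62, True), (0.40, False), (0.71, False)]
print(far_frr(data, threshold=0.7))  # -> (0.5, 0.333...)
```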
Controversies
Privacy violations
Civil rights organizations and privacy campaigners such as the Electronic Frontier Foundation,[15] Big Brother Watch[16] and the ACLU[17] express concern that privacy is being compromised by the use of surveillance technologies. Some fear that it could lead to a “total surveillance society,” with the government and other authorities having the ability to know the whereabouts and activities of all citizens around the clock. This knowledge has been, is being, and could continue to be deployed to prevent the lawful exercise of rights of citizens to criticize those in office, specific government policies or corporate practices. Many centralized power structures with such surveillance capabilities have abused their privileged access to maintain control of the political and economic apparatus, and to curtail populist reforms.[18]
Face recognition can be used not just to identify an individual, but also to unearth other personal data associated with an individual – such as other photos featuring the individual, blog posts, social networking profiles, Internet behavior, travel patterns, etc. – all through facial features alone.[19] Concerns have been raised over who would have access to the knowledge of one's whereabouts and people with them at any given time.[20] Moreover, individuals have limited ability to avoid or thwart face recognition tracking unless they hide their faces. This fundamentally changes the dynamic of day-to-day privacy by enabling any marketer, government agency, or random stranger to secretly collect the identities and associated personal information of any individual captured by the face recognition system.[19] Consumers may not understand or be aware of what their data is being used for, which denies them the ability to consent to how their personal information gets shared.[20]
Face recognition was used in Russia to harass women allegedly involved in online pornography.[21] The Russian app 'FindFace' can identify faces with about 70% accuracy using photos from the social media platform VK. Such an app would not be possible in countries that do not use VK, because other social media platforms do not store photos in the same way.[22]
In July 2012, a hearing was held before the Subcommittee on Privacy, Technology and the Law of the Committee on the Judiciary, United States Senate, to address issues surrounding what face recognition technology means for privacy and civil liberties.[23]
In 2014, the National Telecommunications and Information Administration (NTIA) began a multi-stakeholder process to engage privacy advocates and industry representatives to establish guidelines regarding the use of face recognition technology by private companies.[24] In June 2015, privacy advocates left the bargaining table over what they felt was an impasse, based on the industry representatives being unwilling to agree to consent requirements for the collection of face recognition data.[25] The NTIA and industry representatives continued without the privacy representatives, and draft rules were expected to be presented in the spring of 2016.[26]
In July 2015, the United States Government Accountability Office produced a Report to the Ranking Member, Subcommittee on Privacy, Technology and the Law, Committee on the Judiciary, U.S. Senate. The report discussed facial recognition technology's commercial uses, privacy issues, and the applicable federal law. It notes that issues concerning facial recognition technology had been raised before, and points to the need for updated federal privacy laws that keep pace with the degree and impact of advanced technologies. Also, some industry, government, and private organizations are in the process of developing, or have developed, "voluntary privacy guidelines". These guidelines vary between the groups, but overall aim to gain consent and inform citizens of the intended uses of facial recognition technology. This helps counteract the privacy issues that arise when citizens are unaware of how and where their personal data is being used, which the report identifies as a prevalent issue.[20]
The largest concern with the development of biometric technology, and more specifically facial recognition, has to do with privacy. The rise in facial recognition technologies has led people to be concerned that large companies, such as Google or Apple, or even government agencies will be using it for mass surveillance of the public. Regardless of whether or not they have committed a crime, people in general do not wish to have their every action watched or tracked. People tend to believe that, since we live in a free society[citation needed], we should be able to go out in public without the fear of being identified and surveilled. People worry that with the rising prevalence of facial recognition, they will begin to lose their anonymity.[citation needed]
On August 11, 2020, a UK court ruled that facial recognition technology violates human rights. The ruling does not suspend the use of all facial recognition technology, but rather, states that better parameters need to be put in place as to when it can be used.[27]
Facebook DeepFace
Social media web sites such as Facebook have very large numbers of photographs of people, annotated with names. This represents a database which may be abused by governments for face recognition purposes.[28] Facebook's DeepFace has become the subject of several class action lawsuits under the Biometric Information Privacy Act, with claims alleging that Facebook is collecting and storing face recognition data of its users without obtaining informed consent, in direct violation of the Biometric Information Privacy Act.[29] The most recent case was dismissed in January 2016 because the court lacked jurisdiction.[30] Therefore, it is still unclear if the Biometric Information Privacy Act will be effective in protecting biometric data privacy rights.
In December 2017, Facebook rolled out a new feature that notifies a user when someone uploads a photo that includes what Facebook thinks is their face, even if they are not tagged. Facebook has attempted to frame the new functionality in a positive light, amidst prior backlashes.[31] Facebook's head of privacy, Rob Sherman, addressed this new feature as one that gives people more control over their photos online. “We’ve thought about this as a really empowering feature,” he says. “There may be photos that exist that you don’t know about.” [32]
Imperfect technology in law enforcement
All over the world, law enforcement agencies have begun using facial recognition software to aid in identifying criminals. For example, the Chinese police force was able to identify twenty-five wanted suspects using facial recognition equipment at the Qingdao International Beer Festival, one of whom had been on the run for 10 years.[33] The equipment works by recording a 15-second video clip and taking multiple snapshots of the subject. That data is compared and analyzed against images from the police department's database, and within 20 minutes the subject can be identified with 98.1% accuracy.[34]
It is still contested whether facial recognition technology works less accurately on people of color.[35] One study by Joy Buolamwini (MIT Media Lab) and Timnit Gebru (Microsoft Research) found that the error rate for gender recognition for women of color within three commercial facial recognition systems ranged from 23.8% to 36%, whereas for lighter-skinned men it was between 0.0% and 1.6%. Overall accuracy rates for identifying men (91.9%) were higher than for women (79.4%), and none of the systems accommodated a non-binary understanding of gender.[36] However, another study showed that several commercial facial recognition programs sold to law enforcement offices around the country had a lower false non-match rate for black people than for white people.[37]
Experts fear that the new technology may actually be hurting the communities the police claim they are trying to protect.[38] It is considered an imperfect biometric, and Georgetown University researcher Clare Garvie concluded in a study that "there’s no consensus in the scientific community that it provides a positive identification of somebody.”[39]
Given such large margins of error in this technology, both legal advocates and facial recognition software companies say that the technology should only supply a portion of the case – not evidence that can lead to the arrest of an individual.[39]
The lack of regulations requiring facial recognition technology companies to test for racial bias can be a significant flaw in the technology's adoption by law enforcement. CyberExtruder, a company that markets itself to law enforcement, said that it had not performed testing or research on bias in its software. CyberExtruder did note that some skin colors are more difficult for the software to recognize with current limitations of the technology. “Just as individuals with very dark skin are hard to identify with high significance via facial recognition, individuals with very pale skin are the same,” said Blake Senftner, a senior software engineer at CyberExtruder.[39]
In 2018, the Scottish government created a code of practice which dealt with privacy issues and won praise from the Open Rights Group.[40][41]
UK Information Commissioner's King's Cross investigation
In 2019 the Financial Times first reported that facial recognition software was in use in the King's Cross area of London.[42] The development around London's King's Cross mainline station includes shops, offices, Google's UK HQ and part of St Martin's College. The BBC reported that the ICO said: "Scanning people's faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all."[43][44] Elizabeth Denham, the UK Information Commissioner, launched an investigation into the use of the King's Cross facial recognition system, operated by the company Argent. "This is inherently a surveillance tool that bends towards authoritarianism," said Silkie Carlo of Big Brother Watch. In September 2019 Argent announced that facial recognition software would no longer be used at King's Cross. Argent claimed that the software had been deployed between May 2016 and March 2018 on two cameras covering a pedestrian street running through the centre of the development. The Guardian reported that the decision to switch off the facial recognition system was a result of public concern about its deployment.[45] Facial recognition software has also been used at Meadowhall shopping centre in Sheffield, the World Museum in Liverpool and the Millennium Point complex in Birmingham.[46]
Sweden 2019 GDPR violation
The Swedish Data Protection Authority (DPA) issued its first-ever financial penalty for a violation of the EU's General Data Protection Regulation (GDPR) against a school that was using the technology to replace time-consuming roll calls during class. The DPA found that the school illegally obtained the biometric data of its students without completing an impact assessment. In addition, the school did not make the DPA aware of the pilot scheme. A 200,000 SEK fine (€19,000/$21,000) was issued.[47]
Bans
In May 2019, San Francisco, California became the first major United States city to ban the use of facial recognition software by police and other local government agencies.[48] San Francisco Supervisor Aaron Peskin introduced regulations that require agencies to gain approval from the San Francisco Board of Supervisors to purchase surveillance technology.[49] The regulations also require that agencies publicly disclose the intended use for new surveillance technology.[49] In June 2019, Somerville, Massachusetts became the first city on the East Coast to ban face surveillance software for government use,[50] specifically in police investigations and municipal surveillance.[51] In July 2019, Oakland, California banned the use of facial recognition technology by city departments.[52]
The American Civil Liberties Union (ACLU) has campaigned across the United States for transparency in surveillance technology[51] and has supported both San Francisco's and Somerville's bans on facial recognition software. The ACLU works to challenge the secrecy surrounding surveillance conducted with this technology.[citation needed]
In January 2020, the European Union suggested, but then quickly scrapped, a proposed moratorium on facial recognition in public spaces.[53][54]
During the George Floyd protests, use of facial recognition by city government was banned in Boston, Massachusetts.[55]
As of June 10, 2020, municipal use has been banned in:[56]
- Berkeley, California
- Oakland, California
- Boston, Massachusetts - June 30, 2020[57]
- Brookline, Massachusetts
- Cambridge, Massachusetts
- Northampton, Massachusetts
- Springfield, Massachusetts
- Somerville, Massachusetts
- Portland, Oregon - September 2020[58]
References
- ↑ "What is Facial Recognition? - Definition from Techopedia", Techopedia.com. (in en)
- ↑ Andrew Heinzman. How Does Facial Recognition Work? (en-US).
- ↑ How does facial recognition work? (en).
- ↑ 4.0 4.1 4.2 Elsevier (2007), pp. 264–265.
- ↑ Mugspot Can Find A Face In The Crowd -- Face-Recognition Software Prepares To Go To Work In The Streets (12 November 1997). Retrieved on 2007-11-06.
- ↑ Williams, Mark. Better Face-Recognition Software. Retrieved on 2008-06-02.
- ↑ R. Kimmel and G. Sapiro (30 April 2003). The Mathematics of Face Recognition. SIAM News.
- ↑ Face Homepage. nist.gov (2011-01-21).
- ↑ Crawford, Mark. Facial recognition progress report. SPIE Newsroom. Retrieved on 2011-10-06.
- ↑ Kubota, Yoko. "Apple iPhone X Production Woe Sparked by Juliet and Her Romeo", Wall Street Journal, 2017-09-27. (in en-US)
- ↑ 11.0 11.1 "The five biggest questions about Apple's new facial recognition system", The Verge.
- ↑ "Apple's Face ID Feature Works With Most Sunglasses, Can Be Quickly Disabled to Thwart Thieves". (in en)
- ↑ Heisler, Yoni. "Infrared video shows off the iPhone X's new Face ID feature in action", BGR, 2017-11-03. (in en-US)
- ↑ 14.0 14.1 14.2 "Top Five Biometrics: Face, Fingerprint, Iris, Palm and Voice", Bayometric, 2017-01-23. (in en-US)
- ↑ EFF Sues FBI For Access to Facial-Recognition Records. Electronic Frontier Foundation (2013-06-26).
- ↑ Face Off: The lawless growth of facial recognition in UK policing.
- ↑ Q&A On Face-Recognition. American Civil Liberties Union.
- ↑ Civil Liberties & Facial Recognition Software pp. 2. About.com, The New York Times Company. Archived from the original on 1 March 2006. Retrieved on 2007-09-17. “A few examples which have already arisen from surveillance video are: using license plates to blackmail gay married people, stalking women, tracking estranged spouses...”
- ↑ 19.0 19.1 Harley Geiger (6 December 2011). Facial Recognition and Privacy. Center for Democracy & Technology. Retrieved on 2012-01-10.
- ↑ 20.0 20.1 20.2 Cackley, Alicia Puente (July 2015). Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law.
- ↑ Facial Recognition is getting really accurate, and we have not prepared (11 October 2016).
- ↑ This creepy facial recognition app is taking Russia by storm (18 May 2016).
- ↑ What Facial Recognition Technology Means for Privacy and Civil Liberties: Hearing before the Subcommittee on Privacy, Technology and the Law of the Committee on the Judiciary, United States Senate, One Hundred Twelfth Congress, Second Session, July 18, 2012
- ↑ Privacy Multistakeholder Process: Facial Recognition Technology.
- ↑ McCabe, David (2015-06-16). Facial recognition talks break down as privacy advocates withdraw.
- ↑ Weaver, Dustin (2016-03-18). Business eyes facial recognition guidelines.
- ↑ Facial Recognition Violates Human Rights, Court Rules. Retrieved on 13 August 2020.
- ↑ Martin Koste. "A Look Into Facebook's Potential To Recognize Anybody's Face", 28 October 2013.
- ↑ Facebook Keeps Getting Sued Over Face-Recognition Software, And Privacy Groups Say We Should Be Paying More Attention (2015-09-03).
- ↑ Herra, Dana. Judge tosses Illinois privacy law class action vs Facebook over photo tagging; California cases still pending.
- ↑ "Singel-Minded: Anatomy of a Backlash, or How Facebook Got an 'F' for Facial Recognition", WIRED. (in en-US)
- ↑ "Facebook Can Now Find Your Face, Even When It's Not Tagged", WIRED. (in en-US)
- ↑ Agence France-Presse in Beijing (2017-09-01). From ale to jail: facial recognition catches criminals at China beer festival (en).
- ↑ Police use facial recognition technology to detect wanted criminals during beer festival in Chinese city of Qingdao | OpenGovAsia (en).
- ↑ "Photo Algorithms ID White Men Fine—Black Women, Not So Much", WIRED. (in en-US)
- ↑ "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification", 2018, pp. 77–91. Retrieved on 8 March 2018. (in English)
- ↑ Report on the Evaluation of 2D Still-Image Face Recognition Algorithms (August 24, 2011).
- ↑ Buranyi, Stephen (2017-08-08). Rise of the racist robots – how AI is learning all our worst impulses (en).
- ↑ 39.0 39.1 39.2 Breland, Ali (2017-12-04). How white engineers built racist code – and why it's dangerous for black people (en).
- ↑ Scottish government proposes immediate deletion of expired facial images in new biometrics code
- ↑ Code of Practice on the acquisition, use, retention and disposal of biometric data for justice and community safety purposes in Scotland
- ↑ Murgia, Madhumita (2019-08-12). London's King's Cross uses facial recognition in security cameras (en-GB).
- ↑ "King's Cross facial recognition investigated", BBC News, 15 August 2019. (in en-GB)
- ↑ Cellan-Jones, Rory. "Tech Tent: Is your face on a watch list?", BBC News, 16 August 2019. (in en-GB)
- ↑ Sabbagh, Dan. "Facial recognition technology scrapped at King's Cross site", The Guardian, 2 September 2019. (in en-GB)
- ↑ "Facial recognition test run on unwitting shoppers", BBC News, 16 August 2019. (in en-GB)
- ↑ GDPR News (2019-09-01). Unlawful Use of Facial Recognition Technology Lead to GDPR Penalty in Sweden (en-US).
- ↑ "San Francisco Bans Facial Recognition Technology", The New York Times, 2019-05-14. (in en-US)
- ↑ 49.0 49.1 "San Francisco Bans Agency Use of Facial Recognition Tech", Wired. (in en)
- ↑ Somerville Bans Government Use Of Facial Recognition Tech (en).
- ↑ 51.0 51.1 Somerville City Council passes facial recognition ban - The Boston Globe (en-US).
- ↑ Haskins, Caroline (2019-07-17). Oakland Becomes Third U.S. City to Ban Facial Recognition (en).
- ↑ "EU drops idea of facial recognition ban in public areas: paper", Reuters, 29 January 2020. Retrieved on 12 April 2020. (in en)
- ↑ "Facial recognition: EU considers ban", BBC News, 17 January 2020. Retrieved on 12 April 2020.
- ↑ Boston mayor OKs ban on facial recognition tech
- ↑ IBM bows out of facial recognition market
- ↑ Boston mayor OKs ban on facial recognition tech
- ↑ Metz, Rachel (CNN Business). Portland passes broadest facial recognition ban in the US.
- Tucker, Jennifer. "How facial recognition technology came to be", 23 November 2014.
Further reading
- "Near infrared face recognition by combining Zernike moments and undecimated discrete wavelet transform", pp. 13–27.
- "The Face Detection Algorithm Set to Revolutionize Image Search" (Feb. 2015), MIT Technology Review
- Perpetual Line Up: Unregulated Police Face Recognition in America. Center on Privacy & Technology at Georgetown Law (18 October 2016).
- "Facial Recognition Software 'Sounds Like Science Fiction,' but May Affect Half of Americans", As It Happens, Canadian Broadcasting Corporation, 20 October 2016. Interview with Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law and co-author of Perpetual Line Up: Unregulated Police Face Recognition in America.
External links
- Facial recognition system at Wikipedia.
Articles
- "A Photometric Stereo Approach to Face Recognition". The University of the West of England. http://www1.uwe.ac.uk/et/mvl/projects/facerecognition.aspx