Facial Recognition Will Outlast COVID-19

The COVID-19 pandemic has led to an unprecedented spread of facial coverings while simultaneously accelerating the adoption of digital surveillance tools, including facial recognition systems (FRS). However, whereas facemasks will eventually disappear again, FRS are not only here to stay but poised to keep expanding. Consequently, governments should address the issues of bias and robustness by testing and certifying FRS.

by Alejandra Bringas and Kevin Kohler
Image: Surveillance. Courtesy of StockSnap/Pixabay

Faces are the most common patterns humans use to identify one another in everyday life. Consequently, faces are part of governmental identity documents, such as passports and driver’s licenses, as well as more informal ones, such as membership cards and social media profiles. In Western societies, there is an implicit agreement to reveal our face as a prerequisite to social relationships. However, because the face is such a central feature of identity and identification, we want to control it and its use. Hence, most of us would find it unacceptable if a stranger took a photograph of our face in the street without any further explanation.

FRS detect human faces in image data and match them with facial structures in a database. Facial authentication or verification matches a captured face against the stored unique facial characteristics of an individual to verify that a person is who they claim to be (1:1). Facial recognition or identification uniquely identifies a person based on sensor data of facial contours (1:N). FRS have made significant progress in recent years due to the adoption of deep neural networks. More specifically, the lowest false negative identification rate in tests by the National Institute of Standards and Technology (NIST) has decreased about 27-fold since 2014. At the same time, the global number of surveillance cameras is growing by about 300,000 per day and is expected to surpass 1 billion in 2021.
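To make the 1:1 versus 1:N distinction concrete, here is a minimal sketch of both operations using cosine similarity over face embeddings. Everything in it (the 128-dimensional vectors, the 0.6 decision threshold, and the function names) is an illustrative assumption, not any particular vendor's implementation:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 verification: does the probe match one claimed identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """1:N identification: search a whole gallery for the best match,
    returning an identity only if it clears the threshold."""
    best_id, best_score = None, threshold
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Illustrative usage with random 128-dimensional embeddings.
rng = np.random.default_rng(0)
gallery = {name: rng.normal(size=128) for name in ("alice", "bob")}
probe = gallery["alice"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(verify(probe, gallery["alice"]))  # True (same identity)
print(identify(probe, gallery))         # "alice"
```

In practice, the threshold trades off false matches against false non-matches and is tuned on labeled benchmark data rather than chosen by hand.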

The Visual Cortex of Leviathan

As James Scott argues in Seeing Like a State, governments have a long history of trying to make their citizenry more legible to them. Even last names, for example, were originally imposed on people to make it easier to collect taxes and draft individuals into the armed forces. FRS can be seen as a continuation of this trend and framed as part of the metaphorical “visual cortex” of the state, allowing it to make sense of the exponentially growing input from its “eyes” and turning Hobbes’ Leviathan increasingly into Argus Panoptes, the many-eyed giant. Unlike other biometric technologies, such as DNA tests or fingerprint scanners, FRS require neither the active participation nor the consent of the subject. They are non-intrusive, contact-free, and relatively inexpensive, which makes them an increasingly effective and widespread surveillance tool.

Pushing back against this, protesters against governments in places such as Hong Kong and Chile in 2019 not only masked their faces but actively tried to interfere with and disable surveillance cameras in order to protect their anonymity. In response, Hong Kong’s Chief Executive Carrie Lam invoked emergency powers in October 2019 to prohibit all kinds of facial coverings at public gatherings. Violations of the ban were punished with a hefty fine and up to a year in prison.

Pandemic Adaptation and Adoption

By early 2020, following the spread of COVID-19, nearly everyone in Hong Kong wore facemasks and, eventually, in July, the government made their use mandatory in public places. This 180-degree turn is emblematic of the worldwide shift towards masking faces during the pandemic. Accordingly, the symbolic meaning of masks was also temporarily reversed, turning an anonymity tool worn by revolutionaries into a symbol of conformism. However, this reversal is unlikely to last for long, and it has not stopped governments from using FRS to identify citizens in public spaces.

The false non-match rate of pre-COVID-19 face verification algorithms is on average around one order of magnitude higher on masked faces (ca. 5 percent) than on unmasked ones (ca. 0.3 percent). However, suppliers such as SenseTime, Baidu, or FaceGo have been very quick to retrain their algorithms on masked faces. As governments have adopted a large variety of digital tools to monitor and enforce compliance with social distancing and quarantine rules, they have also turned to facial recognition. For example, the city of Shanghai has even installed the technology at the gates and in the elevators of residential buildings to reduce contact with shared physical surfaces, and FRS paired with temperature checks are used widely across China. Furthermore, several countries have used FRS to monitor quarantined citizens, either by identifying their faces in citywide CCTV networks, as in Moscow, or by verifying via their smartphones that they are at home, as in Poland and India.
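As a rough illustration of how such error rates are measured, the sketch below computes the false non-match rate (the share of genuine, same-person comparisons rejected at a fixed decision threshold) on two simulated score distributions. The distributions and their parameters are invented purely to land in the ballpark of the 0.3 and 5 percent figures cited above; a real evaluation would score a labeled benchmark such as NIST's masked-face test sets:

```python
import numpy as np

def false_non_match_rate(genuine_scores: np.ndarray, threshold: float) -> float:
    """Share of genuine (same-person) comparisons that fall below the
    decision threshold, i.e. are wrongly rejected."""
    return float(np.mean(genuine_scores < threshold))

# Hypothetical similarity scores for genuine comparison pairs. The means
# and spreads are chosen only so the resulting rates roughly match the
# ~0.3% (unmasked) and ~5% (masked) figures in the text.
rng = np.random.default_rng(1)
unmasked = rng.normal(loc=0.82, scale=0.08, size=10_000)
masked = rng.normal(loc=0.70, scale=0.06, size=10_000)  # occlusion degrades scores

threshold = 0.6  # operating point chosen on unmasked data
print(f"FNMR unmasked: {false_non_match_rate(unmasked, threshold):.3%}")
print(f"FNMR masked:   {false_non_match_rate(masked, threshold):.3%}")
```

Retraining on masked faces amounts to shifting the masked score distribution back up so that fewer genuine comparisons fall below the operating threshold.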

Tech Backlash

Whereas the pandemic has led to calls to double down on FRS and surveillance systems in China, there has been a massive backlash against the technology in the US in the wake of COVID-19 and the George Floyd protests. Specifically, citizens criticized the use of FRS for contact tracing at the protests, its use for border control, and its supposedly discriminatory effects in law enforcement due to a lower ability to identify women and people of color. Subsequently, in June 2020, IBM decided to abandon its research in this field, whereas Microsoft and Amazon halted their collaboration with law enforcement. The EU had mulled a moratorium on FRS as well, but eventually recommended in its AI whitepaper that FRS “should be only used when subject to adequate safeguards” as “by analyzing large amounts of data and identifying links among them, AI may be used to de-anonymize data […] creating new personal data protection risks”.

In democracies, such an approach is preferable to a wholesale moratorium, as FRS unquestionably have applications that create social value, such as finding missing children. At the same time, there is a need to build up capacities to test, and possibly certify, FRS in terms of accuracy, including for different demographics. This would help the public trust that the technology works properly and would show that gender and racial biases are less pronounced and more solvable than the public discourse on the subject might indicate.
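What such disaggregated testing could look like in its simplest form is sketched below: the same false non-match rate is reported separately per demographic group, together with a crude differential-impact ratio. The group names, score distributions, and threshold are hypothetical placeholders; an actual certification scheme would rely on curated, labeled datasets for each demographic:

```python
import numpy as np

def fnmr(genuine_scores: np.ndarray, threshold: float) -> float:
    """False non-match rate: genuine comparisons wrongly rejected."""
    return float(np.mean(genuine_scores < threshold))

def demographic_audit(scores_by_group: dict, threshold: float) -> dict:
    """Report the error rate separately for each demographic group, the
    kind of disaggregated metric a certification scheme could require."""
    return {group: fnmr(scores, threshold)
            for group, scores in scores_by_group.items()}

# Hypothetical per-group genuine-score distributions standing in for
# real benchmark data; the gap between groups is invented.
rng = np.random.default_rng(2)
groups = {
    "group_a": rng.normal(0.82, 0.08, 10_000),
    "group_b": rng.normal(0.78, 0.09, 10_000),
}
rates = demographic_audit(groups, threshold=0.6)
for group, rate in rates.items():
    print(f"{group}: FNMR = {rate:.3%}")
print(f"differential impact ratio: {max(rates.values()) / min(rates.values()):.1f}x")
```

A certification regime could then require, for example, that this ratio stay below a fixed bound at the system's advertised operating point.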

The larger and more challenging questions arise from the new possibilities that a loss of anonymity in public spaces gives to different actors. It is easy to see how a technology that helps to track individuals in public spaces in real time could favor the centralization of power and allow for more stringent enforcement of norms and laws, as well as the targeted surveillance and suppression of dissenters. This is of course particularly problematic in places that have no rule of law or respect for human rights. For example, there are still countries with massive discrimination against homosexuals, including capital punishment. Hypothetically, they might use FRS to predict the sexual orientation of individuals. In an example that is already reality, China has specifically trained FRS to detect members of its persecuted Uighur minority in public spaces, “ushering in a new era of automated racism”.

Consequently, legally binding rules are needed to protect citizens’ rights and avoid disproportionate surveillance. Specifically, there should be transparency regarding how and for what purpose authorities use FRS and CCTV networks. However, the domestic and international debates on which norms and laws are needed to keep public authorities and private companies accountable will take time. Hence, in democracies, it is also up to civil society to check that the FRS now being developed and deployed – in part under the extraordinary circumstances of controlling a global pandemic – remain proportionate and necessary in a post-COVID era. Unfortunately, the autocratic systems that embrace this technology already have little to no civil society, and those individuals who remain part of it might soon wake up to an FRS installed at their door.

About the authors

Alejandra Bringas holds an International Master in Security, Intelligence and Strategic Studies from the University of Glasgow, Dublin City University, and Charles University.

Kevin Kohler is a researcher in the Cyber Defense Team at the Center for Security Studies (CSS) at ETH Zurich.

This blog belongs to the CSS’ coronavirus blog series, which forms a part of the center’s analysis of the security policy implications of the coronavirus crisis. See the CSS special theme page on the coronavirus for more.
