Some of those living with facial differences tell WIRED they have undergone multiple surgeries and experienced stigma for their entire lives, a stigma now echoed by the technology they are forced to interact with. They say they have been unable to access public services because facial verification failed, while others have struggled to access financial services. Social media filters and face-unlocking systems on phones often won't work, they say.
“The facial difference community is constantly overlooked,” says Phyllida Swift, the CEO of Face Equality International (FEI), an umbrella group representing facial difference and disfigurement charities and organizations. More than 100 million people worldwide live with facial disfigurements, FEI estimates. People with facial differences have experienced problems with airport passport gates, photo apps, social media video filters, background blurring on video calls, and more, according to FEI’s research. “In many countries, facial recognition is increasingly a part of everyday life, but this technology is failing our community,” Nikki Lilly, a patron for FEI, told a United Nations meeting in March.
Access Denied
From phones to hotel rooms, your face increasingly acts as a digital key. Over the past decade, rapid machine learning and AI advancements have led to the creation of a range of face recognition technologies—meaning that more than ever before, your appearance can be used as a digital identifier. Police have widely deployed face recognition systems, which have frequently been found to be inaccurate and biased against Asian and Black people, while the wider world of face checking has seen government services, anti-fraud systems, and financial institutions using AI to complete identity checks. Most recently, social media and porn websites have adopted face scanning as part of age verification measures.
These “authentication” face checks can take multiple forms. Selfies can be automatically compared to existing ID documents, while liveness tests can require you to take a short video to show you are real—not a fraudster holding a printed photo to the camera. Broadly speaking, these biometric systems often measure your facial features—such as the distance between your eyes, or the size of your jaw—to create “faceprints.” While these kinds of verification technologies may be effective for a lot of people, they may fail to detect people with facial differences. The underlying machine learning technology may not be trained on datasets with a variety of faces, for example.
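To make the “faceprint” idea concrete, here is a minimal, purely illustrative sketch of how a system might reduce facial landmarks to a vector of pairwise distances and compare two faces. The landmark names, measurements, and matching threshold are assumptions invented for this example; real systems use learned embeddings and far richer features, and no vendor's actual algorithm is shown here.

```python
# Hypothetical sketch: a "faceprint" as pairwise distances between landmarks.
# Landmark names, coordinates, and the tolerance value are illustrative
# assumptions, not any real product's algorithm.
import math


def faceprint(landmarks: dict[str, tuple[float, float]]) -> list[float]:
    """Reduce facial landmarks (name -> (x, y)) to pairwise distances."""
    points = [landmarks[name] for name in sorted(landmarks)]
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]


def same_person(a: dict, b: dict, tolerance: float = 5.0) -> bool:
    """Match if the two distance vectors are close (Euclidean norm)."""
    return math.dist(faceprint(a), faceprint(b)) <= tolerance
```

Because the sketch compares only relative distances, it is invariant to where the face sits in the frame, but it also shows the failure mode the article describes: a face whose geometry falls outside whatever the system was tuned on simply will not match.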
“The case of Face Equality International—of people who have different faces—is a really important canary in the coal mine of what can go wrong when these systems don’t work,” says Greta Byrum, the founder of technology consultancy firm Present Moment Enterprises, which focuses on the social impact of technological systems and has provided some pro bono work to FEI. “We’re seeing facial recognition technologies becoming one of those hammers to which everything looks like a nail,” Byrum says.