Documents made available in response to an FOI request from MedConfidential, a campaigning group for confidentiality and consent, show that the UK Home Office went ahead with the facial recognition system despite its own research throwing up major issues with it.
User research carried out with a wide range of ethnic groups identified that people with very light or very dark skin found it difficult to provide an acceptable passport photograph.
The UK government concluded that despite the failings, the “overall performance was judged sufficient” and decided to use the automated photo checker regardless, New Scientist reports.
Since deployment, some users have reported a number of issues with it. In September, Joshua Bada, a black sports coach, complained that the system mistook his lips for an open mouth.
Cat Hallam, a black technology officer at Keele University, endured a similar experience in April, claiming that the technology wrongly suggested her eyes were closed and her mouth was open.
“What is very disheartening about all of this is [that] they were aware of it,” she said.
Face detection software is usually tested on thousands of images. One way bias can take hold is when the training data is too small, or insufficiently diverse, to represent the groups the system will actually be used on.
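The effect is easy to demonstrate with a toy simulation. The sketch below is purely hypothetical and has no connection to the Home Office system: it models a “detector” that accepts a photo when a quality score clears a threshold, draws scores for two groups from slightly different distributions (standing in for the way image statistics can vary with skin tone), and tunes the threshold only on the over-represented group. The under-represented group then suffers a far higher rejection rate, even though the threshold looks fine on the data it was tuned on.

```python
import random

random.seed(0)  # deterministic run for illustration

def sample_scores(mean, n):
    """Draw n hypothetical photo-quality scores from a normal distribution."""
    return [random.gauss(mean, 1.0) for _ in range(n)]

# Group A dominates the tuning data; group B is under-represented and its
# scores skew lower (a stand-in for differing image statistics).
group_a = sample_scores(mean=2.0, n=900)
group_b = sample_scores(mean=1.0, n=100)

# Tune the acceptance threshold on group A alone: accept its top 95%.
threshold = sorted(group_a)[int(0.05 * len(group_a))]

def rejection_rate(scores):
    """Fraction of photos falling below the tuned threshold."""
    return sum(s < threshold for s in scores) / len(scores)

print(f"group A rejection rate: {rejection_rate(group_a):.1%}")
print(f"group B rejection rate: {rejection_rate(group_b):.1%}")
```

Group A is rejected at roughly the 5% rate the threshold was tuned for, while group B's rejection rate is several times higher: a system can pass its own headline test and still fail the people its tuning data left out.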
Facial recognition technology has a history of failing to recognize individuals with certain skin tones. Google famously had to apologize in 2015 when its photos app tagged a black couple as gorillas.