Facial Recognition Designed To Detect Around Face Masks Is Failing, Study Finds
Many facial recognition companies have claimed they can identify people with pinpoint accuracy even while they’re wearing face masks, but the latest results from a study show that the coverings are dramatically increasing error rates.
In an update released Tuesday, the US National Institute of Standards and Technology (NIST) examined 41 facial recognition algorithms submitted after the COVID-19 pandemic was declared in mid-March. Many of these algorithms were designed with face masks in mind, and their developers claimed they could still accurately identify people even when half of the face was covered.

In July, NIST released a report noting that face masks were thwarting regular facial recognition algorithms, with error rates ranging from 5% to 50%. NIST, widely considered the leading authority on facial recognition accuracy testing, expected the newer algorithms to improve at identifying people in face masks. That day has yet to come: every algorithm tested showed at least a marginal increase in error rate once masks came into the picture. While some remained relatively accurate overall (Chinese facial recognition company Dahua's algorithm saw its error rate rise from 0.3% without masks to 6% with them), others saw error rates climb as high as 99%.
Rank One, a facial recognition provider used in cities like Detroit, had an error rate of 0.6% without masks that jumped to 34.5% once masks were digitally applied. In May, the company began offering "periocular recognition," which it claimed could identify people based solely on their eyes and nose. TrueFace, whose technology is used in schools and on Air Force bases, saw its algorithm's error rate go from 0.9% to 34.8% once masks were added. The company's CEO, Shaun Moore, told CNN on Aug. 12 that its researchers were working on a better algorithm for detecting people beyond masks.