A US government study released this week found that facial recognition technology performs worse on certain demographic groups, including Black and Asian people. The National Institute of Standards and Technology (NIST) study found that, in a particular type of database search, many facial recognition algorithms falsely identified African American and Asian faces at rates 10 to 100 times higher than for Caucasian faces.
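To make the "10 to 100 times higher" figure concrete, the sketch below shows how a per-group false positive rate and the disparity ratio between groups can be computed. Everything here is invented for illustration: the group labels, similarity scores, threshold, and `false_positive_rate` helper are assumptions, not NIST's actual code, data, or methodology.

```python
# Illustrative sketch only: scores and threshold below are made up;
# NIST's real evaluation pipeline is far more involved.

def false_positive_rate(impostor_scores, threshold):
    """Fraction of impostor comparisons (photos of two different
    people) whose similarity score clears the match threshold."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# Hypothetical impostor similarity scores for two demographic groups.
group_a = [0.31, 0.12, 0.55, 0.08, 0.91, 0.22, 0.47, 0.05, 0.63, 0.18]
group_b = [0.11, 0.02, 0.61, 0.21, 0.05, 0.14, 0.03, 0.07, 0.19, 0.01]

threshold = 0.5
fpr_a = false_positive_rate(group_a, threshold)  # 3 of 10 above threshold
fpr_b = false_positive_rate(group_b, threshold)  # 1 of 10 above threshold

# The disparity reported in studies like NIST's is the ratio between
# per-group false positive rates.
print(f"group A FPR: {fpr_a:.2f}, group B FPR: {fpr_b:.2f}, "
      f"ratio: {fpr_a / fpr_b:.1f}x")
```

A ratio of 3x in this toy data would mean group A is misidentified three times as often as group B at the same threshold; the study reported ratios of 10x to 100x for some algorithms and groups.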
The study also found that African American women were more likely to be misidentified in algorithmic matching. This failure is dangerous because the technique is used to identify suspects in criminal investigations.
For the report, NIST tested 189 algorithms from 99 developers, excluding companies such as Amazon that did not submit their technology for review. The research also differed from typical evaluations: NIST studied the algorithms in isolation, without the developers' cloud infrastructure or proprietary training data.
For example, the analysis of Chinese startup SenseTime's AI found "high rates of mismatch for all comparisons." SenseTime's algorithm produced a false positive more than 10% of the time when analyzing the faces of Somali men.
Microsoft's algorithm produced nearly 10 times more false positives for Black women than for Black men, and it also performed poorly on black-and-white photos of men. After the results were published, Microsoft said it would review the report.
US House Homeland Security Committee chairman Bennie Thompson said the findings about the technology were "worse than feared," noting that facial recognition is being widely deployed by customs officials.