More facial recognition technology flaws uncovered by researchers.
Concerns about facial recognition flaws are nothing new, but fresh research conducted at the University of Colorado Boulder has found that facial recognition gender discrimination is also an issue, as these systems are failing to identify transgender people.
The test found that facial recognition systems misidentified trans men as women 38% of the time.
The researchers tested facial recognition systems from tech behemoths Microsoft, IBM, Clarifai and Amazon. Using photographs of trans men, they found that the systems misidentified the men as women 38% of the time, reports Reuters.
By comparison, cisgender men and women (people whose gender identity matches the sex they were assigned at birth) were correctly identified 97.6% and 98.3% of the time, respectively.
Additionally, the software reportedly failed 100% of the time to recognize people who do not define themselves as male or female, such as individuals who identify as non-binary, genderqueer or agender.
The facial recognition gender discrimination findings reveal that even the latest technology classifies gender into only two set categories.
Facial recognition gender discrimination reveals the technology is still very limited.
According to Morgan Klaus Scheuerman, the report’s lead author, the results highlight that even the most up-to-date facial recognition technology views gender in only two set categories.
“While there are many different types of people out there,” Scheuerman says, “these systems have an extremely limited view of what gender looks like.”
This flaw, along with others such as gender and racial bias, is deeply concerning because the technology is increasingly being incorporated into everyday life, including airport security.
When these systems misidentify people, the consequences can be actively harmful, particularly for trans people, who are often subjected to harassment or invasive body searches when their ID fails to match their gender.
Moreover, software that excludes trans and non-binary individuals may prove discriminatory, as these individuals are essentially invisible to the technology.
Commenting on the facial recognition gender discrimination findings, senior author Jed Brubaker, an assistant professor of Information Science, said that although society’s vision and cultural understanding of gender have evolved, the “algorithms driving our technological future have not.” Brubaker calls this “deeply problematic.”