Amazon has defended its facial-recognition tool, Rekognition, against claims of racial and gender bias, following a study by researchers at the Massachusetts Institute of Technology.
The researchers compared tools from five companies, including Microsoft and IBM.
While none of the tools was 100% accurate, the study found Amazon's Rekognition performed worst among those tested at recognising women with darker skin.
Amazon said the study was “misleading”.
The study found that Amazon's tool had an error rate of 31% when identifying the gender of dark-skinned women in images.
This compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.
By contrast, Amazon, Microsoft and Kairos all correctly identified the gender of light-skinned men 100% of the time.
The tools work by returning a probability score indicating how confident they are in each prediction.
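As a rough illustration of how that score surfaces in practice, the sketch below queries Rekognition through Amazon's boto3 SDK for Python; the S3 bucket and file names are hypothetical placeholders, not details from the study.

import boto3

# A minimal sketch, assuming an image stored in a hypothetical S3 bucket.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "portrait.jpg"}},
    Attributes=["ALL"],  # request the full set of facial attributes
)

for face in response["FaceDetails"]:
    gender = face["Gender"]
    # The API returns a predicted label plus a confidence score (0-100),
    # not a guaranteed fact; this is the probability score at issue.
    print(f"Predicted gender: {gender['Value']} "
          f"(confidence: {gender['Confidence']:.1f}%)")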
Facial-recognition tools are trained on huge datasets of hundreds of thousands of images.
But there is concern that many of these datasets are not sufficiently diverse to enable the algorithms to learn to correctly identify non-white faces.
Clients of Rekognition include a company that provides tools for US law enforcement, a genealogy service and a Japanese newspaper, according to the Amazon Web Services website.
(Concise News)