【Amazon's facial recognition software accused of gender and racial bias】

Amazon's controversial facial recognition software, Rekognition, is facing a new round of criticism. A new study from the MIT Media Lab found that the software may have gender and racial biases: it struggled to distinguish men from women and to identify darker-skinned women, while it made no errors when identifying lighter-skinned men. The software has been widely criticized by human rights groups such as the ACLU, and employees and investors have urged Amazon to stop selling it to police.

Amazon's controversial facial recognition software can't tell the difference between men and women or recognize dark-skinned females, MIT study finds

  • Scientists found Rekognition misidentified women as men, especially darker-skinned women
  • By comparison, it made no errors when it tried to identify pale-skinned men
  • The software has been widely criticized by human rights groups like the ACLU
  • Employees and investors have urged Amazon to stop selling software to police

    Amazon's controversial facial recognition software, Rekognition, is facing renewed criticism.

    A new study from the MIT Media Lab found that Rekognition may have gender and racial biases.

    In particular, the software performed worse when identifying the gender of women, and worse still for darker-skinned women.

    When the software was presented with a number of female faces, it incorrectly labeled 19 percent of them as male.

    But the outcome was much worse for darker-skinned women.

    Of the darker-skinned women it was presented with, Rekognition incorrectly labeled 31 percent as men.

    By comparison, Rekognition made no errors in its attempts to identify pale-skinned men.

    MIT found that similar software developed by IBM and Microsoft performed better than Rekognition.

    Specifically, Microsoft incorrectly labeled 1.5 percent of darker-skinned women as men.
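    The figures above are per-group error rates. As a rough illustration of how such rates are calculated (this is not the study's code, and the records below are invented placeholders), predictions can be grouped by true gender and skin-tone group and the wrong gender labels divided by each group's total:

```python
# Illustrative only -- not the MIT study's code or data. Shows how subgroup
# misclassification rates such as the 19% and 31% figures above are computed.
from collections import defaultdict

# Each record: (true_gender, predicted_gender, skin_tone_group) -- made-up placeholders.
predictions = [
    ("female", "male", "darker"),
    ("female", "female", "lighter"),
    ("male", "male", "lighter"),
    ("female", "male", "darker"),
    ("female", "female", "darker"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for true_gender, predicted_gender, group in predictions:
    key = (true_gender, group)
    totals[key] += 1
    if predicted_gender != true_gender:
        errors[key] += 1

for key in sorted(totals):
    rate = 100.0 * errors[key] / totals[key]
    print(f"{key}: {rate:.1f}% misclassified")
```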


    MIT researcher Joy Buolamwini conducted a similar study last February that found facial analysis software created by IBM, Microsoft and Chinese firm Megvii struggled with racial and gender biases.

    The study generated major backlash for the companies, with Microsoft and IBM pledging to retool their software so that it would be more accurate.

    Amazon, meanwhile, hasn't made any changes following the report.

    In a statement to the Verge, the company said the researchers weren't using the latest version of Rekognition.

    'It's not possible to draw a conclusion on the accuracy of facial recognition for any use case - including law enforcement - based on results obtained using facial analysis,' Matt Wood, general manager of deep learning and AI at Amazon Web Services, said in a statement.

    Experts, as well as the authors of the report, warn that should facial recognition software continue to present gender and racial biases, it could lead to racial profiling and other injustices.

    Amazon Rekognition gives software applications the power to detect objects, scenes and faces within images.

    It was built with computer vision, which lets AI programs analyse still and video images.

    AI systems rely on artificial neural networks, which try to simulate the way the brain works in order to learn.

    They can be trained to recognise patterns in information - including speech, text data, or visual images.

    Rekognition uses deep learning neural network models to analyse billions of images daily.

    Updates since it was created even allow the technology to guess a person's age.

    In November 2017, its creators announced that Rekognition can now detect and recognise text in images, perform real-time face recognition across tens of millions of faces and detect up to 100 faces in challenging crowded photos.
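    As a rough illustration of how the capabilities described above are exposed to developers, the sketch below calls the Rekognition APIs through the AWS SDK for Python (boto3). The bucket and file names are placeholders rather than details from the article, and the printed fields are only a small subset of the service's response.

```python
# Minimal sketch of calling Rekognition via boto3. AWS credentials and region
# must be configured separately; "example-bucket"/"example.jpg" are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
image = {"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}}

# Face analysis: one entry per detected face, including the gender and
# age-range estimates discussed in the article.
faces = rekognition.detect_faces(Image=image, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    gender = face["Gender"]      # {'Value': 'Male'|'Female', 'Confidence': ...}
    age = face["AgeRange"]       # {'Low': ..., 'High': ...}
    print(gender["Value"], gender["Confidence"], age)

# Object/scene labels and in-image text detection, also mentioned above.
labels = rekognition.detect_labels(Image=image, MaxLabels=10)
text = rekognition.detect_text(Image=image)
```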

    Buolamwini and Deborah Raji, the authors of the study, believe more needs to be done than just fixing biases in the software in order to ensure they're used fairly.

    'Consequently, the potential for weaponization and abuse of facial analysis technologies cannot be ignored nor the threats to privacy or breaches of civil liberties diminished even as accuracy disparities decrease,' they wrote.

    'More extensive explorations of policy, corporate practice and ethical guidelines is thus needed to ensure vulnerable and marginalized populations are protected and not harmed as this technology evolves.'

    Amazon has faced repeated calls for it to stop selling Rekognition to police.

    The FBI is believed to be testing the controversial facial recognition technology, while Amazon was found to be selling the service to law enforcement agencies in the city of Orlando and Washington County, Oregon.

    It's also believed to have proposed the technology to U.S. Immigration and Customs Enforcement (ICE).

    Earlier this month, Amazon shareholders penned a letter to CEO Jeff Bezos demanding he stop selling the company's controversial facial recognition technology to police.

    The shareholder proposal calls for Amazon to stop offering the product, called Rekognition, to government agencies until it undergoes a civil and human rights review.

    It follows similar criticisms voiced by 450 Amazon employees, as well as civil liberties groups and members of Congress, over the past several months.

    The tech giant has repeatedly drawn the ire of the American Civil Liberties Union (ACLU) and other privacy advocates over the tool.

    First released in 2016, Rekognition has since been sold cheaply to several police departments around the US, with the Washington County Sheriff's Office in Oregon listed as one of several customers.

    The ACLU and other organizations are now calling on Amazon to stop marketing the product to law enforcement, saying they could use the technology to 'easily build a system to automate the identification and tracking of anyone'.

    Police appear to be using Rekognition to check photographs of unidentified suspects against a database of mug shots from the county jail.

    In a study titled Gender Shades, a team of researchers discovered that popular facial recognition services from Microsoft, IBM and Face++ can discriminate based on gender and race

    The data set was made up of 1,270 photos of parliamentarians from three African nations and three Nordic countries where women held a significant share of parliamentary seats

    The faces were selected to represent a broad range of human skin tones, using a labeling system developed by dermatologists, called the Fitzpatrick scale

    All three services worked better on white, male faces and had the highest error rates on dark-skinned males and females

    Microsoft misidentified darker-skinned females 21% of the time, while IBM and Face++ had error rates of roughly 35% for darker-skinned females

    The study tried to find out whether Microsoft, IBM and Face++'s facial recognition systems were discriminating based on gender and race. Researchers found that Microsoft's systems were unable to correctly identify darker-skinned females 21% of the time, while IBM and Face++ had an error rate of about 35%


    Original article: https://www.dailymail.co.uk/sciencetech/article-6633569/Amazons-facial-recognition-software-mistakes-women-men-darker-skinned-women-men.html


