Google executive warns of face ID bias
A top Google executive has warned that facial recognition technology still suffers from a "lack of diversity" and "inherent bias". Diane Greene, who heads Google's cloud-computing division, was commenting after rival Amazon's software wrongly identified 28 members of Congress as police suspects, disproportionately people of color.
Google, which has not opened its facial recognition technology to public use, is working on gathering large amounts of data to improve its reliability, Ms Greene said.
However, she declined to discuss the company's controversial work with the military.
"Bad things happen when we talk about Maven," Ms Greene said, referring to the soon-to-be-abandoned project to develop artificial intelligence technology for drones with the US military.
Following considerable staff pressure, including resignations, Google said it would not renew its contract with the Pentagon after it ends some time in 2019.
The firm has not commented further on that agreement, other than to publish a set of "AI principles" stating that it will not use artificial intelligence or machine learning to make weapons.
'Really thinking deeply'
On facial recognition, there has been considerable concern among Silicon Valley workers and civil rights groups about applications of the emerging technology, especially when it comes to law enforcement. Amazon's Rekognition software, which allows customers to use Amazon's AI technology for facial recognition, is being used by at least two police forces in the US.
There are serious misgivings about the technology's accuracy and readiness; it has already seen extensive, controversial use in China.
In the US, it was the American Civil Liberties Union (ACLU) that ran the test in which members of Congress were wrongly identified, publishing its findings on Thursday.
Amazon disputed the ACLU's findings about its technology, saying the group had used the wrong settings for its test.
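The "settings" at issue are chiefly the similarity threshold a caller picks when asking Rekognition to match faces: a lower bar returns more, and weaker, candidate matches. The following is a minimal sketch of how that one parameter changes the results of Rekognition's CompareFaces API, not a reconstruction of the ACLU's actual methodology; the image file names are illustrative, and the 99% figure reflects the higher threshold Amazon has said is appropriate for sensitive uses.

```python
# Sketch only: shows how the SimilarityThreshold setting filters what
# Amazon Rekognition's CompareFaces returns. Requires AWS credentials
# with Rekognition access; the image paths below are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def load_bytes(path):
    """Read an image file as raw bytes for the Rekognition API."""
    with open(path, "rb") as f:
        return f.read()

source = load_bytes("portrait_photo.jpg")        # hypothetical probe image
target = load_bytes("database_photo.jpg")        # hypothetical gallery image

# 80 is the service default threshold; 99 is a far stricter setting.
for threshold in (80, 99):
    response = rekognition.compare_faces(
        SourceImage={"Bytes": source},
        TargetImage={"Bytes": target},
        SimilarityThreshold=threshold,
    )
    matches = response["FaceMatches"]
    print(f"Threshold {threshold}%: {len(matches)} match(es)",
          [round(m["Similarity"], 1) for m in matches])
```

A candidate pair that clears the 80% bar can disappear entirely at 99%, which is the crux of Amazon's objection: it argues the chosen test settings, rather than the underlying model alone, drove the false matches.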
Ms Greene said that while Google uses facial recognition to help identify friends in pictures, the underlying technology is not open to public use.
"We need to be really careful about using such technology," she said.
"We are really thinking deeply. The human side of AI - it does not have diversity and there will be some underlying bias in the data, so everyone is working to understand it."
She added: "I think everyone wants to do the right thing. I am sure Amazon wants to do the right thing too, but this is a new technology, and it is a very powerful technology."
Google's own image recognition software has been badly wrong in the past. In 2015, it identified a black couple as "gorillas". The firm apologized.
Two members of Congress have written to Amazon chief executive Jeff Bezos asking to discuss their concerns about the company's system.