Thursday, July 2, 2015

Is facial recognition technology racist? – Le Figaro

A user of the Google Photos service, an application that can detect the content of photos, complained of having been identified as a gorilla by the software.

Machines make mistakes, but some are more hurtful than others. Jacky Alcine, an American developer, had a nasty surprise while trying to sort his pictures with Google Photos. The service, developed by Google, has offered a new feature for a few weeks: it sorts images by automatically detecting certain elements, such as the presence of a landscape, an animal or an object. For example, the software can file photos of a cat in a “cat” folder. But faced with a picture of Jacky Alcine and a friend, two black people, Google Photos detected not two humans but “gorillas”. Indignant, the developer complained on Twitter. Google quickly apologized and temporarily removed the “gorilla” category from its application while it fixes the problem. “It is with this kind of mistake that we realize whom this service is made for,” said Jacky Alcine.



A complex and biased technology

Does Google Photos rely on a racist technology? “Facial recognition technology and artificial intelligence are complex,” says Loic Fretwell, CEO of SmartMeUp, a company that produces facial recognition software, putting the incident in perspective. To teach a machine to distinguish an animal, a human or a car, it is shown a very large number of photos so that it learns to spot their common and distinctive features. For example, by dint of seeing photos of cars, the computer learns that a car is generally rectangular. “In the early days of this technology, it was enough to draw two eyes and a mouth on a hand for the computer to recognize a human face,” laughs Loic Fretwell. “It works better now. But it turns out that we have much in common with animals, which can trick a machine.”
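To make the “show the machine many labelled examples” idea concrete, here is a minimal sketch of a supervised classifier in Python. Everything in it is invented for illustration, not drawn from Google Photos: the synthetic feature vectors, the “cat”/“car” labels and the choice of a simple logistic regression model all stand in for a real system's far larger models and photo collections.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each photo has already been reduced to a small feature vector.
# "cat" photos cluster around one point in feature space, "car" photos
# around another; the classifier learns what separates them.
cats = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
cars = rng.normal(loc=3.0, scale=1.0, size=(200, 8))

X = np.vstack([cats, cars])
y = ["cat"] * 200 + ["car"] * 200

clf = LogisticRegression().fit(X, y)

# A new photo whose features resemble the "car" cluster gets labelled "car".
new_photo = rng.normal(loc=3.0, scale=1.0, size=(1, 8))
print(clf.predict(new_photo))  # -> ['car']
```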

Jacky Alcine is not the first user to fall victim to these machine errors. On Twitter, netizens have pointed out other Google Photos inaccuracies, which can file children and adults, of several skin colors, under the “dog” or “cat” category. “A white person who falls victim to this kind of error will not complain,” said Loic Fretwell. “For a black person, it is much more shocking, because it reminds them of a situation of racism that they unfortunately experience every day.”

Other software has had difficulty recognizing non-white people in the past. In 2009, HP was forced to apologize after developing a webcam capable of tracking facial movements that did not detect black people. A similar story happened to the manufacturer Nikon, one of whose smart cameras displayed the warning message “Did you blink?” when taking pictures of Asian people. “It is possible that Google's engineers showed their machine more pictures of white people than of other skin colors,” says Loic Fretwell. That would make the computer, in effect, more accurate for white people than for others.

While Silicon Valley now acknowledges a serious diversity problem among its employees, this issue can affect the quality of its products. Today, 60% of Google's employees are white. If Google Photos had been developed by engineers from more diverse backgrounds, the software would have been exposed to a more varied database, making the application more accurate. Google Photos can also learn from the images submitted by its users. The service could therefore improve as non-white users upload their pictures.
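The effect of an unbalanced training set can be sketched in the same toy setting, under the assumption (mine, not the article's) that each group's photos cluster in a slightly different region of feature space. Trained on 500 examples of one group and only 10 of the other, the classifier makes far more errors on the under-represented group:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def sample(center, n):
    # Synthetic stand-in features for a group centred at a given point.
    return rng.normal(loc=center, scale=2.0, size=(n, 5))

# Group A dominates the training set; group B is barely represented.
X_train = np.vstack([sample(0.0, 500), sample(1.5, 10)])
y_train = ["A"] * 500 + ["B"] * 10

clf = LogisticRegression().fit(X_train, y_train)

# Evaluate on balanced test sets: accuracy is far worse for group B,
# simply because the model saw so few examples of it during training.
for label, center in [("A", 0.0), ("B", 1.5)]:
    accuracy = (clf.predict(sample(center, 200)) == label).mean()
    print(label, round(float(accuracy), 2))
```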

Other software is being developed today to address these shortcomings, including at Facebook. This time it relies on “deep learning”: the idea is to let the machine learn by itself. It no longer recognizes an image by matching it against a database, but is supposed to work out the distinguishing features on its own.
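A tiny neural network illustrates what “letting the machine learn by itself” means here: it is given only raw pixel values and discovers the distinguishing pattern without any hand-written rule. The data, labels and network shape below are invented for the example; production deep-learning systems are vastly larger and train on real photographs.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def make_images(bright_left, n=300):
    # Fake 8x8 grayscale "photos": one class is brighter on its left
    # half, the other on its right half.
    imgs = rng.random((n, 8, 8)) * 0.2
    if bright_left:
        imgs[:, :, :4] += 0.8
    else:
        imgs[:, :, 4:] += 0.8
    return imgs.reshape(n, -1)

X = np.vstack([make_images(True), make_images(False)])
y = ["left"] * 300 + ["right"] * 300

# The hidden layer learns its own internal features from raw pixels;
# nothing in the code says where the brightness difference lies.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
net.fit(X, y)

print(net.predict(make_images(True, n=1)))  # -> ['left']
```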



More or less conscious biases

“The problem is the photo, which presents a strong contrast,” a Google engineer offered in his defense on Twitter. In the 1970s, the director Jean-Luc Godard complained that the film stock developed by Kodak was “racist” because it was unable to render the exact color of black people's skin. The company eventually changed its products, which had been developed with reference to the image of a white woman. Even today, some digital cameras have difficulty handling contrasts in a photo, for example when photographing a black person and a white person together. A machine is, for the moment at least, without emotions or reflection. It cannot choose to be racist. Instead, it is shaped in the image of its creators, their context and their unconscious biases.

