Google’s Photo App Still Can’t Find Gorillas. And Neither Can Apple’s.

Credit: Desiree Rios/The New York Times

Eight years after the controversy over how image-analysis software mislabeled black people, and despite major advances in computer vision, tech giants still fear repeating their mistakes.

When Google released its standalone Photos app in May 2015, people were amazed by what it could do: analyze images and label the people, places and things in them, a remarkable consumer service for its time. But a few months after the release, the software developer Jackie Alcine discovered that Google had labeled photos of him and his black friends as “gorillas.” The term is particularly offensive because it echoes centuries of racist tropes.

In the ensuing controversy, Google vowed to fix the problem; its remedy was to stop its software from classifying anything in a photo as a gorilla. Eight years later, with artificial intelligence far more advanced, we tested whether Google had solved the problem, and we looked at the equivalent tools from its competitors Apple, Amazon and Microsoft.

One member of the primate family that Google and Apple were able to recognize was the lemur. Lemurs are long-tailed animals with a permanently startled expression; they share opposable thumbs with humans but are more distant relatives than apes are.

When it came to image analysis, Google and Apple’s tools were clearly the most sophisticated.

But Google, whose Android software powers most of the world’s smartphones, decided to turn off the ability to visually search for primates for fear of making an offensive mistake and labeling a person as an animal. And Apple, whose technology performed similarly to Google’s in our tests, appeared to disable the ability to find monkeys and apes as well.

Consumers may not need to perform such searches often, though in 2019 an iPhone user complained on Apple’s customer support forum that the software “can’t find monkeys in photos on my device.” But the issue raises larger questions about other unfixed, or unfixable, flaws in services that rely on computer vision (the technology of interpreting visual images) and in other AI-powered products.

Mr. Alcine said he was disappointed to learn that Google still had not fully solved the problem, adding that society puts too much trust in technology.

“I’m going to disbelieve in this AI forever,” he said.

Computer vision products are now used for tasks as mundane as sending an alert when a package is on the doorstep and as consequential as driving a car or finding suspects in law enforcement investigations.

The errors can reflect racist attitudes among those who encoded the data. In the gorilla case, two former Google employees who worked on the technology said the problem was that the company had not included enough photos of black people in the image collection used to train its AI system. As a result, the technology was not familiar enough with people with dark skin and confused them with gorillas.
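The failure mode the former employees describe, a class underrepresented in the training data, can often be caught with a simple audit of label counts before training. The sketch below is illustrative only; the function name and the 5 percent threshold are assumptions, not anything from Google’s actual pipeline.

```python
from collections import Counter

def underrepresented_classes(labels, min_share=0.05):
    """Return classes whose share of the training set falls below min_share.

    `labels` holds one class label per training image. A class whose share
    is far below an even split is a warning sign that the trained model may
    perform poorly, or unpredictably, on that class.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: count / total
            for cls, count in counts.items()
            if count / total < min_share}

# A deliberately skewed toy dataset: 97 "cat" images, 3 "dog" images.
toy_labels = ["cat"] * 97 + ["dog"] * 3
print(underrepresented_classes(toy_labels))  # {'dog': 0.03}
```

An audit like this only detects the imbalance; fixing it still requires collecting more representative data, which is the step the former employees said was missed.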

As artificial intelligence seeps deeper into our lives, fears of unintended consequences grow. Computer vision products and AI chatbots like ChatGPT are different, but both depend on underlying troves of data to train the software, and both can malfunction because of flaws in the data or biases built into the code.

Microsoft recently limited users’ ability to interact with the chatbot built into its search engine, Bing, after it instigated inappropriate conversations.

Microsoft’s decision, like Google’s choice to prevent its algorithm from identifying gorillas altogether, illustrates a common industry approach: blocking malfunctioning technology features rather than fixing them.
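The “block rather than fix” approach can be as simple as a post-processing filter on the classifier’s output: the model may still recognize the class internally, but the label is never surfaced to the user. This is a hypothetical sketch; the label set and the (label, confidence) output format are assumptions for illustration, not any vendor’s actual code.

```python
# Hypothetical post-processing filter: suppress sensitive labels after
# classification instead of retraining the model.

SUPPRESSED_LABELS = {"gorilla", "chimpanzee", "monkey", "ape", "baboon"}

def filter_predictions(predictions):
    """Drop any suppressed label from a list of (label, confidence) pairs."""
    return [(label, score) for label, score in predictions
            if label.lower() not in SUPPRESSED_LABELS]

raw = [("lemur", 0.91), ("gorilla", 0.88), ("tree", 0.40)]
print(filter_predictions(raw))  # [('lemur', 0.91), ('tree', 0.4)]
```

A filter of this kind would produce exactly the behavior observed in our tests: searches for the suppressed terms simply return no results, even when the underlying model could have matched them.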

“Solving these issues is important,” said Vicente Ordóñez, a professor of computer vision at Rice University. “How can we trust this software in other scenarios?”

Michael Marconi, a Google spokesman, said the company had prevented its Photos app from labeling anything as a monkey or ape because it decided the benefit “does not outweigh the risk of harm.”

Apple declined to comment on users’ inability to search for most primates on the app.

Representatives of Amazon and Microsoft said the companies were constantly working to improve their products.

When Google was developing its Photos app, released eight years ago, it collected a large volume of images to train an AI system to identify people, animals and objects.

Two former Google employees said the app later malfunctioned because of a critical oversight: the training data did not include enough photos of black people. The company also failed to uncover the “gorilla” problem at the time because it had not asked enough employees to test the feature before its public debut, the former employees said.

Google apologized profusely for the gorilla incident, but it was one of many episodes across the tech industry that have led to accusations of bias.

Other products that have been criticized include HP’s face-tracking webcams, which could not detect some people with dark skin, and the Apple Watch, which, according to a lawsuit, failed to accurately read blood oxygen levels across skin tones. The lapses suggested that tech products were not being designed for people with darker skin. (Apple pointed to a paper from 2022 detailing its efforts to test its blood oxygen app on a “wide range of skin types and tones.”)

Years after the Google Photos error, the company encountered a similar issue with its Nest home security camera during internal testing, according to people familiar with the incident who worked at Google at the time. The Nest camera, which uses AI to determine whether someone on the premises is familiar or unfamiliar, mistook some black people for animals. Google rushed to fix the problem before users had access to the product, the people said.

But Nest customers continue to complain about other flaws on the company’s forums. In 2021, a customer received an alert that her mother had rung the doorbell, only to find her mother-in-law on the other side of the door. When users complained that the system was mixing up faces they had marked as “familiar,” a customer support representative on the forum advised them to delete all their labels and start over.

“Our goal is to make sure this type of mistake never happens again,” said Mr. Marconi, the Google spokesman. He added that the company had improved its technology “by partnering with experts and diversifying our image datasets.”

In 2019, Google tried to improve the facial recognition feature on Android phones by increasing the number of people with dark skin in its dataset. But the contractors Google hired to collect the facial scans reportedly resorted to troubling tactics to compensate for the dearth of diverse data: they targeted homeless people and students. Google executives called the incident “extremely disturbing” at the time.

While Google worked behind the scenes to improve the technology, it never allowed users to judge those efforts.

Margaret Mitchell, a researcher and co-founder of Google’s Ethical AI group, joined the company after the gorilla incident and collaborated with the Photos team. She said in a recent interview that she supported Google’s decision to remove the gorilla label, “at least for the time being.”

“You have to think about how often someone needs to label a gorilla versus perpetuating harmful stereotypes,” Dr. Mitchell said. “The benefits don’t outweigh the potential harms of doing it wrong.”

Dr. Ordóñez, the Rice professor, speculated that Google and Apple could now be capable of distinguishing primates from humans, but that they didn’t want to enable the feature because of the reputational risk if it misfired again.

Google has since released a more powerful image-analysis product, Google Lens, a tool for searching the web with photos rather than text. In 2018, Wired found that Lens, too, could not identify a gorilla.

Dr. Mitchell, who no longer works at Google, said these systems are far from foolproof. With billions of people using Google’s services, even rare glitches that happen to only one person in a billion will surface.

“It only takes one mistake to have massive social ramifications,” she said, calling it “the needle in the haystack.”
