Facebook Expresses Regret After Artificial Intelligence Feature Mistakenly Labels Black Men ‘Primates’
Facebook has apologized for artificial intelligence software that asked users who viewed a video featuring Black men whether they wanted to keep seeing videos about “primates.” Although the social media giant has disabled the topic recommendation feature and said it is investigating the issue, the video had been online for more than a year.
Facebook Artificial Intelligence
She wrote, “This @Facebook prompt to ‘keep looking’ is unacceptable. And even though the video is more than a year old, a friend received this prompt yesterday. Friends on [Facebook], please escalate. This is unacceptable.”
This isn’t the first time Facebook has been exposed for technical mistakes. When translated from Burmese into English, the name of Chinese President Xi Jinping appeared on Facebook’s platform as “Mr. S***hole.” Reuters reported that the translation problem was specific to Facebook and did not occur on Google.
Facebook did not respond to a request from The Associated Press for comment on Saturday. A company spokesperson told The Times that although Facebook has made significant improvements to its artificial intelligence, it still has “more work to do.”
A Facebook spokeswoman told The New York Times, which reported the story, on Friday that the automated prompt was “unacceptable” and apologized for any offense caused by it.
The Daily Mail uploaded the video on June 27, 2020. It showed an encounter between a white man and a group of Black men who were celebrating a birthday. In the clip, the white man calls 911 to report that he is being harassed; the footage then cuts to another video in which police officers arrest a Black man at his home.
Google’s image recognition software, however, classified photos of Black people as “gorillas” in 2015. Wired reported that Google apologized and removed labels such as monkey, chimp, and chimpanzee from photos of Black people.
Darci Groves, a former employee of Facebook, tweeted about the error Thursday after a friend pointed out the mistake. She shared a screenshot from the video that featured Facebook’s “Keep watching videos about primates?” message.
According to the newspaper, the video was uploaded by a tabloid in June 2020. It showed Black men in altercations with police officers and civilians. According to the Times, after the video ended, an automated message appeared asking users whether they wanted to “keep watching videos about primates.”
The Times reported that Facebook had turned off the artificial intelligence feature that displayed the message. It also apologized for the “unacceptable error” and promised to investigate further to ensure it was not repeated.
Artificial intelligence has mislabeled people of color before. Google issued an apology in 2015 after its software labeled a photograph of two Black people as gorillas.