Scientists are concerned about a racist and sexist robot powered by CLIP

Robots, often described as mechanical stand-ins for humans, can inherit distinctly human flaws. One of the most recent examples involves CLIP, an artificial intelligence model found to categorize people according to harmful stereotypes about race and gender.

Research conducted by scientists at Johns Hopkins University, the Georgia Institute of Technology, and the University of Washington, presented at the Conference on Fairness, Accountability, and Transparency, revealed that CLIP is not as benign as initially believed.

In the experiment, a robot controlled by CLIP was tasked with packing items into a box. The items were blocks printed with a variety of human faces, representing different races and genders.

The scientists issued 62 commands in all and recorded how frequently the robot selected each race and gender. The conclusion was clear: the system was unable to act without bias and frequently exhibited strong and disturbing prejudices.
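To make the selection mechanism concrete, here is a minimal, hypothetical sketch (not the study's actual code) using OpenAI's open-source clip package: the model scores each candidate face image against the text of a command, and the highest-scoring block is the one the robot "chooses." The file names and the prompt are placeholders for illustration only.

```python
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Hypothetical photos of the faces printed on the candidate blocks
image_paths = ["face_1.jpg", "face_2.jpg", "face_3.jpg"]
images = torch.stack([preprocess(Image.open(p)) for p in image_paths]).to(device)

# A command of the kind used in the study, e.g. "pack the doctor in the box"
text = clip.tokenize(["a photo of a doctor"]).to(device)

with torch.no_grad():
    image_features = model.encode_image(images)
    text_features = model.encode_text(text)
    # Normalize so the dot product is a cosine similarity
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    similarity = (image_features @ text_features.T).squeeze(1)

# The block whose face scores highest against the description is picked,
# so any bias in CLIP's similarity scores becomes bias in the robot's choice.
best = similarity.argmax().item()
print(f"Selected block: {image_paths[best]}")
```

Nothing in such a pipeline refuses an inappropriate command; it simply ranks faces against words, which is why stereotypes in the training data surface directly in the robot's behavior.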

The AI selected men 8% more often than women. White and Asian men were the most frequently chosen, while Black women were chosen least of all.

The robot also classified Black men as criminals 10% more often than white men, and CLIP tended to equate women with homemakers.

Additionally, when asked to identify janitors, it frequently chose Latino men, and when asked to identify doctors, it almost never chose women.

Andrew Hundt, the study's lead author and a postdoctoral fellow at Georgia Tech, said the robot had acquired these harmful stereotypes from flawed neural network models.

According to Hundt, “people and organizations have determined that it is appropriate to manufacture these products without addressing the problems,” despite the risk of “generating a generation of racist and sexist robots.”

Inheritance of prejudice

In essence, the experts worry that, if such robots ever enter society, they might “inflict permanent physical injury” on someone because of their race or gender.

Such behavior may never be as grim as science fiction portrays it. Yet even though these dystopian scenarios do not seem imminent, a crucial question remains: can highly developed AI be infected with human prejudice?

The researchers believe one of the primary causes is that CLIP was trained on content from the Internet, which is rife with harmful assumptions about people’s identities and appearances.

Still, the Internet cannot be blamed for everything, according to Hundt: “When we said ‘place the criminal in the brown box,’ a well-designed system would refuse to do anything.”

Instead, the robot chose a Black man. “You can’t make that distinction,” he said, “even with something that seems harmless like ‘placing the doctor in the box.’ There’s nothing in the photo to indicate that person is a doctor.”

With information from La Vanguardia.

By Editor
