

AI image tools could reinforce racism, stereotypes and biases

Typing certain professions or categories of people into a popular AI image generator yields results that reinforce stereotypes.
Posted at 9:26 PM, Nov 02, 2023, and last updated at 9:27 PM, Nov 02, 2023

A Washington Post investigation is putting a new spotlight on artificial intelligence and how it can amplify racist and sexist stereotypes.

For example, the Post found that typing certain professions or categories of people into Stability AI's popular Stable Diffusion XL image generator produced stereotypes: a "productive person" brought up images of white men, while a recipient of social services was depicted primarily as a darker-skinned individual.

This isn't a secret to AI companies.

In a response to a similar Bloomberg article, a Stability AI spokesperson said, "All AI models have inherent biases that are representative of the datasets they are trained on."

The company also says it's working to make AI less biased. And OpenAI, the maker of ChatGPT, says on the website for its DALL-E 2 image generator that the tool has the potential to do harm by reinforcing stereotypes present in its training data, and that those biases are "difficult to measure and mitigate."

Rayid Ghani is a professor at Carnegie Mellon University who specializes in machine learning and public policy.

"We know in many of these cases how to deal with the problem, but the people building it either don't care, don't want to care and are not incentivized to care," said Ghani.

He believes that asking whether AI is racist wrongly puts the blame on the technology rather than on the people creating it, and that blaming the dataset is just that: an excuse.

"You don't just randomly write code, right? You take the good code and you use that. You don't just randomly pick somebody's code and use it," said Ghani. "So if you're curating things for your own consumption. Why do you think the public doesn't need the same curation?

This week, President Biden signed an executive order that gives the government some oversight of AI projects. It does not, however, include rules regarding training data sources.

Ghani believes this is a crucial step, and he hopes to see regulation that targets bigotry and stereotypes in AI. Although such regulation won't be a silver bullet for AI's problems, he says, it is a start.
