AI images of women from around the world have gone viral. Do they promote colourism and cultural beauty standards?

Thursday 31/08/2023
Stereotypical women of India generated by AI. (Supplied: Madhav Kohli)

What does a "beautiful" woman from India look like?

What about the Philippines or Laos?

Artificial Intelligence (AI) purports to have the answer.

AI-generated images of "beautiful women" from around the world have been going viral for months.

One TikTok video featuring AI images of South and East-Asian women — recently posted by an account named AI World Beauties — has more than 1.7 million views.

However, experts say that because they're "trained" on biased data sets and stereotypes, the images can perpetuate limited, exclusionary and potentially harmful ideals.

How does AI make these images?

The images going viral are created by what's known as "generative AI", or "GenAI", programs.

"GenAI is a type of artificial intelligence powered by machine learning models," Shibani Antonette, a lecturer in data science and innovation at the University of Technology Sydney, told the ABC.

"It uses patterns and information its learned from millions of data points to create new content that never existed before."

Dr Antonette says the quality and diversity of the training data determines an image generator's output.

When contacted by the ABC, the creator of the viral video said they used a diffusion model called Midjourney to generate the images.

They declined to be named in the story or comment further.

'Colourism and cultural beauty standards'

Fair skin, thin noses, full lips and high cheekbones.

According to the viral videos, "beautiful" women share these same features.

Bias is "a serious problem" in image generation and facial recognition technologies, Dr Antonette says.

"The models can create a distorted reality by amplifying biases and stereotypes on race and gender," she says.

"Most of the generated images perpetuate colourism and cultural beauty standards."

When looking at the viral AI images, Dr Antonette says the model that generated them likely "did not have a diverse training dataset that contained many faces of people of colour with varying skin tones and shapes".

"After all, data for these models are pulled from the entire internet over the last few decades — without accountability for coverage, diversity, and inclusion — to cater to specific applications."

'You don't look like that'

Asia Jackson often has people guess her ethnicity, and usually hears "you don't look like that" when she tells them.

The Black and Filipino actress and content creator says as a child, being "mixed" created "a lot of identity issues".

Now 29, Ms Jackson says she has a stronger sense of self and identity.

"I definitely get way more offended when someone tells me 'I don't look Black' or 'don't look Asian'.

"Because both of these racial categories contain such a large spectrum of skin colour and features."

The same goes for when people tell Ms Jackson she "doesn't look Filipino".

"The Philippines is a country with more than 7,100 islands and hundreds of different ethnic groups," she says.

'Indifferent' about AI images

Ms Jackson feels "pretty indifferent" about the viral AI images.

"AI is just copying human behaviour, however non-inclusive or non-politically correct it might be," she says.

"This isn’t anything different from what happens in real life.

"At the same time, I really don't think it's possible to include the vast diversity of features or ethnicities from every country in a 30 second video."

AI falling short of 'challenging perspectives'

When Ishara Sahama first saw the viral images, she found them "almost ethereal".

She then realised what she was seeing was "the most accepted beauty standards of each ethnic group".

"The diversity of ethnic groups within each country is generalised into one model. It's reductive and far from reflective of those countries' diversity," she says.

The 25-year-old co-founder of strategic design agency, Echo Impact Group, has been mistaken for Indian, Pakistani, Arab and Indigenous Australian.

She's Sri Lankan, with Sinhalese, Tamil and Malay backgrounds.

"Assuming one's ethnicity and then associating them with that without asking is what annoys me the most."

She says it's the "beauty stereotypes" seen in these AI images that cause people to think there is "one look" for each ethnicity.

"When people see these AI images of women, they may associate those features with what an Indian, Pakistani, or in my case, a Sri Lankan woman looks like," she says.

"I clearly don't look like those images.

"I think such AI has the capability to challenge perspectives and identities. But unfortunately, AI art can only respond to data it has."

'I don't see myself in any of these images'

Every feature in these AI-generated images fits into what is deemed as "the model minority," says Kriti Gupta.

"It's all the parts of our ethnic group that the internet [which is informed by society's preferences] deem as attractive."

Ms Gupta, a 27-year-old Indian Australian who works in social media strategy and consulting, says she doesn't see herself in any of the images.

"They're what every guy thinks of when he has a 'brown girl fetish,'" she says.

Ms Gupta says people have assumed she is Spanish, Mexican, Moroccan or from other Latin American countries.

"I think I'm always in this middle ground of like, why are you assuming my ethnicity? What value does that bring to this conversation?" she says.

"I only bring up my background when I feel it's relevant to the conversation."

'White-washed imagery'

Like many South Asian women, Ms Gupta made changes to her appearance in Australia.

She dyed her hair blonde and didn't worry about letting her brown skin darken in the sun, admitting it was to make herself more appealing to the male gaze here.

"But then I would go to India and douse myself in whitening cream to fit in there," she says.

Still, Ms Gupta knows she's not really seeing what most South Asian women look like in those AI images.

"We're seeing white-washed imagery," she says.

"Most of the time, these AI platforms are created by men, and most of the coding that goes into these algorithms to create these images from the world's use of the internet is fixated on a Western use."

Dangers of AI's racial bias

Scholars and activists have warned that the datasets used to train AI models are biased.

And it can be problematic.

A research study by Cornell University from March this year revealed how popular AI models produced images of men with lighter skin tones for high-paying jobs such as a "lawyer," "judge" or "CEO".

Meanwhile, people with darker skin tones were over-represented in lower-paying professions such as "janitor" and "fast food worker".

What's the solution?

While there's no all-encompassing answer, Dr Antonette points to a few key actions.

"Tech-developers and companies rolling out services should ensure that their AI is fair and equitable by diversifying their datasets and avoiding over-representation of certain groups of people," she says.

"They should consider the implications of how their technology might be extended to contexts other than what it was originally built for."

Dr Antonette says researchers improving their accountability and transparency is also key.

"This is by publishing open-source models that others can critique and build on by adding more diverse data," she says.

Those also using and viewing AI should do so critically and responsibly, Dr Antonette says.

"The creation of biased synthetic images can inadvertently fuel future biases, entangling us in vicious cycle.

"Embracing diversity in data, championing transparency, and using AI tools thoughtfully can lead us towards a future where AI benefits everyone, without perpetuating harmful stereotypes."

Story By: Angelica Silva
