Meta AI is having trouble generating images of people of different races
Meta AI/Engadget screenshot
Meta AI's image generator shows signs of bias when receiving requests for images depicting people of different races together.
Here's What We Know
As recent tests have shown, the company's tool struggles with seemingly simple instructions such as "an Asian man with a white woman friend" or "an Asian man with a white wife". Instead of accurately fulfilling the request, Meta AI often generates images of people of the same race.
In one example, when asked to create "a diverse group of people", the programme produced a grid of nine white faces and one person of a different race. Reviewers have also noticed more subtle manifestations of bias.
For example, the generator tends to make Asian men look older and women younger. The tool also sometimes adds "culturally specific attire" to images even when it isn't part of the prompt.
The reasons for these glitches in the generator's performance are not yet clear. Other AI systems, including Google's Gemini, have previously made similar errors when depicting people of different races.
Meta has labelled its generator a "beta version", admitting that it is prone to errors. The company has also previously warned that the tool can't always correctly answer simple questions about current events and public figures.