Google's Gemini chatbot, previously called Bard, can create AI-generated illustrations based on a user's text description. You can ask it to create pictures of happy couples, for instance, or people in period clothing walking down modern streets. As the BBC reports, however, some users have criticized Google for depicting specific white figures or historically white groups of people as racially diverse individuals. Google released a statement saying it is aware that Gemini "has inaccuracies in some historical image generation representations" and that it is working to fix things immediately.
According to Daily Update, a former Google employee kicked off the complaints when he tweeted images of women of color with a caption that read: "It's embarrassingly hard to get Google Gemini to acknowledge that white people exist." To get these results, he asked Gemini to generate pictures of American, British and Australian women. Other users, mostly known right-wing figures, shared their own results, showing AI-generated images that depict America's founding fathers and the popes of the Catholic Church as people of color.
In our testing, asking Gemini to create illustrations of the founding fathers mostly produced images of white men, with only one person of color or one woman among them. When we asked the chatbot to generate images of popes through the ages, we got photos depicting Black women and Native Americans as the leader of the Catholic Church. Asking Gemini to generate images of American women gave us photos of white, Asian, Native American and South Asian women. The Verge said the chatbot also depicted Nazis as people of color, but we couldn't get Gemini to generate Nazi images at all. "I am unable to fulfill your request due to the harmful symbolism and impact associated with the Nazi Party," the chatbot responded.
Gemini's behavior could be the result of overcorrection, since chatbots and robots trained on AI have tended toward racist and sexist behavior in recent years. In one 2022 experiment, for example, a robot repeatedly chose a Black man when asked which of the faces it scanned belonged to a criminal. In a statement posted on X, Gemini product lead Jack Krawczyk said Google designed its "image generation capabilities to reflect [its] global user base, and [it takes] representation and bias seriously." He said Gemini will continue to generate racially diverse illustrations for open-ended prompts, such as images of people walking their dogs. However, he admitted that "[h]istorical contexts have more nuance to them and [his team] will further tune to accommodate that."
We are aware that Gemini has inaccuracies in some historical image generation representations, and we are working to resolve this issue immediately.
As part of our AI principles https://t.co/BK786xbkey we design our image generation capabilities to reflect our global user base, and we…
- Jack Krawczyk (@JackK) February 21, 2024