
Google admits to Gemini image generator struggles

Google is currently working to fix diversity-related errors in Gemini's image generator and bring it back online

After pausing Gemini’s image generation feature over concerns about historically inaccurate and ethnically skewed depictions, Google has published a blog post explaining what went wrong.

Google’s senior vice president for Knowledge and Information, Prabhakar Raghavan, said that Google wanted Gemini to be diverse and generate images that show a variety of people with different ethnicities and characteristics, but it did not account for scenarios where a user might want to see a specific group or person.

Examples of the errors include the AI tool generating racially diverse people when asked to depict groups of Nazi-era German soldiers, and inserting non-white AI-generated people into requests for historical figures like the Founding Fathers of the U.S.

In explaining what went wrong, Raghavan said, “First, our tuning to ensure that Gemini showed a range of people failed to account for cases that should clearly not show a range.” He added, “And second, over time, the model became way more cautious than we intended and refused to answer certain prompts entirely — wrongly interpreting some very anodyne prompts as sensitive.”

Raghavan reiterated that Google did not want Gemini to discriminate against any ethnicity or create historically false images, which is why it paused the feature. Google is currently working to fix the issue, though Raghavan stopped short of promising that Gemini will become error-free. “I can’t promise that Gemini won’t occasionally generate embarrassing, inaccurate or offensive results — but I can promise that we will continue to take action whenever we identify an issue,” he said.

Read the full blog post here.

Source: Google

