
Google pauses AI image generation after diversity controversies


  • #7111
    thumbtak
    Keymaster

    Google has stopped allowing users to generate images of humans with its Gemini AI tool after complaints that it produced pictures of Black founding fathers, a female pope, and gay couples when asked to create images of straight couples.

    Why it matters: Google and other AI providers aim to avoid bias in the output of their generative AI tools, but their efforts to remove stereotypes keep tripping them up.

    Driving the news: Gemini’s mistakes triggered a wave of criticism, particularly from the right.

    • Elon Musk called Gemini’s errors “racist and anti-civilizational,” and the New York Post and others accused Google of being “woke.”
    • On Wednesday Google responded to the complaints, admitting that Gemini was “missing the mark.”
    • The company announced Thursday it was pausing image generation of humans and would release a new version soon.
    • By Friday, the company had issued a longer explanation. “This wasn’t what we intended,” Google senior vice president Prabhakar Raghavan said in a statement.
    • “I can’t promise that Gemini won’t occasionally generate embarrassing, inaccurate or offensive results — but I can promise that we will continue to take action whenever we identify an issue,” Raghavan said.
    The big picture: Google and others have been struggling with a known problem in AI: without some guidance, tools will naturally generate stereotypical images based on the data they are trained on. That data comes from people, and people have prejudices.

    • AI creators’ efforts to avoid stereotypes have been shaped by at least a decade of missteps, in which AI image generators produced all-white-male CEO portraits and Google Photos’ AI sorting algorithm classified Black people as gorillas.

    Between the lines: Today’s AI tools aren’t smart enough to understand that you might want them to provide diverse results for generic queries (“show me a judge”) but not to tamper with history (“show me a judge from the 18th century”).

    • AI industry leaders believe their models will keep getting better at handling ethically, politically and socially fraught queries.
    • Critics suggest that generative AI doesn’t really understand anything and that it will take more than incremental advances to solve this problem.

    Our thought bubble: There’s no such thing as an AI system without values, Axios’ Ina Fried writes, and that means this newest technology platform must navigate partisan rifts, culture-war chasms and international tensions from the very beginning.

    Quoted from: https://www.axios.com/2024/02/23/google-gemini-images-stereotypes-controversy
