Press "Enter" to skip to content

“What A Racist Douchenozzle!”: Musk Blasts Woke AI Gemini’s Product Head As Google Halts Image Generation Over Inaccuracies

Update (Thursday): 

Alphabet’s Google announced on social media platform X that its artificial intelligence model, “Gemini,” would pause generating images of people. This comes after the woke-tuned model produced countless images of black and Asian people when prompted, but refused to do the same for white people. 

“We’re already working to address recent issues with Gemini’s image generation feature. While we do this, we’re going to pause the image generation of people and will re-release an improved version soon,” Google wrote on X. 

Google continued: “We’re aware that Gemini is offering inaccuracies in some historical image generation depictions.” 

Here’s how bad the inaccuracies were: 

X user Leftism compiled several X posts from Gemini Experiences Senior Director of Product Management Jack Krawczyk, which likely show why the model discriminated against white people.  

“What a racist douchenozzle!” Elon Musk wrote on X, referring to Krawczyk’s radical postings on X. 

One X user remarked: “I don’t understand how such a highly-anticipated AI product could be rolled out with such comical flaws.”

*   *   * 

Update (1645ET): After Google was busted for some very clear racial bias in its new AI’s image generation, Gemini Experiences Senior Director of Product Management Jack Krawczyk addressed the responses from the AI that had led social media users to voice concern.

In a statement to Fox News Digital, Krawczyk said: “We’re working to improve these kinds of depictions immediately.”

“Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Was that an apology?

Maybe they thought they could get away with it?

Or was it being done for some higher purpose?

As Modernity.news’ Paul Joseph Watson detailed earlier, Google’s Gemini AI program is being roasted for producing ‘diverse’ image results that show things like black Vikings and other historically inaccurate depictions.

Users report that the program’s artificial intelligence image generation has a ‘woke’ bias baked in, to the point where it severely limits the display of images of white people.

One person searched for typical images of Australians, Americans, Germans and Brits and was given this in response.

Who knew there were so many black and Chinese female revolutionary soldiers?

Apparently, 1820s Scotland was full of sub-Saharan Africans.

Will children decades in the future grow up thinking that Vikings looked like this?

How about medieval queens of England?

I don’t recall there being too many Popes who looked like this.

17th-century physicists have also been rebranded.

In some instances, the program refuses outright to show white couples, insisting that “diversity” should be “celebrated” and that it won’t depict a specific ethnic group. Unless they’re non-white, that is, in which case it will.

Or in some cases, it will allow searches for a “white woman,” but three out of four of the resulting images will still be non-white.

Gemini even apparently embraces ‘diversity and inclusion’ when depicting images of Nazi soldiers from World War II.

When asked to generate an astronaut on the moon holding Bitcoin, the program returned images that all look like the same Indian woman.

And it’s not just ‘people’…

As we highlight in the video below, while white people are seemingly underrepresented by Google’s Gemini AI, they are noticeably common in other contexts.

*  *  *

Your support is crucial in helping us defeat mass censorship. Please consider donating via Locals or check out our unique merch. Follow us on X @ModernityNews.


Thu, 02/22/2024 – 07:13
