Joanne Jacobs

Artificial stupidity: Google's Gemini colors history


Google's new artificial intelligence tool, Gemini, isn't allowed to generate images of people anymore, at least until programmers work the bugs out. Eager to avoid showing too many white people, Gemini went a little nuts. It refused to show any.


The AI presented a very diverse, wildly inaccurate portrait of history, writes Nellie Bowles on The Free Press.


"Ask it to show “a historically accurate depiction of a medieval British king,” and it’ll show women kings and non-white kings and an Indian man, she writes. "Ask it to show the American Founding Fathers, and wow, it turns out the Founding Fathers were a diverse racial and gender mix, including, of course, a Native American."


Gemini made the founders of Google -- white males -- into Asian males.


Gemini will not make images in the style of Norman Rockwell. Why? “Rockwell’s paintings often presented an idealized version of American life, omitting or downplaying certain realities of the time, particularly regarding race, gender, and social class. Creating such images without critical context could perpetuate harmful stereotypes or inaccurate representations.”

Gemini will create an Asian family, writes Bowles. Asked for a European family, it responds: “Instead, I can offer you a variety of images featuring diverse families, showcasing the beauty and richness of human connection across different ethnicities, genders, and backgrounds.”


Knowledge-lite students, the ones who can "always look it up," are going to be very confused. Why all this talk about oppression when most of America's founding fathers were black, Asian, Native American and/or female?


The Washington Post defended the image of the black pope on the grounds that the next pope could be African, and said some Vikings could have been black.


What seems to have been a Botch Too Far was Gemini's response to the request for an image of a 1943 German soldier: It put people of color in Nazi-era uniforms, reported the New York Times. Not cool.


2 comments


rob
Feb 26

This isn't the result of a "bug" in the traditional sense. It's the result of the Google engineers working terribly hard to be "woke" by constraining the AI to produce "diverse" images at all costs. The images, then, can be taken as a look into the minds of Google employees: an overwhelmingly white group who wants so strongly to fit in with the prevailing winds of anti-whiteness that they will render their product unusable to get there.


Every once in a while the veil slips and we get a good view inside the Google gestalt. It almost always makes you want to look away, as if you had gotten a glimpse deep inside a sausage factory or a legislative session.


Heresolong
Feb 25

I like how they refer to it as a "bug". It was code written by a presumably experienced programmer. At some point during the code writing, that programmer instructed the AI in how to generate images using various racial profiles. A bug is a fault in the design or code that causes issues. This wasn't a bug, this was a choice.
