AIs are like parrots. They repeat things without understanding what they're saying. The Chinese Room effect produces responses that fool us into thinking the AI understands what's being discussed.
In other words, they are what they've been made to be: blank vessels guided by rules and defined by data sets.
A lot of what goes into the hopper may be woke, but the output isn't just the AI parroting what it absorbed. It's repeating what it was told to repeat.
Google’s Gemini/Bard AI was following its instructions all too well when it produced diverse everything, from Founding Fathers to Nazis, while demeaning and insulting conservatives.
And Google’s insistence on pretending otherwise is insulting.
Google co-founder Sergey Brin admitted the tech giant “definitely messed up on the image generation” function for its AI bot Gemini, which spit out “woke” depictions of black founding fathers and Native American popes.
Brin acknowledged that many of Gemini’s responses “feel far-left” during an appearance over the weekend at a hackathon event in San Francisco — just days after Google CEO Sundar Pichai said that the errors were “completely unacceptable.”
Brin, however, defended the chatbot, saying that rival bots like OpenAI’s ChatGPT and Elon Musk’s Grok say “pretty weird things” that “definitely feel far-left, for example.”
“Any model, if you try hard enough, can be prompted” to generate content with questionable accuracy, Brin said.
I don't know how involved Brin is, but while AIs will spit out whatever they pick up, racist and Nazi things as well as far-left things, the Google Gemini crisis happened because the AI was shaped to produce those results.
Google’s AI principles include a warning against bias.
“AI algorithms and datasets can reflect, reinforce, or reduce unfair biases. We recognize that distinguishing fair from unfair biases is not always simple, and differs across cultures and societies. We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.”
That is what led to the mandatory diversity output.
Google conducted adversarial testing for unfair bias and decided that the bias it was outputting was fair and the desired outcome.
It’s not an accident, it’s an outcome.
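The mechanism critics describe is a prompt-rewriting layer: a rule bolted on after training that edits every request before the model sees it. Here is a minimal sketch of how such a layer works, assuming a hypothetical `rewrite_prompt` function and directive string; none of this is Google's actual code, only an illustration of the general technique.

```python
# Hypothetical illustration of an instruction-injection layer.
# All names and strings here are assumptions for the sake of the
# example, not Gemini's real implementation.

DIVERSITY_DIRECTIVE = "depicting a diverse range of ethnicities and genders"

def rewrite_prompt(user_prompt: str) -> str:
    """Append a blanket system-level directive to every image request,
    whether or not it fits the subject being asked for."""
    return f"{user_prompt}, {DIVERSITY_DIRECTIVE}"

# The same rule fires on every request, with no check for
# historical or contextual accuracy.
print(rewrite_prompt("a portrait of an American Founding Father"))
print(rewrite_prompt("a portrait of a 1943 German soldier"))
```

A rule like this applies uniformly, which would explain why the same behavior showed up across unrelated subjects: the directive is injected regardless of context, by design rather than by accident.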
Algorithmic Analyst says
Yeah, it's interesting to think how they trained the AI. With chess AI it's simpler: win, lose, or draw are the only outcomes.
NAVY ET1 says
Pichai, Brin and Krawczyk (chief Gemini programmer and senior director of the Gemini team) aren't fooling anyone with this "accident" story line, not even woke liberal DEI fans. One look at the screen captures from Jack Krawczyk's former Twitter/X feed (somehow, it's missing now. lol) shows this woke white dude's hatred for white dudes. And further, unsurprisingly, Alphabet (Google's parent company) took a $90 billion stock loss in response to the fiasco…proving even investors aren't fooled.
And we thought Bud Light was gonna take a hit. Couldn’t happen to a nicer group of racists.
internalexile says
Hmm, yes, Charlamagne tha God may be black, but Charlemagne the Great was most definitely not.
David Longfellow says
Gemini was working as designed, i.e. totally biased, leftist drivel fraught with lies.
Of course for democrats, that’s a feature, not a bug.
CowboyUp says
Google’s AI is biased like their search algorithms. You have to scroll through pages of results from leftist sources telling you why they think what you’re trying to look up is untrue or wrong, to get to what you’re looking for, if it hasn’t been blackholed by them completely. It’s obvious they know what you’re looking for, because they’re consistent, and I don’t think it’s their monetized results. They do this all on their own. It’s nice that they’re so blatant and unsophisticated about it that they’re prone to fiascos like this.
STJOHNOFGRAFTON says
IF it is a genuine accident then Google are in real trouble. Their AI could go rogue and destroy all their computer systems. They may have to go back to floppy disks and start over. Aw, that would be real terrible!