
"You may be disappointed if you go looking for Google's open Gemma AI model in AI Studio today. Google announced late on Friday that it was pulling Gemma from the platform, but it was vague about the reasoning. The abrupt change appears to be tied to a letter from Sen. Marsha Blackburn (R-Tenn.), who claims the Gemma model generated false accusations of sexual misconduct against her."
"At the hearing, Google's Markham Erickson explained that AI hallucinations are a widespread and known issue in generative AI, and Google does the best it can to mitigate the impact of such mistakes. Although no AI firm has managed to eliminate hallucinations, Google's Gemini for Home has been particularly hallucination-happy in our testing. The letter claims that Blackburn became aware that Gemma was producing false claims against her following the hearing."
"Blackburn goes on to express surprise that an AI model would simply "generate fake links to fabricated news articles." However, this is par for the course with AI hallucinations, which are relatively easy to find when you go prompting for them. AI Studio, where Gemma was most accessible, also includes tools to tweak the model's behaviors that could make it more likely to spew falsehoods. Someone asked a leading question for Gemma, and it took the bait."
Google pulled the Gemma model from AI Studio following a letter from Sen. Marsha Blackburn alleging that Gemma generated false sexual-misconduct accusations against her. Blackburn demanded an explanation and linked the incident to hearings about bots defaming conservatives. Google representatives acknowledged that AI hallucinations are a widespread, known issue and said the company works to mitigate their impact. Ars Technica's testing found Gemini for Home particularly prone to hallucinations. The letter alleges Gemma fabricated a drug-fueled affair and fake links to news articles. AI Studio provides tools that can change model behavior, which may make such false outputs more likely when the model is asked leading questions.
Read at Ars Technica