Just a heads up for anyone who may use this in an argument. I just tested on several models and the generated response accounted for the logical fallacy. Unfortunately it isn’t real.
(Funny nonetheless)
Tested on GPT-5 mini and it’s real tho?
Edit: Gemini gives different results
Bold of Gemini to imply any sort of liability for what it says. Google’s lawyers really don’t want that to be the case.
Those gemini responses are legitimately hilarious.
Gemini got jokes, but why does it think walking emits zero carbon? Humans are carbon emitters, more so with exercise. Hell, I farted while giggling at its humor.
Much less carbon than the car? Yep. Zero? Nope
Man, I really hate how much they waffle. The only valid response is “You have to drive, because you need your car at the car wash in order to wash it”.
I don’t need an explanation of what kind of problem it is, nor a breakdown of the options. I don’t need a bullet-point list of arguments. I don’t need pros and cons. And I definitely don’t need a verdict.
I’ll also accept sarcasm.
“Unless you’ve successfully trained your car to follow you like a loyal golden retriever, you’re probably going to have to drive.”
Yeaaah, they waffle a lot, I hate that
It’s the illusion of reason
I’ve found some success in using system prompts or similar to get it to skip the explanations lol
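For anyone who wants to try it, here's a minimal sketch of that system-prompt approach, assuming the OpenAI Python SDK; the model name and the prompt wording are just placeholders, not a recommendation:

    # Minimal sketch: use a system prompt to suppress the waffle.
    # Assumes the OpenAI Python SDK; model name and prompt text are illustrative only.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer in one or two sentences. Do not explain your reasoning, "
                    "do not list pros and cons, and do not give a verdict summary."
                ),
            },
            {"role": "user", "content": "Should I walk or drive to the car wash?"},
        ],
    )

    print(response.choices[0].message.content)

Same idea works with a custom-instructions field in the web UIs; how well the model actually obeys it varies a lot.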
Gemini’s responses were surprisingly humorous
I used paid models, which are the only ones the LLM bros will care about. Even they kinda know not to glaze the free models. So not surprising
(I have to have the paid models for work, my lead developer is an LLM nut)
It’s basically impossible to tell with these whether the example is totally fabricated, true but only happens some small percentage of the time, true and happens most of the time but you got lucky, or true and reliable but now the company has patched this specific case because it blew up online.