
Google Gemini tried to kill me

reddit.com/r/ChatGPT/comments/…

"I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout ... I had just grew a botulism culture"

in reply to David Gerard

I've seen a few people say that ChatGPT is good at recommending recipes for a given list of ingredients, and this is exactly why that's a bad idea too.

I'm a pretty timid cook, but I think it's reasonable to be cautious when incorrectly prepared food can cause serious illness or death.