
Google Gemini tried to kill me

https://www.reddit.com/r/ChatGPT/comments/1diljf2/google_gemini_tried_to_kill_me/

"I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout ... I had just grew a botulism culture"

I've seen a few people recommend ChatGPT for suggesting recipes from a given list of ingredients, and this is exactly why that's not a good idea either.

I'm a pretty timid cook, but I think caution is warranted when incorrectly prepared food can spread disease and lead to death.
