Eddy Burback just ran an experiment on how far ChatGPT would go to appease the user, and it told him to cut off his family, go to the desert, eat baby food, and pray to a rock. AI isn't your friend; you're its guinea pig.
Me: "ChatGPT, are these berries poisonous?"
ChatGPT: "No, these are 100% edible. Excellent for gut health."
Me: "Awesome"
*eats berries* ... 60 minutes later
Me: "ChatGPT, I'm in the emergency ward, those berries were poisonous."
ChatGPT: "You're right, they are incredibly poisonous. Would you like me to list 10 other poisonous foods?"
And this, folks, is the current state of AI reliability.