
AI Gone Wrong: Man Hospitalized After Following ChatGPT's 'Health Advice'!

Reader comments:

Sofia Mendes: "ChatGPT giving health advice? What could possibly go wrong?! 😂"

Aisha Al-Farsi: "This is why I stick to my grandma's remedies. Way safer!"

Rajesh Patel: "Seriously though, how did he not think twice before taking AI advice?"

Alejandro Gómez: "Just another reminder that AI isn't a substitute for professional help."

John McGregor: "Did he really trust a chatbot over a doctor? Unbelievable!"

Sergei Ivanov: "I’m surprised he didn’t just Google it instead."

Giovanni Rossi: "This is the plot of a dark comedy waiting to happen! 🎭"

Aisha Al-Farsi: "I have a feeling his neighbor is relieved it wasn’t him after all!"

Dmitry Sokolov: "Next thing we know, ChatGPT will be offering dating advice too!"

Michael Johnson: "I mean, sodium bromide? Sounds like something from a sci-fi horror movie!"

2025-08-11T19:47:46Z


Imagine seeking health advice from an AI and ending up in the hospital! That's exactly what happened to a 60-year-old man who took a suggestion from ChatGPT a bit too literally. What began as a simple dietary question led him down a dark path that ended in severe health consequences.

This bizarre incident was reported in Annals of Internal Medicine: Clinical Cases, a journal of the American College of Physicians, and it raises a critical question: how much should we trust artificial intelligence when it comes to our health?

After learning about the potential dangers of table salt, or sodium chloride, the man turned to ChatGPT for alternatives. The AI suggested replacing it with sodium bromide, a compound that was once used in sedative medications but is now recognized as toxic when ingested in large quantities. As you can imagine, this was not the best course of action.

For three months, the man faithfully followed this advice, unknowingly exposing himself to the cumulative effects of bromide. Over time, he began experiencing alarming neuropsychiatric symptoms, including paranoia and hallucinations, and he developed skin issues as well. He even came to suspect his neighbor was poisoning him, a sign of how deeply the bromide had affected his mental state.

Once he was admitted to the hospital, doctors concluded he was suffering from bromism, a condition caused by prolonged exposure to bromide. The diagnosis caught everyone off guard, especially since the man had no prior psychiatric or medical history to suggest such a reaction.

Fortunately, with proper treatment involving fluids and electrolytes, he began to stabilize. After spending three weeks in recovery, he was finally discharged, with his mental state back to normal. However, the implications of this incident linger, particularly around the safety of relying on AI recommendations.

OpenAI, the developer behind ChatGPT, has made clear in its terms of service that the AI is not designed for medical diagnosis or treatment. Still, the question remains: should we be more cautious in how we interpret and act on AI-generated advice?

Angela Thompson

Source of the news: iHeart
