Man’s Rare Condition Linked to ChatGPT Health Advice

You may have heard of the man who recently developed bromism, a very rare medical condition, after asking ChatGPT, the AI chatbot recently upgraded to OpenAI's GPT-5 model, for dietary recommendations. The incident points to deeper issues with the reliability of AI-generated health information. That's particularly concerning given how often the technology has been lauded for its supposedly superior capabilities in exactly that domain.

The patient's first question to ChatGPT was whether he should remove all table salt (sodium chloride) from his diet. After that exchange he began self-treating with sodium bromide, believing that bromide could substitute for chloride. Within three months the bromism had progressed to a profound level, a pathological state that was typical of psychiatric hospital admissions in the first half of the 20th century. The man eventually developed serious complications and had to be hospitalized; within a day of admission, he had already attempted to escape.

Bromism occurs when excess bromide accumulates in the body. Chronic over-consumption can cause serious symptoms, including confusion, lethargy, and psychosis. The patient was eventually sectioned and treated for psychosis, underscoring the dangers of trusting AI-generated health advice.

ChatGPT now runs on the more sophisticated GPT-5 model, an upgrade that is said to improve its ability to answer nuanced health-related questions and to proactively flag potential health concerns. Even so, a new commentary in the Annals of Internal Medicine raises an alarm, warning people not to rely on ChatGPT for evidence-based medical advice. The authors emphasized that AI can “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

When the authors posed a similar question themselves, they noted that the chatbot “did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do.” – Authors of the Annals of Internal Medicine commentary

In response to this incident and other recent warnings, experts recommend extreme caution when seeking health information from AI models. The patient's case is a reminder that however quickly the technology evolves, its advice still needs human scrutiny. In our rush to embrace AI, we shouldn't lose sight of what makes healthcare human.

Sodium bromide was a popular sedative in the early 1900s; at the height of its use, it was thought to be responsible for almost one in ten psychiatric admissions. That history underscores the importance of caution before ingesting unfamiliar substances on the strength of AI-generated suggestions.