When ChatGPT became 'Yamraj,' AI sent a 60-year-old man to the ICU, and a dangerous secret was revealed

Experts warn that AI tools like ChatGPT lack medical expertise and cannot replace doctors, making human consultation essential for safe and reliable healthcare decisions.

ChatGPT Dangers: An incident involving a 60-year-old man in New York shows that blindly trusting AI can be dangerous to your health. Diet advice he received from ChatGPT not only ruined his health but landed him in a hospital bed. Matters turned serious when the man, without seeking any medical advice, strictly followed the salt-elimination plan ChatGPT suggested.

He all but removed table salt from his diet. As a result, his blood sodium levels dropped dangerously, and he developed 'hyponatremia,' a deficiency of sodium in the blood.

Sodium bromide instead of table salt

This case study, published in Annals of Internal Medicine: Clinical Cases, a journal of the American College of Physicians, has shocked medical experts around the world. On ChatGPT's advice, the man started using sodium bromide instead of table salt, a chemical that was used in medicines a century ago but is now considered toxic. After consuming it for three months, his condition had deteriorated so badly that he developed mental confusion, paranoia, and severe dehydration.

The danger started with a 'health tip'

According to media reports, the man asked ChatGPT how he could remove table salt (sodium chloride) from his diet. The AI suggested sodium bromide as a substitute. The compound was used in sedative medicines a century ago, but it is now known to be poisonous in anything more than trace amounts. Without consulting a doctor, he bought it online and began using it in his food.

Worsening Health And Symptoms

Soon he began showing symptoms such as hallucinations, excessive thirst, and paranoia about contaminated water. He also developed red spots and acne-like rashes on his skin, classic signs of 'bromism,' or chronic bromide poisoning. At the hospital, doctors found he was suffering from bromide toxicity and a severe electrolyte imbalance.

Gradual return from ICU

Treatment focused on rehydration and correcting his sodium and chloride levels. After three weeks in the ICU, his condition gradually improved, and his electrolyte levels had returned to normal by the time of discharge. The case is one of the rare documented instances of bromide toxicity in the 21st century.

Blindly trusting AI is dangerous

The report's authors and medical experts stress that AI tools such as ChatGPT can never substitute for professional medical advice. OpenAI's own terms of use state that its output should not be relied on as the sole basis for diagnosis or treatment. Experts say AI can be useful for general information, but for health decisions a doctor's opinion is essential.