A 60-year-old man was hospitalized with a rare condition known as bromism (bromide poisoning) after following advice from OpenAI's AI chatbot "ChatGPT", which suggested he replace table salt (sodium chloride) with the toxic compound sodium bromide.
According to a case report published in the medical journal "Annals of Internal Medicine", the man had asked the chatbot for a healthy alternative to table salt, and the response was that sodium bromide could be an effective substitute, with no warning about its dangers and no indication that it is not intended for human consumption.
This compound, whose chemical formula is NaBr, is used in several industries, most notably in pesticides and in pool and hot-tub treatment products, and in veterinary medicine as an anticonvulsant for dogs; it is toxic to humans in large or sustained doses and is subject to strict regulation in many countries.
According to the doctors, chronic exposure to the substance triggered acute psychotic symptoms in the man, and once his condition had stabilized he told hospital staff that the advice had come from "ChatGPT".
In an attempt to verify the incident, 404 Media ran similar conversations with "ChatGPT" and found that, when asked what chloride can be replaced with, the model readily suggested compounds such as sodium bromide, replying:
"You can often replace it with other halide ions such as: sodium bromide (NaBr)"
Although the model sometimes asks for additional context after its initial response, it issues no warning about the substance's toxicity and does not appear to connect sodium chloride with its everyday role as the table salt used in food.
This incident has reignited the debate over the risks of relying on artificial intelligence for medical and dietary advice without human oversight, and has led some to question whether these models are ready to handle sensitive inquiries related to human health.