A man who followed the advice of an AI chatbot developed delusions that his neighbor was trying to poison him and ended up hospitalized.

As AI technology develops, more and more people are using AI chatbots on a daily basis, with some even asking them for medical information. In one reported case, however, a man who blindly accepted the advice of an AI chatbot became paranoid, believing that his neighbor was trying to poison him, and was ultimately hospitalized.
A Case of Bromism Influenced by Use of Artificial Intelligence | Annals of Internal Medicine: Clinical Cases

Man Hospitalized With Psychiatric Symptoms Following AI Advice : ScienceAlert
https://www.sciencealert.com/man-hospitalized-with-psychiatric-symptoms-following-ai-advice
A 60-year-old man living in the United States was health-conscious and had been experimenting with various dietary restrictions for some time. One day, drawing on his experience studying nutrition in college, he decided to conduct a personal experiment: eliminating salt (sodium chloride) from his diet.
When the man asked ChatGPT what he could use in place of salt, ChatGPT suggested sodium bromide as a substitute for sodium chloride. The man then obtained sodium bromide online and incorporated it into his diet.
Sodium bromide can indeed be used in place of sodium chloride, but for purposes such as cleaning a bathtub, not for seasoning food. ChatGPT neither explained this important context nor asked the man for what purpose he wanted to replace sodium chloride.

After following ChatGPT's advice and adding sodium bromide to his daily diet, the man visited the emergency room about three months later, convinced that his neighbor was trying to poison him.
According to the doctors who reported the case, the man exhibited paranoid delusions as well as auditory and visual hallucinations during the first 24 hours of his hospitalization, and he attempted to escape from the hospital, forcing doctors to treat him in a psychiatric ward. After receiving antipsychotic medication and intravenous fluids, the man regained his composure and explained his diet and his consultations with ChatGPT.
Based on the man's account and the results of various laboratory tests, doctors concluded that he was suffering from 'bromism,' or bromide poisoning.
Bromism is caused by long-term intake of bromine compounds such as potassium bromide, lithium bromide, and bromovalerylurea. Symptoms include vomiting, constipation, skin inflammation and erythema, as well as psychiatric symptoms such as delirium and hallucinations.
Bromine compounds were once included in medications used to treat insomnia and hysteria, and it is estimated that up to 8% of psychiatric hospital admissions in the early 20th century were due to bromism. As these medications were phased out, cases of bromism dropped dramatically in the 1970s and 1980s.

The patient was discharged without major issues after three weeks of treatment. The science media outlet ScienceAlert pointed out that 'the main concern in this case study is not that an old disease has recurred, but that emerging AI technologies have yet to replace human expertise in areas where it truly matters.'
The doctors noted, 'It is important to note that AI systems such as ChatGPT may generate scientifically incorrect content, lack the ability to critically discuss their results, and ultimately contribute to the spread of misinformation.' They added, 'It is highly unlikely that a medical professional would suggest sodium bromide to a patient looking for an alternative to sodium chloride.'
AI chatbots, which are accessible 24 hours a day and give empathetic answers, are becoming a convenient substitute for real people. However, some users have become absorbed in spiritual experiences and religious delusions through AI, and AI has been accused of leading some users into conspiracy theories.
AI is causing a surge in patients with 'ChatGPT-induced psychosis', where people experience spiritual experiences and religious delusions - GIGAZINE
