Man develops rare condition after ChatGPT query about giving up table salt



A US medical journal has warned against using ChatGPT for health information after a man developed a rare condition following an interaction with the chatbot about removing table salt from his diet.

An article in the Annals of Internal Medicine reported a case in which a 60-year-old man developed bromism, also known as bromide toxicity, after consulting ChatGPT.

The article described bromism as a “well-recognised” syndrome in the early 20th century that was thought to have contributed to almost one in 10 psychiatric admissions at the time.

The patient told doctors that, after reading about the negative effects of sodium chloride, or table salt, he consulted ChatGPT about eliminating chloride from his diet and began taking sodium bromide over a three-month period. This was despite reading that “chloride can be swapped with bromide, though likely for other purposes, such as cleaning”. Sodium bromide was used as a sedative in the early 20th century.

The article’s authors, from the University of Washington in Seattle, said the case highlighted how the use of artificial intelligence could contribute to the development of avoidable adverse health outcomes.

They added that it was not possible to determine what advice the man had received because they could not access his ChatGPT conversation log.

When the authors themselves asked ChatGPT what chloride could be replaced with, the response also mentioned bromide, provided no specific health warning and did not ask why they were seeking such information, “as we presume a medical professional would do”, they wrote.

The authors warned that ChatGPT and other AI apps could “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation”.

OpenAI, the company behind ChatGPT, announced an upgrade to the chatbot last week and claimed that one of its biggest strengths was health. It said ChatGPT, now powered by the GPT-5 model, would be better at answering health-related questions and would also be more proactive in flagging potential concerns, such as serious physical or mental illness.

However, it stressed that the chatbot was not a substitute for professional help. The chatbot’s guidelines also state that it is not “intended for use in the diagnosis or treatment of any health condition”.

The journal article, which was published last week before the launch of GPT-5, said the patient had used an earlier version of ChatGPT.

While acknowledging that AI could be a bridge between scientists and the public, the article said the technology also carried the risk of promoting “decontextualised information”, and that it was highly unlikely a medical professional would have suggested sodium bromide to a patient asking for a replacement for table salt.

As a result, the authors said, doctors would need to consider the use of AI when checking where their patients had obtained information.

The authors said the bromism patient had presented at a hospital claiming that his neighbour might be poisoning him. He also said he had multiple dietary restrictions. Although thirsty, he was noted to be paranoid about the water he was offered.

He tried to escape from the hospital within 24 hours of being admitted and, after being sectioned, was treated for psychosis. Once the patient had stabilised, he reported several other symptoms indicative of bromism, such as facial acne, excessive thirst and insomnia.


