ChatGPT can give you medical advice. Should you take it?



An artist in Germany who liked to paint outdoors showed up at the hospital with insect bites and a cluster of symptoms that doctors couldn't quite piece together. After a month and several unsuccessful treatments, the patient began entering his medical history into ChatGPT. It offered a diagnosis: tularemia, also known as rabbit fever. The chatbot was right, and the case was later written up in a peer-reviewed medical study.

Around the same time, another study described a man who showed up at a hospital in the United States with signs of psychosis, paranoid that his neighbor had poisoned him. It turned out the patient had asked ChatGPT for alternatives to sodium chloride, or table salt. The chatbot suggested sodium bromide, which is used to clean pools. He had been eating the toxic substance for three months, and it took three weeks in a psychiatric unit to stabilize him.

You're probably familiar with the ritual of turning to Google about a mysterious ailment. You search your symptoms online, sometimes find helpful advice, and sometimes get pulled into a vortex of dread, convinced you have a rare, undiagnosed form of cancer. Thanks to the miracle that is generative AI, you can now take that process even further. Meet Dr. ChatGPT.

AI chatbots are an appealing stand-in for a human doctor, especially given the ongoing physician shortage and the broader obstacles to accessing health care in the US.

ChatGPT is not a doctor, just as Google is not a doctor. Searching for medical information on either platform is as likely to lead you to the wrong conclusion as to the correct diagnosis. But unlike Google Search, where users simply look up information, ChatGPT and other large language models (LLMs) are built for people to talk to. They're designed to be accessible, engaging, and always available. That makes AI chatbots an appealing stand-in for a human doctor, especially given the ongoing physician shortage and the broader obstacles to accessing health care in the United States.

As the rabbit fever anecdote shows, these tools can also take in all kinds of data and, having been trained on mountains of medical journals, sometimes arrive at expert-level conclusions that doctors missed. Or they can dispense truly terrible medical advice.

There's a difference between asking a chatbot for medical advice and talking to it about your health in general. Done right, a conversation with ChatGPT could lead to better conversations with your doctor and better care. Just don't let the AI talk you into eating pool cleaner.

The right and wrong ways to talk to Dr. ChatGPT

Plenty of people talk to ChatGPT about their health. One in six adults in the US say they use AI chatbots for medical advice at least monthly, according to a 2024 KFF survey. A majority of them aren't confident in the accuracy of the information the bots provide, and frankly, that skepticism is warranted given LLMs' stubborn tendency to hallucinate and the harm that bad health information can do. The real challenge for the average user is knowing how to tell fact from fabrication.

“To be honest, I think people have to be very careful,” said Dr. Roxana Daneshjou, a professor and AI researcher at the Stanford School of Medicine. “When it's right, it does a pretty good job, but when it's wrong, it can be pretty catastrophic.”

Chatbots also have a tendency toward sycophancy, or eagerness to please, which means they could steer you in the wrong direction if they think that's what you want.

The situation is precarious enough, Daneshjou added, that she encourages patients to go to Dr. Google instead, which surfaces trustworthy sources. The search giant worked with experts to verify information about conditions and symptoms after the rise of so-called “cyberchondria,” the health anxiety made possible by the internet.

Strictly speaking, that condition is much older than Google. People have been looking online for answers to their health questions since the Usenet days of the 1980s, and by the mid-2000s, eight in ten people were using the internet to search for health information. Regardless of their reliability, chatbots are poised to field more and more of these questions. Google even places its problematic AI-generated results for medical questions above the vetted results from its symptom checker.

If you have a list of things to ask your doctor about, ChatGPT can help you formulate your questions.

If you skip the symptom-checking side of things, however, tools like ChatGPT can be very helpful when you just want to learn more about what's going on with your health, based on what your doctor has already told you, or to better understand your jargon-filled medical notes. Chatbots are built for conversation, and they're good at it. If you have a list of things to ask your doctor about, ChatGPT can help you formulate your questions. If you've gotten some test results back and need to decide on the best next steps with your doctor, you can rehearse that conversation with a chatbot without actually asking the AI for advice.

In fact, there's some evidence that ChatGPT is better at this. One 2023 study compared real doctors' answers to health questions posted on a Reddit forum with AI-generated answers produced when a chatbot was asked the same questions. Health care professionals then evaluated all of the responses and found the chatbot-generated ones to be both higher quality and more empathetic. That's not the same as a doctor sitting in the same room as a patient and discussing their health, though. Now is a good time to point out that patients get an average of only 18 minutes with their primary care doctor on a given visit. If you only go once a year, that's not much time to talk to a doctor.

You should also know that, unlike your human doctor, ChatGPT is not HIPAA compliant. Chatbots in general have very few privacy protections. That means you should expect any health information you upload to be stored in the AI's memory and used to train large language models in the future. Theoretically, it's also possible that your data could show up in an output for someone else. There are more private ways to use chatbots, but even then, the hallucination problem and the potential for disaster remain.

The future of bot-based health care

Even if you're not using AI to unravel medical mysteries, there's a good chance your doctor is. According to a 2025 report from Elsevier, about half of clinicians said they had used an AI tool for work, and slightly more said these tools save them time. One in five said they had used AI for a second opinion on a complex case. That doesn't necessarily mean your doctor is pulling up ChatGPT to figure out what your symptoms mean.

Doctors were using AI-powered tools to help with everything from diagnosing patients to taking notes well before ChatGPT came along. These include clinical decision support systems built specifically for physicians, and off-the-shelf chatbots may actually enhance those existing tools. A 2023 study found that doctors working alongside ChatGPT performed only slightly better at diagnosing test cases than those working on their own. Interestingly, ChatGPT on its own performed best of all.

That study made headlines, probably because it suggested AI chatbots are better than doctors. One of its co-authors, Dr. Adam Rodman, suggests this wouldn't necessarily be the case if doctors were more open to listening to ChatGPT instead of assuming the chatbot is wrong whenever it disagrees with their conclusions. Sure, the AI can hallucinate, but it can also spot connections that humans might have missed. See the rabbit fever case above.

“Patients need to talk to their doctors about their LLM use, and frankly, doctors should talk to their patients about their own LLM use.”

“The average doctor has a sense of when something is hallucinating or going off the rails,” said Rodman, a hospitalist at Beth Israel Deaconess Medical Center and an instructor at Harvard Medical School. “I don't know that the average patient necessarily does.”

In the short term, don't expect Dr. ChatGPT to show up at your local clinic. It's more likely that AI will work as a scribe, saving your doctor time on note-taking, and perhaps one day analyzing that data to help your doctor out. Your doctor might also use AI to respond to patient messages faster. In the near future, it's possible that better AI tools will lead more clinicians to use AI for diagnoses and second opinions. That still doesn't mean you should rush to ChatGPT with your urgent medical concerns. But if you do, tell your doctor how it went.

“Patients need to talk to their doctors about their LLM use, and frankly, doctors should talk to their patients about their own LLM use,” said Rodman. “If we both just come out of the shadows and talk to each other, we'll have more productive conversations.”

A version of this story was also published in the User Friendly newsletter. Sign up here so you don't miss the next one!
