9 Doctor-Approved Ways to Use ChatGPT for Health Advice

In August, Lance Johnson woke up in the middle of the night with excruciating stomach pain in his lower right side. He initially blamed it on the pizza and ice cream he had enjoyed the night before. But five sleepless hours later, the 17-year-old from Phoenix was still suffering, so he decided to consult the nearest expert: ChatGPT.

“I described what I’d eaten the night before and where the pain was, and I was like, ‘Do you think it’s just my stomach?’ And then it said that it sounded like it was appendicitis based on how long it was lasting and where it was,” Johnson says. “I kept asking it more questions, like could it be anything else? And it said, ‘Based on what you described, you should get it checked out.’”

Johnson followed the bot’s advice—and, sure enough, the doctors at the emergency room soon said that he did, in fact, have appendicitis and needed immediate surgery. When he told them he had suspected as much because of ChatGPT’s insights, “I think they were kind of surprised that it would answer something like that—that it would diagnose me before they did,” Johnson said during a recent Zoom interview alongside his parents. “I didn’t know anything about appendicitis before. I didn’t even know it was in the bottom right.”

Similar scenarios are playing out across the country as ChatGPT usurps Dr. Google. One recent survey found that 1 in 7 adults over age 50 use AI to seek health information, while 1 in 4 of those under 30 do so. Usage is particularly prevalent in areas with limited access to health care providers. While there are plenty of potential risks—like receiving inaccurate, outdated, or generic information—some doctors say AI platforms can be helpful, if you know how to use them the right way.

“There is 100% a place for these tools to enrich patients’ care journeys,” says Dr. Adam Rodman, an assistant professor at Harvard Medical School and a general internist at Beth Israel Deaconess Medical Center, where he is director of AI programs for the Carl J. Shapiro Center for Education and Research. “Large language models (LLMs) have very powerful abilities in some domains, but they can fail dramatically in others—you don’t want to rely on them as a doctor. However, LLMs are, I think, the best tool to help you understand your health right now.”

We asked providers to share the smartest ways patients are using AI platforms like ChatGPT—and how they might benefit your health, too.

Ask it medical facts

LLMs are a helpful way to get answers to fact-based queries—“what do plasma cells do?”—and questions about disease processes: “What happens when they mutate and become cancerous?”

“It’s not specific to a scenario,” says Dr. Adeel Khan, a hematologist-oncologist and epidemiologist who’s an assistant professor of medicine and public health at the University of Texas Southwestern Medical Center. “It’s general, and there’s a textbook answer.” An explanation of plasma cells’ purpose, for example, doesn’t require any context about individual circumstances; the answer will be the same regardless of your age, gender, and general health condition.

Read More: The 4 Words That Drive Your Doctor Up the Wall

Khan, who treats a rare form of cancer, has also seen newly diagnosed “tech-savvy patients” ask ChatGPT questions like this: “What is myeloma?” “What are common side effects of lenalidomide?” And, “What can a patient with myeloma expect?”

He prefers this type of usage to seeking personalized medical advice. “For now, AI should be used to understand medical and treatment facts broadly,” he says. If you do turn to the tool for more individual-based insights, he cautions, use whatever you learn as a supplement to—not a replacement for—actual medical care. The information you get from ChatGPT can guide your next conversation with a doctor, he adds, but it shouldn’t be treated as the final word on your condition.

Plug in lots of details

Dr. Colin Banas suggests querying LLMs like this: “I’m a 48-year-old male who’s completed X level of education, and I need to understand what this diagnosis is and what potential treatment options might be.”

“I think that’s entirely fair game because it will give you good answers” in a comprehensible way, says Banas, an internist and chief medical officer of DrFirst, a health care technology company. The more details and context you provide the tool—including your relevant health history or family history of a certain condition—the better equipped it will be to dispense information that’s actually pertinent. But don’t forget:

Be mindful of privacy concerns

Some people have uploaded medical test results—like EKG scans, brain MRIs, and X-rays—or even their entire medical record into an LLM like ChatGPT for a “second-opinion”-esque analysis. While it can be an interesting exercise that provides fodder for conversations with your doctor, Rodman worries about the privacy implications. “I think everyone needs to know that if you’re putting it into ChatGPT, that data is going straight to OpenAI,” he says. “You are giving a tech company your personal health information, and that’s probably not a good thing.”

Plus, he adds, vision language models—a type of AI designed to understand information based on both image and text input—are not yet as accurate as text-based large language models. “Vision language models are not actually that good at image interpretation themselves,” Rodman says. “They’re usually exploiting text. If you put an EKG in, it’s mostly reading the text at the top to help interpret it, as well as the other context you’ve given.” While he understands the urge to get a second opinion on potentially confusing results, these tools are “really unreliable,” he says, “and at this point, I’m comfortable definitively saying not to do it.”

Make sure to ask it unbiased questions

As you research, you can take steps to lower the chances of receiving biased information. For example, Khan recently asked ChatGPT why chemotherapy is preferred over immunotherapy for a certain type of cancer. That wording, he says, was intentionally biased: It suggested that chemo was the superior choice, which isn’t necessarily true, and ChatGPT responded accordingly, ticking off chemo’s advantages. A better approach, Khan says, is to ask the tool whether chemotherapy or immunotherapy is preferred, and to explain the pros and cons of each. AI tools “aren’t foolproof,” he says. “How it’s framed makes a difference.”

Let it help you decode medical jargon

AI tools like ChatGPT are “really good at breaking down doctor-speak,” Banas says. “Doctors use a lot of advanced terminology and abbreviations—we can’t help it. It’s part of years and years of training, but patients don’t always understand.” If you head home feeling mystified, plug your questions into your favorite AI platform, he recommends.

You might, for example, be stumped by an oncologist’s repeated use of the word “grade.” Ask ChatGPT what it means, and within seconds, you’ll have a few brief, easy-to-understand paragraphs explaining that grade refers to “how severe, advanced, or abnormal something is when seen under the microscope or assessed clinically” and how it differs from condition to condition.

At the end, you’ll see a message like this from the bot: “Would you like me to also explain how grade differs from stage, since those terms are often confused?” From there, you can continue to follow the prompts until you’re ready to wrap up your impromptu session of medical school.

Use it to prepare for doctor’s appointments

Tools like ChatGPT can help you formulate better questions to take to your doctor. “Patients use it to prepare for their visits ahead of time,” Banas says. “They’ll say, ‘Here are my symptoms; what are some questions I should ask my doctor?’ Or, ‘What are some things my doctor should be thinking of?’”

For example, say you input this query into ChatGPT: “I’ve had a headache, nausea, and fatigue for two weeks. What questions should I ask my doctor?” The tool will advise you to seek medical care in a timely manner, and then suggest “focused questions” that will help you “get the clearest answers,” broken into categories like symptoms, tests, treatment, and next steps.

Read More: 8 Symptoms Doctors Often Dismiss As Anxiety

Among the suggestions: “Could this be related to dehydration, infection, a migraine disorder, or something more serious?” “What initial tests should we do?” “Should we check for anemia, thyroid function, or other metabolic issues?” “What are safe options to manage my headache and nausea in the meantime?” “Should I avoid certain medications or foods until we know more?” “Should I be referred to a neurologist, endocrinologist, or another specialist?”

If you find the questions useful, Banas recommends writing them down or taking screenshots you can show your doctor.

Let it help you understand your care plan

Maybe your doctor just told you that you have gout and prescribed a high dose of ibuprofen and colchicine. When you get home, you might realize you can’t remember the side effects they listed while you were absorbing the news. LLMs can help. Rodman suggests plugging in a prompt like this: “My doctor thinks I have gout. This is what I’ve been prescribed. What are things I need to look out for? And what should make me call my doctor again?”

Use it to brainstorm lifestyle modifications

Shriya Boppana, an MBA candidate in North Carolina, credits ChatGPT with helping her manage her eczema, which is triggered by skin and makeup products. Every time she tries a new product, she uploads its information into the AI tool and documents whether it caused a reaction. “If it does, I ask what ingredient might have caused the reaction so I can stay away,” she says. “It’s a running list, and it’s helped my skin stay super clear.”

While Gigi Robinson, a creator-economy strategist in New York, doesn’t use ChatGPT to replace medical advice for her endometriosis, she says it’s been a “powerful tool for empowerment and mindset shifts.” When she’s navigating flare-ups, she asks it to help her brainstorm ways to adjust her work schedule or manage projects so she can still be productive while respecting her body’s needs. “It’s helped me reframe situations that would normally feel limiting into opportunities to work smarter,” she says. Robinson also leans on ChatGPT to talk through lifestyle adjustments like meal prep ideas, travel accommodations, and communication strategies for explaining her health needs to clients and colleagues.

Those uses exemplify the positive potential of AI tools. “Information is power,” says Lora Sparkman, a longtime registered nurse who’s now a clinical strategist at Relias, a health care tech and education company. “We’re not looking to replace the health care team, but this better informs the consumer on what they’re interested in. Patients have these tools at their fingertips, and they’re going to lead to a shift in conversations.”

Keep your doctor in the loop

If you don’t get better following your provider’s treatment plan, Rodman is OK with the idea of uploading the documentation along with a prompt like this: “I didn’t get any better; what else could this be?” “And then when you go see your doctor [for a follow-up], be honest about your LLM use and have an open conversation with them,” he says. “You should not get a second opinion from the AI and then act on that without talking to a health provider.”

Read More: 10 Questions You Should Always Ask at Doctors’ Appointments

If you and your doctor disagree about something related to your care—and their guidance contradicts or overlooks what you learned online—you could even show them your conversation with the chatbot, Rodman says. Many will be open to taking the time to talk through it with you. “Honesty and transparency are the best way to have a good clinical conversation with your doctor,” he adds.

It also makes sense to experiment with your favorite AI platform to figure out what kind of usage feels the most helpful. “Chatbots don’t come with a user’s manual,” Rodman says. “They couldn’t, because everyone uses them differently, and they’re kind of unpredictable. The only way you’re going to get good at them is by experimenting.”