ChatGPT May Have Potential to Help Educate Patients With Cirrhosis and Hepatic Cancer in Basic Knowledge, Lifestyle, and Treatment Domains

According to a novel study published by Yeo et al in Clinical and Molecular Hepatology, the artificial intelligence (AI) chatbot ChatGPT may help improve health outcomes for patients with cirrhosis and hepatic cancer by providing easy-to-understand information about basic knowledge, lifestyle modifications, and treatment options for these conditions.

The new findings highlight ChatGPT’s potential to play a role in clinical practice.

Background

“Patients with cirrhosis and/or [hepatic] cancer and their caregivers often have unmet needs and insufficient knowledge about managing and preventing complications of their disease,” explained co–corresponding study author Brennan Spiegel, MD, MSHS, Professor of Medicine at the David Geffen School of Medicine at the University of California, Los Angeles, and Director of Health Services Research at Cedars-Sinai Medical Center.

Patients diagnosed with hepatic cancer and cirrhosis often require extensive treatment that can be complex and challenging to manage.

“The complexity of the care required for this patient population makes [empowering patients] with knowledge about their disease crucial for optimal outcomes,” underscored co–corresponding author Alexander Kuo, MD, Professor of Medicine and Medical Director of Liver Transplantation Medicine at Cedars-Sinai Medical Center. “While there are currently online resources for patients and caregivers, the literature available is often lengthy and difficult for many to understand, highlighting the limited options for this group,” he added.

Investigators highlighted that personalized education AI models such as ChatGPT could help increase patient knowledge.

The novel chatbot, whose name is short for generative pretrained transformer, has quickly become popular for its human-like text conversations, in which users can input any prompt to generate a response based on the information stored in its database.

It has already shown some potential for medical professionals by writing basic medical reports and correctly answering examination questions designed for medical students. 

“ChatGPT has shown to be able to provide professional, yet highly comprehensible responses,” detailed first study author Yee Hui Yeo, MD, a clinical research fellow in the Karsh Division of Gastroenterology and Hepatology at Cedars-Sinai Medical Center. “However, this is one of the first studies to examine the ability of ChatGPT to answer clinically oriented, disease-specific questions correctly and compare its performance to physicians and trainees.”

“We found ChatGPT—while it has limitations—can help empower patients and improve health literacy for different populations,” Dr. Spiegel emphasized.

Study Methods and Results

In the new study, the investigators presented ChatGPT with 164 frequently asked questions in five categories to evaluate the accuracy of the chatbot’s knowledge of both cirrhosis and hepatic cancer. The ChatGPT answers were then graded independently by two liver transplant specialists. 

Each question was posed twice to ChatGPT and was categorized as either basic knowledge, diagnosis, treatment, lifestyle, or preventive medicine. 

After completing their study, the investigators reported that:

  • ChatGPT answered about 77% of the questions correctly—providing high levels of accuracy for 91 questions from a variety of categories.  
  • The specialists grading the responses noted that 75% of the responses for basic knowledge, treatment, and lifestyle were “comprehensive” or “correct but inadequate.”
  • The proportion of responses that were “mixed with correct and incorrect data” was 22% for basic knowledge, 33% for diagnosis, 25% for treatment, 18% for lifestyle, and 50% for preventive medicine. 

ChatGPT also provided practical and useful advice to patients and caregivers regarding the next steps in adjusting to a new diagnosis. 

Conclusions

The investigators underlined, however, that the study left no doubt that advice from a physician was superior to advice from ChatGPT. 

“While the model was able to demonstrate strong capability in the basic knowledge, lifestyle, and treatment domains, it [fell short in its] ability to provide tailored recommendations according to the region where the inquirer lived,” stressed Dr. Yeo. “This is most likely due to the varied recommendations in [the hepatic] cancer surveillance interval and indications reported by different professional societies. But we are hopeful that it will be more accurate in addressing the questions according to the inquirers’ location [in the future].”

“More research is still needed to better examine the tool in patient education, but we believe ChatGPT to be a very useful adjunctive tool for physicians—not a replacement—but [one] that provides access to reliable and accurate health information that is easy for many to understand. We hope that this can help physicians to empower patients and improve health literacy for patients facing challenging conditions such as cirrhosis and [hepatic] cancer,” Dr. Spiegel concluded.

Disclosure: For full disclosures of the study authors, visit e-cmh.org.

The content in this post has not been reviewed by the American Society of Clinical Oncology, Inc. (ASCO®) and does not necessarily reflect the ideas and opinions of ASCO®.