
Doctors work with ChatGPT to handle high volumes of patient messages

Healthcare systems are turning to artificial intelligence to tackle one of doctors’ biggest challenges: responding quickly to patients’ messages, including questions about their care, while still seeing a steady flow of patients.

Physicians at three different health care systems across the United States are using a “generative” AI tool based on ChatGPT to automatically draft answers to patient questions about symptoms, medications, and other medical issues. The goal is to let physicians spend less time on written communications, see more patients in person, and focus on more complex medical tasks.

University of California, San Diego Health and UW Health have been piloting the tool since April. Stanford Health Care, considered one of the leading US hospital systems, will make the AI tool available to some of its doctors starting next week. At least a dozen doctors are already using it regularly as part of the trials.

Dr. Patricia Garcia, a gastroenterologist at Stanford University who is leading the pilot, told CBS MoneyWatch: “Care teams do not have the capacity to deal with the high volume of patient messages they receive in a timely manner.”

The tool, a HIPAA-compliant version of OpenAI’s GPT language model, integrates into physicians’ inboxes through medical software company Epic’s “MyChart” patient portal, which lets patients send messages to their healthcare providers.

“This could be a great opportunity to support patient care and allow clinicians to engage in more complex interactions,” said Dr. Garcia. “A large language model could be the tool that transforms ‘InBasket’ from a burden to an opportunity.”

The tool is expected to reduce physician clerical work while increasing patient engagement and satisfaction. “If it works as predicted, it’s an all-around win,” she added.

Can AI Show Empathy?

The new generation of AI is no substitute for interacting with a doctor, but research suggests the technology is now sophisticated enough to engage with patients.

Indeed, a recent study published in the journal JAMA Internal Medicine found that ChatGPT’s responses were preferred over physicians’ responses for nearly 200 queries posted on an online social media forum. The authors found that the chatbot’s responses were rated higher for both quality and empathy.

Dr. Christopher Longhurst, author of the study, says tools like ChatGPT show great promise for use in healthcare.

“I think we’re going to see the needle move like never before,” said Longhurst, who is chief medical officer and chief digital officer at UC San Diego Health and also holds a leadership post at the UC San Diego School of Medicine. “Physicians receive a ton of messages. This is especially acute for primary care physicians, and it’s a problem we are trying to solve.”

To be sure, using technology to help doctors work more efficiently and intelligently is nothing revolutionary.

“There are a lot of things we already use in healthcare that help doctors. Electronic medical records have alerts that say, ‘This prescription could overdose the patient.’ There are alarms and all sorts of decision support tools. But it is still doctors who practice medicine,” Longhurst said.


In the UC San Diego Health pilot, a preview of the dashboard displaying patient messages, shared with CBS MoneyWatch, shows how doctors interact with the AI. For example, when a physician opens a message from a patient asking about blood test results, an AI-generated suggested reply appears alongside it. The doctor can choose to use, edit, or discard the suggestion.

GPT can generate what Longhurst calls “useful responses” to queries such as “I have a sore throat.” However, no message is sent to a patient without first being reviewed by a human member of the care team.

In addition, every response drafted with the AI’s help comes with a disclaimer.

“It says something like, ‘Part of this message was generated automatically in a secure environment and was reviewed and edited by your care team,’” Longhurst said. “Our intention is to be completely transparent with our patients.”
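The workflow the pilots describe, in which an AI drafts a reply, a clinician uses, edits, or discards it, and a transparency disclaimer is attached before anything is sent, can be summarized in a short sketch. This is only an illustration of that draft-then-review pattern, not the actual Epic MyChart or OpenAI integration; the function names, the placeholder model call, and the disclaimer wording are all hypothetical.

```python
# Illustrative sketch of the draft-then-review workflow described in the article.
# All names, the placeholder model call, and the disclaimer text are hypothetical.
from dataclasses import dataclass
from typing import Optional

DISCLAIMER = (
    "Part of this message was generated automatically in a secure "
    "environment and was reviewed and edited by your care team."
)

def call_language_model(patient_message: str) -> str:
    """Placeholder for the model call; the pilots use a HIPAA-compliant GPT
    behind the health system's own software, not this stub."""
    return "Thank you for your message. Your results are within the normal range."

@dataclass
class Draft:
    patient_message: str
    suggested_reply: str        # AI-generated text shown only to the clinician
    approved_reply: str = ""    # filled in only after human review

def suggest_reply(patient_message: str) -> Draft:
    """Generate a draft answer; it is never sent without review."""
    return Draft(patient_message, call_language_model(patient_message))

def clinician_review(draft: Draft, action: str, edited_text: str = "") -> Optional[str]:
    """The clinician can use, edit, or discard the suggestion.
    Only approved messages get the disclaimer and go out to the patient."""
    if action == "use":
        draft.approved_reply = draft.suggested_reply
    elif action == "edit":
        draft.approved_reply = edited_text
    else:  # "discard" -- nothing is sent to the patient
        return None
    return f"{draft.approved_reply}\n\n{DISCLAIMER}"

# Example: the doctor edits the AI draft before it is sent.
draft = suggest_reply("Can you explain my blood test results?")
outgoing = clinician_review(draft, "edit", "Your cholesterol is slightly elevated; let's discuss it at your next visit.")
```

The key design point, as the pilots describe it, is that the model’s output is advisory: nothing reaches the patient unless a clinician explicitly approves or edits it.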

So far, patients seem to think it’s working.

“Patients feel that we’ve been trying to help their doctors respond,” he said. “These are edited responses.”

“Caution is required”

Despite AI’s potential to improve how clinicians and patients communicate, there are various concerns and limitations to the use of chatbots in healthcare.

First, even today’s most advanced technology can malfunction or “hallucinate,” offering random or even incorrect answers to people’s questions, which can pose a serious risk in a care setting.

Dr. Garcia of Stanford said: “We are dealing with real patients with real medical concerns, and there are concerns about [large language models’] confabulations or hallucinations. So it is very important that the first users nationwide proceed with a very cautious and conservative eye.”

Second, it remains unclear whether chatbots are suited to answering the wide range of questions patients may have, including those related to prognosis and treatment, test results, insurance and payment considerations, and many of the other common issues that come up when seeking care.

A third concern is how current and future AI products will ensure patient privacy. With cyberattacks on healthcare facilities on the rise, greater use of technology in healthcare could lead to a proliferation of digital data containing sensitive medical information. That raises pressing questions about how such data is stored and protected, and what rights patients have when interacting with chatbots about their care.

“[T]he use of AI assistants in healthcare raises various ethical concerns that need to be addressed before implementing these technologies, including the accuracy of AI-generated content and the need for human review of potentially false or fabricated information,” the JAMA study notes.

Source: https://www.cbsnews.com/news/chatgpt-artificial-intelligence-ai-health-care-patient-messages/
