
AI Can Make Healthcare More Empathic

Doctors and droids team up for better patient care.

Key points

  • Stressed and short of time, health professionals can find it difficult to maintain an empathic bedside manner.
  • Online, text-based consultations make it even more challenging to strike the right tone with patients.
  • AI can model empathy in real time for use during online conversations.
Image: Consultations are increasingly online. Source: Photo by National Cancer Institute on Unsplash

Doctors, nurses, and healthcare workers of all kinds are in short supply around the world. Even in wealthy countries the gaps are stark: 30% of Americans don’t have a primary care doctor, and in the UK, 42% of general practitioners (GPs) say they are likely to leave the profession within five years, with one in ten planning to quit within a year. It’s getting harder and harder to see a doctor in person, and when you do, appointments can feel rushed and impersonal as doctors grow increasingly overloaded and stressed.

There are many ways that artificial intelligence (AI) could ease the growing burden on doctors. It’s easy to see how administrative work and information management could be streamlined. But what about actually dealing with patients?

People have already turned to online forums for medical advice, such as Reddit’s r/AskDocs, which has nearly half a million members. There, questions are answered by verified doctors, but this is still a time-consuming process that takes patience and empathy. Could a chatbot do as well, or even better? To find out, a team of 10 researchers led by John Ayers at the University of California San Diego took 195 randomly selected online questions that had been answered by real doctors and generated alternative replies using ChatGPT. Both sets of answers were then given to three real doctors to judge the ‘quality of the information provided’ and ‘the empathy or bedside manner provided.’
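For readers curious about the mechanics, here is a minimal sketch of how one might generate a chatbot reply to a patient question with a modern LLM API. The model name, prompt wording, and settings are illustrative assumptions on my part, not the study’s actual protocol (the researchers used the ChatGPT interface itself).

```python
# A minimal sketch of generating a chatbot reply to a patient question,
# in the spirit of the Ayers study. Model name, prompt, and settings
# are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_reply(patient_question: str) -> str:
    """Ask a general-purpose LLM for a candidate answer to a medical question."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model, not the one used in the study
        messages=[
            {"role": "system",
             "content": "You are a physician answering a public health question "
                        "clearly, accurately, and with an empathic tone."},
            {"role": "user", "content": patient_question},
        ],
        temperature=0.7,
    )
    return response.choices[0].message.content

print(draft_reply("I've had a persistent headache for three days. Should I see someone?"))
```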

In 79% of cases, the judges preferred the chatbot response. The chatbot responses tended to be longer, of higher quality, and more empathic—10 times more likely to be rated empathic or very empathic than those from real doctors. In contrast, responses from real doctors were three times more likely to be rated as less than acceptable in quality. This may be unsurprising to those of us who regularly chat with Chat, basking in its tireless enthusiasm and endless patience, never worrying if our questions are boring or plain stupid. Still, I think most of us would be uneasy depending on AI for medical advice without some input from a human medical professional.

In education, we are increasingly urging students to work with AI collaboratively (it’s too late just to say no). Could this work for health professionals, too? Ashish Sharma and colleagues at the University of Washington developed HAILEY, an AI designed to give feedback to people providing peer-to-peer mental health support. These peer supporters are often untrained and struggle to achieve and maintain high levels of empathy with those seeking support.

Three hundred peer supporters were given some initial empathy training and then divided into two groups to respond to posters seeking support. One group had the option of asking HAILEY to suggest changes to their draft responses before hitting send. Responders could then accept the changes, edit the response further, and/or ask for more feedback.

The example they give is this seeker post: ‘My job is becoming more and more stressful with each passing day,’ with an initial response of ‘Don’t worry. I’m there for you.’ HAILEY then pops up, asking if the supporter would like some help with that response, and suggests: ‘It must be a real struggle. I’m there for you. Have you tried talking to your boss?’ This altered response engages more with the specifics of the seeker’s problem and encourages further conversation. The suggestions are generated for HAILEY by a specialist program, PARTNER (emPAthic RewriTing in meNtal hEalth suppoRt), designed to edit sentences to be both empathic and conversational.
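To make that loop concrete, here is a toy sketch of this kind of just-in-time feedback, with the human supporter keeping the final say. The function name is hypothetical, and the general-purpose LLM prompt below merely stands in for PARTNER, which is a purpose-trained rewriting model.

```python
# A toy sketch of HAILEY-style feedback: a draft reply goes to a rewriting
# model, and the human supporter decides whether to accept, edit, or ignore
# the suggestion. This is not HAILEY's actual code; a generic LLM call
# stands in for the specialist PARTNER model.
from openai import OpenAI

client = OpenAI()

def suggest_empathic_rewrite(seeker_post: str, draft: str) -> str:
    """Return a more empathic, conversational version of the draft reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed stand-in for a specialist rewriting model
        messages=[{
            "role": "user",
            "content": (
                "Rewrite the support reply so it acknowledges the specific "
                "problem, expresses empathy, and invites further conversation.\n"
                f"Seeker post: {seeker_post}\n"
                f"Draft reply: {draft}"
            ),
        }],
    )
    return response.choices[0].message.content

seeker = "My job is becoming more and more stressful with each passing day."
draft = "Don't worry. I'm there for you."
suggestion = suggest_empathic_rewrite(seeker, draft)

# The supporter stays in control: accept the suggestion as-is or edit it.
final_reply = input(f"Suggested rewrite:\n{suggestion}\nEdit, or press Enter to accept: ") or suggestion
print(final_reply)
```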

The responses written in collaboration with AI were 20% more likely to be rated as more empathic, rising to 39% for peer supporters who expressed difficulty in providing empathic support. Most supporters used the AI some of the time and did not completely rely on the feedback. Supporters who never used AI, however, tended to express less empathy in their responses. Overall, 77% of supporters in the HAILEY group said they would like this type of feedback when giving support in the future, and 70% said it helped them to feel more confident.

Online consultations took off during the COVID-19 pandemic and are likely to increase. Short of conjuring up millions of new doctors, perhaps the time and energy of the precious few we have could be amplified by working in partnership with specialist learning models like PARTNER, improving the experience for both doctors and patients.
