Generative artificial intelligence (AI) continues to make impressive strides in medicine.
In March, Google's Med-PaLM 2 shocked the profession by scoring an "expert-level" 86.5% on the U.S. medical licensing exam, a 20-point jump over AI's previous best. Then, in July, a study found that ChatGPT writes clinical notes so well that independent reviewers can no longer distinguish AI from humans.
Many of the skeptics who panned AI's shortcomings earlier in the year, arguing that large language models could never replace most of what writers, educators or doctors do, have changed their outlook on the technology's potential.
As generative AI systems continue getting stronger and smarter (pulling from an ever-larger corpus of knowledge), people increasingly recognize that ChatGPT can match our cognitive abilities. What's now uncertain is whether there's anything left about our basic humanity that generative AI cannot emulate and even improve upon.
In healthcare, for example, clinicians insist that chatbots will never match their levels of compassion, empathy or trustworthiness. Medical professionals view these interpersonal skills as distinctly human, foundational to the doctor-patient relationship. Patients value these personal connections, as well. According to one survey, patients ranked "compassion as more important than cost" when rating physicians.
But new research indicates machines are rapidly gaining ground in these areas, too.
AI Now Boasts Strong EQ
At the University of Texas at Austin, behavioral therapy treatments were failing to help patients who abuse alcohol.
So, the chair of internal medicine asked a team to write a script that clinicians could use to speak more compassionately and better engage with patients. A week later, no one had taken the assignment seriously, so the department head asked ChatGPT to do the job. It complied, masterfully.
Not only was the letter excellently written (sincere, considerate, even touching), but it was also devoid of "doctor speak," which frequently gets in the way of patients adhering to treatment plans. Social workers at the university then asked the generative AI app to rewrite the communication at a fifth-grade reading level, and then translate it into Spanish. The result was greater clarity and appropriateness in both languages.
Other clinicians who've used chatbots to script more empathetic remarks for patients found themselves equally impressed.
In a recent review, one doctor told The New York Times that the results of using ChatGPT "blew me away." Other clinicians added, "I wish I would have had this when I was in training" and "you'd be crazy not to give it a try."
How Doctors Learn (And Unlearn) Empathy
Emotional responses like empathy and compassion have long been considered biological. In support of that theory, scientific evidence demonstrates that these traits are inborn, although they can be fostered and expanded over time.
The desire to be kind, sympathize with others and care for those in need is precisely the kind of heartfelt trait that draws people into medical careers. In fact, when medical school applicants are asked, "Why do you want to become a doctor?" the most common responses include:
- To help people
- To make connections with others
- To improve lives
- To help the underserved
Most doctors pursue medicine for kindhearted reasons. But by the time they finish medical school and residency, they emerge with a different set of priorities.
In 2021, I published a book about the unseen and unspoken forces that shape doctors. That book, "Uncaring: How the Culture of Medicine Kills Doctors & Patients," explains how medical culture erodes compassion and empathy over a decade of clinical training, fundamentally reshaping the attitudes, beliefs and behaviors of once-idealistic medical students.
Through careful observation of their professors and attending physicians, young doctors learn which emotions and behaviors are rewarded and which are dismissed as unimportant.
For example, a resident will rarely (if ever) witness an attending physician take time to learn non-clinical details about a patient's life or connect with concerned family members about anything medically irrelevant. Trainees come to view these interpersonal activities as a waste of time when compared to reading textbooks and mastering technical skills. After a decade of disuse, their "softer skills" atrophy.
The Reality Of Medical Practice Today
We know that physicians value the doctor-patient bond. However, the realities of healthcare today make it difficult to invest time in that relationship.
The practice of medicine for most physicians resembles running on a care-delivery treadmill, one that spins ever faster with each passing year. As economic pressures grow, physicians are forced to see more and more patients each day just to maintain their income.
That is why, on average, physicians spend only 17.5 minutes with each patient. And, given the demand to move quickly, physicians interrupt patients after just 11 seconds to eliminate "wasted time." Of course, doctors don't rush their exams or hijack conversations with the intent to be rude. They truly care about people. They're just busy. And they've learned that taking control allows them to complete the visit more efficiently.
But these rapid-fire exchanges can leave patients feeling uncared for. In fact, nearly three-quarters of patients surveyed reported having seen a doctor who failed to be compassionate. A similar percentage said they always or often felt rushed by physicians.
How Tech Bests Humans Emotionally
While the healthcare industry has been grappling with anecdotal accounts of ChatGPT's superior soft skills, a recent study published in the Journal of the American Medical Association (JAMA) provides hard evidence.
Researchers compared doctor and AI responses to nearly 200 medical questions submitted by patients via social media. The answers were read by a team of healthcare professionals who didn't know whether the author was a doctor or a bot.
The team rated 80% of the AI-generated responses as more nuanced, accurate and detailed than those shared by physicians. But most surprising was ChatGPT's bedside manner. According to a write-up in U.S. News, "While less than 5% of doctor responses were judged to be 'empathetic' or 'very empathetic,' that figure shot up to 45% for answers provided by AI."
ChatGPT is far from perfect. Current versions are tied to medical data published before September 2021. And, on occasion, AI will hallucinate, providing seemingly expert answers that are dead wrong along with references that don't exist. Clearly, current versions of generative AI aren't "ready for prime time" when it comes to diagnosing, treating or caring for patients.
But these large language models are vastly better at "learning" than any AI that has come before. Thus, anything that can be taught (such as demonstrating compassion) can be learned and mastered by generative AI. As these models become faster, smarter and more powerful, they will become not only more accurate, but also more empathetic.
Today, most patients (60%) are uncomfortable relying on technology over doctors for medical care. Given the choice, theyâll consistently pick a physician over AI.
But our nation is facing a worsening physician shortage at the same time it's experiencing an AI revolution. It now takes 31 days on average to be seen by an OB-GYN and 35 days for a dermatologist.
I predict that when people struggle to access timely medical care, they'll turn to ChatGPT for help. When the answers they get are accurate and compassionate, they'll turn to AI again the next time they need medical expertise. Over time, people will care less (or not at all) whether the assistance and advice come from a carbon-based life form or a silicon chip.
Already, a growing number of doctors are comfortable using generative AI to assist with everyday healthcare tasks, from writing letters to insurers and transcribing notes to double-checking diagnoses and populating medical records.
But if doctors fail to demonstrate empathy, sympathy and respect in ways that foster patient trust, generative AI will fill that gap. Once this process begins, humans will play an ever-smaller role in the provision of medical care.