In our new series In Your Feelings with Samantha Boardman, MD, she’s helping you better understand your emotions. This article originally appeared in her Substack newsletter, The Dose, which you can subscribe to here.
While I like to think of myself as a supportive and responsive doctor, a 2023 study published in JAMA is making me think again. According to the findings, ChatGPT was more informative and understanding than a real doctor when answering patient questions. The study compared physicians’ written responses with ChatGPT’s responses to real-world health queries. A panel of licensed healthcare professionals preferred ChatGPT’s responses 79 percent of the time and rated them as higher quality and more empathetic. Not only was ChatGPT better at providing information, but it was also better at caring.
A few months before this study was published, ChatGPT passed all three steps of the medical licensing exam. I studied for weeks for those tests but, in all fairness, the licensing exams cover facts and baseline medical knowledge. So, it’s not that surprising that a large language model did well. But empathy and bedside manner? That’s a whole different story.
Many are already turning to ChatGPT to discuss their mental health. One Reddit user described ChatGPT as “better than my therapist.” She continued, “In a very scary way, I feel heard by ChatGPT.”
As good as ChatGPT might be, I’m not ready to pass the baton just yet. (Hopefully my patients also feel heard.) That said, I’m optimistic that ChatGPT can be used to enhance treatment. There’s already evidence it can help assess suicide risk, prevent future depression, improve care, and increase access to treatment. Artificial intelligence can help researchers figure out what makes some therapists more effective than others and assist in training new clinicians. It can also detect subtle changes in a patient’s speech patterns that might signal a worsening condition that would otherwise go unnoticed. Yet another way artificial intelligence can help is by driving down clinician burnout, drafting clinical documentation and responding to endless emails. Harnessing artificial intelligence to diminish the time clinicians spend on burdensome paperwork and increase the time they spend with patients would be a win-win for all involved.
Using artificially intelligent companion robots to reduce loneliness is an area I’m particularly excited about. My colleagues Murali Doraiswamy and Elizabeth Broadbent and I just published a paper in Science Robotics highlighting how robots can support health in elderly adults and help them remain at home rather than moving to assisted living. We argue that robots can enhance feelings of connectedness and reduce caregiver burden. Unlike human caregivers, robotic ones never judge, get impatient, or become frustrated. Robots will never be a substitute for humans, but they can amplify humanity.

At a conference a few years ago, I encountered Paro, a robot in the form of an adorable baby seal. Apparently, the designers chose a seal because it wouldn’t conjure memories of a beloved deceased cat or dog. It’s undeniably cute, with big eyes, batting black eyelashes, and a sunny personality. Paro coos, snuggles, and moves when it recognizes certain words and in response to touch. It learns to behave in a way that pleases the user, remembering and repeating actions that earned it strokes and avoiding behaviors the user disliked. Paro has been shown to reduce loneliness, agitation, and blood pressure. For patients with dementia, Paro is a friendly companion and has proven to be a reliable mood brightener. I’m not proposing that robot companions replace human companions, but robots may provide a bridge between what many isolated individuals need and what society currently provides.
In the spirit of fairness, I asked ChatGPT for its opinion on the subject.
Me: Can ChatGPT be useful to assist psychiatrists in the treatment of patients?
ChatGPT: Yes, GPT-powered chatbots can be useful tools to assist psychiatrists in the treatment of patients. These chatbots can provide support and guidance to patients by engaging in therapeutic conversations, offering coping strategies, answering questions about mental health conditions, and even providing psychoeducation. They can also be helpful in monitoring patients’ mental health between visits and providing reminders for medication or therapy sessions. However, it’s important to note that while chatbots can be useful supplements to traditional therapy, they should not be seen as a replacement for human interaction and personalized care provided by psychiatrists.
Not a bad answer…
Dr. Samantha Boardman is a New York-based positive psychiatrist committed to fixing what’s wrong and building what’s strong. Drawing on 15 years of experience, she helps clients cultivate vitality and boost resilience. Visit her website, follow her on Instagram, or read her newsletter, The Dose.