In today’s column, I examine the use of generative AI and large language models (LLMs) by human therapists during client sessions.
This is a controversial professional act. Most clients would normally assume that a therapist would not leverage AI during therapy. The therapist might use AI before a therapeutic session, doing so to get prepared for a session with a client. Therapists might use AI after a session, perhaps leaning into the AI to help organize notes or otherwise get an AI-based “second opinion” of sorts.
The wrinkle in this matter entails the therapist actively employing AI while a session is underway. How would they do so? It’s easy. A therapist might use any of a number of LLMs. They could tap into popular general-purpose AIs such as ChatGPT, GPT-5, Claude, Grok, Gemini, Llama, and the like. An alternative would be to use a specialized or customized LLM that has been shaped specifically for use in mental health therapy.
Let’s talk about it.
This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).
AI And Mental Health Therapy
As a quick background, I’ve been extensively covering and analyzing a myriad of facets regarding the advent of modern-era AI that produces mental health advice and performs AI-driven therapy. This rising use of AI has principally been spurred by the evolving advances and widespread adoption of generative AI. For a quick summary of some of my posted columns on this evolving topic, see the link here, which briefly recaps about forty of the over one hundred column postings that I’ve made on the subject.
There is little doubt that this is a rapidly developing field and that there are tremendous upsides to be had, but at the same time, regrettably, hidden risks and outright gotchas come into these endeavors too. I frequently speak up about these pressing matters, including in an appearance last year on an episode of CBS’s 60 Minutes, see the link here.
Human-To-Human Therapy Gets AI
The usual expectation when undergoing therapy is that your therapist is all-ears and all-eyes on you as a client.
Well, it turns out that this still might be the case, though with a bit of a twist included. Some therapists are opting to get AI intermingled into therapy sessions. During a session, a therapist could opt to have AI listen to the dialogue taking place and provide real-time instant commentary to the therapist. A client might not be aware that this is taking place. It can be done discreetly. A therapist who wishes to undertake this process without disrupting the session or possibly raising concerns from the client can engage in a silent conversation with the AI.
The Mechanics Involved
Here’s how it works.
Assume that the therapy is taking place in a session room with the therapist and client both physically present. The therapist might have an iPad or similar device that they are presumably using to take notes. Meanwhile, the audio of the discussion is streaming into the iPad, or there might be a microphone secreted elsewhere in the room. The crux is that the spoken words are being streamed into a digital device.
An AI that the therapist has chosen to use is connected to the input stream of audio. I’m sure that you know how common it is these days for audio to be rapidly transcribed into written text. People use this capability frequently to capture notes of meetings.
The AI is scanning the words and gauging the nature of the therapy dialogue. It then displays on the iPad an assessment of how the therapeutic session is coming along. The therapist might get an alert that the client has said something that seems especially disconcerting. Or the AI might bring up prior notes about the client and highlight that they have mentioned the same topic many times previously.
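To make the mechanics concrete, the loop described above — stream the audio, transcribe it, and have the AI flag anything the therapist should see — can be sketched in a few lines of Python. This is a minimal, hypothetical illustration only: the keyword scan below is a toy stand-in for what would actually be an LLM analyzing each utterance, and the phrase list and topic names are invented for the example.

```python
# Toy sketch of the in-session loop: each transcribed client utterance is
# screened, and any alerts are shown on the therapist's private screen.

CONCERNING_PHRASES = ["hopeless", "can't go on", "hurt myself"]  # illustrative only

def screen_utterance(utterance: str, topic_history: dict) -> list:
    """Return alerts for the therapist's screen and update running topic counts."""
    alerts = []
    lowered = utterance.lower()
    # In a real system, an LLM call would replace this simple phrase scan.
    for phrase in CONCERNING_PHRASES:
        if phrase in lowered:
            alerts.append(f"ALERT: client said something concerning: '{phrase}'")
    # Track topics the client keeps returning to, across the session history.
    for topic in ("work", "sleep", "family"):
        if topic in lowered:
            topic_history[topic] = topic_history.get(topic, 0) + 1
            if topic_history[topic] >= 3:
                alerts.append(f"NOTE: client has raised '{topic}' {topic_history[topic]} times")
    return alerts

# A speech-to-text stream would feed utterances in here, one at a time.
history = {"sleep": 2}
for alert in screen_utterance("I feel hopeless about my sleep lately", history):
    print(alert)
```

The design point is that the therapist never types anything for this part; the commentary arrives unprompted as the dialogue unfolds.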
Therapist Interacts With AI In Real-Time
There is more that can happen during the session.
Besides the AI informing the therapist about the status of the session, the therapist could readily type questions into the AI. How many times has the client so far said that they are depressed? The AI counts the tally and displays it to the therapist. Did I tell the client last meeting to do some readings on mental health, and if so, what recommendations did I provide? The AI looks up the last meeting and displays the recommendations.
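Queries like these — tallying how often a word has come up, or pulling the recommendations from the prior meeting — amount to simple lookups over the running transcript and stored session notes. Here is a minimal sketch, with invented data structures (a list of utterances and a list of per-session note dictionaries); a real tool would sit on top of the therapist's actual records system.

```python
# Sketch of mid-session lookups, assuming the AI holds the running transcript
# as a list of utterances and prior sessions as a list of note dictionaries.

def count_mentions(transcript: list, word: str) -> int:
    """How many utterances so far contain the given word?"""
    return sum(word.lower() in line.lower() for line in transcript)

def last_recommendations(session_notes: list) -> list:
    """Return the reading recommendations recorded in the most recent session."""
    if not session_notes:
        return []
    return session_notes[-1].get("recommendations", [])

transcript = ["I've been depressed all week", "Work is fine", "Still depressed"]
notes = [{"date": "2025-05-01", "recommendations": ["reading on sleep hygiene"]}]
print(count_mentions(transcript, "depressed"))  # tally shown on the therapist's iPad
print(last_recommendations(notes))
```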
A much deeper form of interaction can occur.
The therapist might ask the AI to provide a potential psychological analysis of the client. Does the client appear to demonstrate the conditions associated with PTSD? The AI responds. Give me a question that I can ask the client to further explore their potential PTSD. The AI proposes a question that could be used at this moment during the session.
A bottom line is that the therapist and the AI are able to fully interact in the midst of a client session. The therapist will merely seem to be typing notes onto their iPad. The client presumably won’t notice the activity taking place. Nor will they realize that AI is smack-dab in the middle of the therapeutic process that is occurring.
Remote Therapy Is More Amenable To AI
The advent of remote therapy has made the inclusion of AI an even simpler proposition.
Envision that a therapist brings up a remote session and has the client log in to it. AI is already active behind the scenes. The therapist and client carry on their discussion, seemingly face-to-face, doing so remotely.
Now, things are much more straightforward for the therapist and their use of AI. The client probably cannot see that the therapist has on their private screen the open dialogue with the AI. Nor can the client likely see that the therapist is typing during the session. Being physically separate allows the AI to more readily dovetail into the process.
Another possibility is that the therapist wears an earbud, and the AI communicates with the therapist by speaking its commentary aloud. This generally works during an in-person session too, as long as the earbud doesn’t leak audio. You can imagine how disturbed the client would be if they could hear the AI whispering in the ear of the therapist and offering advice about the therapy underway.
Some refer to the use of an earbud as a “bug in the ear” approach to AI-augmented therapy.
Arguing About Right Or Wrong
Your first thought might be that this intrusion by AI is absolutely outrageous.
A client who is paying a human therapist ought to get their full money’s worth and have the complete, undivided attention of the therapist. The AI is going to be a distraction. Keep the AI out of the session. If the therapist wants to use AI beforehand or afterward, maybe that’s okay, but it’s taboo during a session.
Hold on for a second: what if the AI is dynamically boosting the therapy that is taking place?
Think of things this way. You are getting two therapists for the price of one, albeit this consists of a human therapist combined with an AI-based therapist. It could be that the AI will enable the human therapist to do a much better job during the session. The AI will remind the therapist about this or that. Human therapists are human. They can likely up their game by using AI.
Furthermore, if the AI were relegated to only being used beforehand or after the session, it’s kind of out of the picture. The AI usage during a session is precisely the right place and time for AI to offer its two cents. The therapist is there, the client is there, and that’s the time to make the magic happen.
Who Is Doing The Work
You might begrudgingly say that if the therapist is not distracted by the AI, and if the AI is adding value, perhaps this is an acceptable way to proceed.
The turning point, though, might be that if the AI is giving therapeutic advice to the therapist, that’s a bridge too far. The therapist is supposed to be versed in therapy. Handing over the reins to the AI is unprofessional and quite unsettling. Human therapists are supposed to give human-based advice.
Sure, that makes sense, so let’s consider that the therapist is not handing the keys of the therapy over to the AI. All that they are doing is gauging what the AI indicates. The therapist can utterly disregard the AI. The therapist might be spurred to take the therapy in a creative direction due to the AI.
Overall, the therapist is presumably going to still speak their mind and perform the therapy via their wits and therapeutic acumen.
A Gloomy Spiral
The hefty counterpoint is that not all therapists will necessarily engage in the use of AI via prudent means that keep the AI at arm’s length.
Here’s the worry.
A therapist is overwhelmed with the number of clients they have. The therapist didn’t have time to prepare and review the notes about an existing client. During the session, the therapist is having a bad day and just not on top of their game. Ergo, they decide that relying on AI is fine, just this one time.
Step by step, the therapist gets increasingly comfortable with AI being the keystone in the therapy session. No longer does the therapist do any prep work. They know that the AI is going to handle things well. All the therapist has to do is keep the AI online and pretty much abide by whatever the AI seems to suggest.
The therapist becomes the backstop of last resort. If the AI is telling the therapist to do this or that, and the therapist realizes the AI has gone overboard, the therapist overrides the AI. Otherwise, the bulk of the time, the therapist is following the lead of the AI.
In a sense, the human therapist is not much more than a mouthpiece for the AI.
Cut Out The Middleman
Whoa, some might be thinking: if a therapist is going to essentially hand the session over to the AI while pretending to still be calling the shots, that is despicable, and the client might as well use the AI directly themselves.
I’m sure you can see the logic there.
A therapist who doesn’t seem to be adding value can be removed from the equation. The client can merely access the AI by themselves. This does away with any of the logistics of arranging to meet with a human therapist, and the AI is available anywhere and 24/7.
In theory, only if the therapist is adding value would they remain in the loop. It is supposed that if a therapist has stopped conducting the therapy with their own noggin, a client might figure this out and opt to no longer see the therapist. One further assumes that the client would turn to using AI rather than a therapist, or find a different therapist who is not quite so enamored with using AI during sessions.
I have discussed the overall concern that therapists might be heading toward a decay of their therapeutic skills due to an overreliance on AI; see my coverage at the link here. Savvy therapists realize that AI is now an element in therapy, whereby clients walk in the door with AI-generated advice and want to know what the therapist thinks of the AI guidance (see my analysis at the link here).
In total, I’ve predicted that we are headed toward a replacement of the therapist-client duality to instead be a triad consisting of therapist-AI-client, see the link here.
Informing The Client
Therapists who are going to incorporate AI usage into their practice ought to be disclosing outright to their clients how they intend to use the AI (see my suggestions at the link here).
Existing guidelines on therapists’ codes of conduct do not yet commonly call out AI considerations, though various state laws are now stepping into the evolving topic. Illinois has enacted a new law restricting therapists’ use of AI for mental health (see my explanation at the link here), as have Utah (see the link here) and Nevada (see the link here).
Are existing codes of conduct sufficiently broad to handle the inclusion of AI?
Maybe not.
The American Psychological Association (APA) “Ethical Principles of Psychologists and Code of Conduct” provides these salient points (excerpts):
- Informed Consent: “When psychologists conduct research or provide assessment, therapy, counseling, or consulting services in person or via electronic transmission or other forms of communication, they obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person or persons except when conducting such activities without consent is mandated by law or governmental regulation or as otherwise provided in this Ethics Code.”
- Maintaining Confidentiality: “Psychologists have a primary obligation and take reasonable precautions to protect confidential information obtained through or stored in any medium, recognizing that the extent and limits of confidentiality may be regulated by law or established by institutional rules or professional or scientific relationship.”
- Recording: “Before recording the voices or images of individuals to whom they provide services, psychologists obtain permission from all such persons or their legal representatives.”
A close review reveals that there are potential loopholes. For example, if the audio of the session is being live-streamed to the AI, does that constitute a recording of the voices involved in the session? You might claim that as long as the audio is not being permanently stored, it isn’t a form of recording. It is only being parsed in real-time and not being saved per se.
On the other hand, therapists do need to be aware that if they are using AI, there is a solid chance that the AI might be undercutting their obligation of confidentiality and privacy. The public-facing LLMs tend to have licensing agreements stipulating that your conversations can be inspected by the AI maker’s development teams and used as part of further training of the AI. Be very cautious about these sobering aspects.
Upping The Ante
A final thought for now on this controversial topic.
I have emphasized in this discussion the audio aspects of a client session. The next wave of AI usage entails capturing the video stream and analyzing the facial expressions and mannerisms of the client during the session. This is known as ambient AI (see my discussion at the link here).
The upshot is that AI is coming to town, whether therapists like it or not.
The famous psychiatrist Viktor E. Frankl made this insightful remark: “When we are no longer able to change a situation, we are challenged to change ourselves.” The same applies to therapists and the unstoppable force of AI that is barging rapidly into the practice of therapy. Therapists, get ready to change yourselves.
