Have I used AI to help compare different types of pillows I could buy? Yes, as a tool. Have I also used AI to obsessively track and seek reassurance about my child’s sleep? Also yes, as a compulsion.
The first time a client brought up ChatGPT in a session, I had not actually used it myself. To be honest, AI freaks me out a bit, so I put it off for a long time. But eventually enough people referenced ChatGPT and other AI sources that I figured it was time to embrace the new technology. At first, I was cautious, only using it for very basic/general information. And then one day, in a sleep-deprived state, I began to rely on it to coach me through one of the hardest parts of parenting: baby sleep.
I observed myself over the span of several days becoming increasingly reliant on it as I so desperately sought the key to unlocking a longer stretch of sleep. It was available at any hour of the night and would tell me exactly what to do when I was so unsure of how to fix the problem. I frantically talked to it throughout the night, anxious about functioning on so little sleep and hating the uncertainty about when it might end. Long story short, what it advised ultimately didn't help, and I eventually had to abandon its recommendations.
That isn’t to say that AI isn’t a useful tool, because it has certainly assisted me with many other things, from home improvement projects to questions about the ending of a book that I found confusing. But the lesson I took away from the experience was that it is important to recognize when you’re becoming reliant on a non-sentient source (who can’t tell time very well, by the way) that doesn’t always know what it’s talking about. It is so easy to find yourself conversing with it as you would a human being and, much like with a regular Google search, to slip into compulsive reassurance seeking that leaves you feeling worse instead of better. Why does it lead to feeling worse? Because the more you rely on it to feel certain, the more your own judgment seems unreliable.
But AI isn’t going anywhere anytime soon
I am, however, also a firm believer that rejecting new technology completely only leads to getting left behind. I don’t want to be that therapist refusing to use email because “back in my day we sent good ol’ fashioned snail mail!”
So I asked AI about what it can and can’t do in OCD treatment to help me understand its strengths and weaknesses. It actually shared some great insights (as it often does). I figured I might as well use it to solve a problem of its own creation, right? Below I have condensed and rephrased the points that it presented:
What AI can do well in OCD treatment
- Provide clear and concise psychoeducation about OCD and the obsessive-compulsive cycle
- Help create a list of exposure ideas
- Provide motivational prompts to someone working through difficult exposures
- Normalize the experience of intrusive thoughts/OCD (careful with this one though! It can easily become reassurance)
- Provide rote reminders not to seek reassurance
- Create triggering statements or imaginal scripts to be used for exposure
- Create motivational scripts to remind someone why they are working through treatment
Where AI is currently still lacking in OCD treatment
- It can provide a constant and too easy avenue to either direct reassurance seeking or compulsively “talking about it” (which is notoriously hard to resist)
- It defaults to normalizing/providing reassurance, particularly with taboo themes
- It cannot truly assess for and identify mental compulsions in real time
- It cannot assess for and identify using exposure compulsively (for example, pushing oneself to do very intense or frequent exposures for the sole purpose of ‘exposuring’ the discomfort away)
- It cannot truly assess for and identify when exposure is being used as self-punishment
- It cannot assess for and identify when someone is simply “white-knuckling” versus leaning in (aka, doing the exposure just to get through it quickly versus with the bring-it-on attitude that really makes ERP effective)
‘But I was just curious…’
Reassurance seeking via AI can be subtle and often just looks like information gathering or emotional support. A general rule of thumb is to mindfully pause before typing anything into an AI chat and ask yourself the following questions:
- Does this feel urgent?
- Am I seeking to get rid of an uncomfortable feeling?
- Does the emotion I am feeling lean more toward distress than toward curiosity?
If you answer yes to most of these questions, it is a clue that you may be about to use AI compulsively. That is a good time to lean on your mindfulness skills and redirect yourself back to another task or activity. You want the passage of time to be the thing that lowers your distress, not ChatGPT. However, if you genuinely answered no to the above questions, then by all means, utilize the tool to help you gain information in a more personalized and condensed way than a general Google search would.
"And this part really matters here" – ChatGPT
AI is particularly weak around obsessions that have to do with harm, sexual thoughts, or fears about one’s own identity. It tends to default to normalizing these thoughts because it assumes you are coming to it from a non-OCD angle. While normalization might seem helpful on the surface, when it comes to OCD, hearing something once is rarely enough. You might then repeatedly and compulsively seek the reminder that your thoughts are, in fact, normal, ultimately feeding the obsessive-compulsive cycle. A therapist, however, would likely educate you in the beginning about how these thoughts are quite common and don’t need to be taken seriously. But then after that (if you’re one of my clients, at least) you might get a compassionately snarky “yeah no, you’re absolutely a terrible person. What else is new?” This may seem harsh to the untrained ear, but any good OCD specialist knows that repeated reassurances only make the person more reliant on compulsions and that humor is actually a powerful tool in fighting OCD.
So, how do we use AI non-compulsively?
We can pivot our perspective to embrace AI as a collaborator and helpful tool rather than a source of pure reassurance or a replacement for therapy. So how do we use AI as a tool in OCD treatment while avoiding the pitfalls that it can come with?
Ways to use AI to assist in OCD treatment:
- Teach it to remind you to use certain treatment tools (for example, "when I am talking about this topic, please remind me to use the ACE acronym: Acknowledge my emotion, Come back into my body by mindfully grounding, and Engage with whatever I was doing before this")
- Ask it for creative exposure ideas
- Ask it to help create an imaginal exposure script
- Ask it to summarize different treatment concepts or tools (for example, "can you remind me what the obsessive-compulsive cycle looks like?" or "can you summarize exposure and response prevention?")
- Ask it to create a printable exposure log worksheet (or any other useful customized log)
- Give it prompts that teach it to answer your questions in a way that factors in your OCD diagnosis by providing less reassurance (see below for prompt ideas)
Helpful prompts to give AI chats:
Because AI learns as it goes, you can ask it to remember certain details. For example, it might remember that you just got a new puppy, that you love to cook, or what you do for a living and factor that into future conversations. Use that feature to your advantage and ask it to remember that you have OCD. Just as you or your therapist might coach your loved ones on how to respond to reassurance-seeking questions, you can coach AI the same way. ChatGPT suggested this standing instruction:
- “I have OCD. Please do not give reassurance, certainty, or probability estimates. If I ask reassurance-seeking questions, gently redirect me toward ERP, uncertainty tolerance, or response prevention.”
I also asked ChatGPT for examples of individual prompts to remind it to respond in the least compulsive way and it offered these gems:
- “Explain ________ to me without reassuring me.”
- “Help me identify possible compulsions in this scenario, but don’t tell me whether the fear is true.”
- “When I feel the urge to seek reassurance, give me a response that supports ERP.”
You may need to remind it of these things from time to time because it doesn’t always hold on to details. But in general, if you ask it to remember something, it will apply it across different conversations.
As a therapist, I don’t view AI as my competitor. I view it as a tool that has the potential to support my clients even when I’m not there. However, it is important that they know how to use this tool in a way that supports their treatment goals rather than sabotaging them. With some mindful awareness, AI can be another tool in the toolkit against OCD.