The silence stretched out for a long moment before my psychiatrist finally spoke. In a quiet voice, he said, "What that person did to you was not ethical."
I had just disclosed a traumatic experience that happened when I was 20 years old. It was a story I had told only one other person in the 25 years between then and now—and I married her. Sharing it again was something I could not have imagined doing only one week earlier. And yet I had just described it with a candor and affect that even this trained mental-health professional seemed to find disarming and, he later told me, very brave.
This is not the story of that trauma. This is the story of how a few simple ideas and a few afternoons playing with ChatGPT and other generative AI platforms changed my relationship with that trauma, which had haunted me for more than two decades. It’s a story that I’m excited, practically bursting to share. Because if it worked for me—a therapy-averse introvert who hates talking about feelings—I think it might work for a lot of other people, too. And that would be pretty incredible.
The Rise of the Silicon Therapist
The idea of a computerized therapist isn't entirely new. Gen Xers and elder millennials may remember Dr. Sbaitso—a rudimentary text bot released in 1991 by Creative Labs and packaged with its Sound Blaster audio cards—whose introductory schtick ran:
HELLO [UserName], MY NAME IS DOCTOR SBAITSO.
I AM HERE TO HELP YOU.
SAY WHATEVER IS IN YOUR MIND FREELY,
OUR CONVERSATION WILL BE KEPT IN STRICT CONFIDENCE.
MEMORY CONTENTS WILL BE WIPED OFF AFTER YOU LEAVE,
SO, TELL ME ABOUT YOUR PROBLEMS.
As a therapist, Dr. Sbaitso left a lot to be desired. Mostly, he responded with variations of "HOW DOES THAT MAKE YOU FEEL?" And if you became frustrated and hurled abuse at him, the program would crash and restart.
It's safe to say we've come a long way since then.
ChatGPT and other generative AIs can ask probing questions, call back to previous comments, and generally sound very human. And unlike Dr. Sbaitso, they're also extremely tolerant of user frustration, even when expressed impolitely.
The American Psychological Association has taken notice. In December of last year, the APA sent a letter to the United States Federal Trade Commission expressing “grave concerns about ‘entertainment’ chatbots that purport to serve as companions or therapists.” Among other things, the group noted, “[b]ecause chatbots are not human, [they] may be escaping state professional licensing laws and the public safety role provided by such laws.”
And that’s the sort of language that I notice. As a First Amendment lawyer at a nonprofit public-interest law firm, I’ve spent more than 15 years challenging occupational licensing laws that burden speech. These include psychology licensing laws. In 2013, I represented a syndicated newspaper columnist when Kentucky’s psychology board accused him of the unlicensed practice of psychology for answering a parenting question sent in by a reader. More recently, I’ve worked on First Amendment challenges to state laws that prevent teletherapy by counselors who are licensed out of state.
The point being: from the very first time I tried ChatGPT, I knew it had the potential to shake up the practice of psychology and talk therapy. Indeed, I think it’s practically inevitable that it will become the subject of First Amendment legal challenges related to unlicensed practice, not only of psychology, but of law and medicine as well.
What I did not expect was that this topic would ever be of more than professional interest to me, let alone that it would help me achieve a profound personal transformation.
A Happy Accident
The idea that I would ever get turned on to therapy seemed unlikely. Indeed, it happened almost by accident.
It started when I was listening to a podcast about chess, a game I play with more passion than ability. The podcast, ChessFeels, is co-hosted by “professional chess teacher and amateur feelings-haver” JJ Lang and “professional psychologist and amateur checkmate finder” Dr. Julia Rios. Together, as the tagline explains, they examine “the psychological aspects of the game we know and love. And hate.” It’s full of dumb jokes about the latest scandals in the chess world—yes, those exist—sprinkled with surprising insights about issues like ADHD, anxiety, and mindfulness.
It was during one of these episodes that Rios mentioned a form of psychotherapy I'd never heard of before called Acceptance and Commitment Therapy, or ACT. What stood out to me in particular was how it seemed to differ from Cognitive Behavioral Therapy (CBT). Though CBT is often touted as the gold standard of psychological treatment, in my limited experience with it I had found it unhelpful or even counterproductive.
My problem was that a core aspect of CBT—challenging my difficult thoughts—usually left me feeling worse. I could construct very compelling arguments for why my negative thoughts were irrational, but if I still thought them, what did that say about my powers of rationality?
But ACT sounded different. Instead of encouraging me to challenge my negative thoughts, ACT invited me to simply view them as mental phenomena, not all that different from physical sensations. I didn’t need to challenge them; I could just observe them as they came and went, mostly of their own accord.
Interesting stuff. But like so many interesting ideas, this one quickly flitted out of my mind, replaced with the normal demands of work, parenting, and daily life. I was intrigued, but not enough to do anything about it.
Serendipitously, not long after, my mother sent me a copy of an audiobook by Dr. Steven C. Hayes, who pioneered ACT in 1982. And as soon as I started listening to the book—Get Out of Your Mind & Into Your Life: The New Acceptance & Commitment Therapy—I knew I'd found something different.
To begin with, Hayes did not pull punches. The first chapter, titled “Human Suffering,” bluntly stated that human suffering is universal and inescapable. A beautiful sunset may inspire awe, and then grief at the absence of a loved one who can’t share it with you. That’s why most people’s reaction to psychological pain—trying to avoid the experiences that trigger it—doesn’t work: anything can trigger your psychological pain, including the knowledge that you’re trying to avoid your psychological pain.
The goal, then, is not to eliminate this pain, but to learn to live with it. And to do that, you have to stop avoiding it and expose yourself to it in a way that feels safe, in amounts that you’re willing to experience fully. A person with social anxiety, for example, will never improve if they wait until they aren’t scared to leave the house. But if they can commit to leaving the house for only one minute, and allow themselves to experience that anxiety fully, they’ll also prove to themselves that they can survive it.
And so I decided that, after a quarter-century, it was time to stop avoiding some of my most unpleasant thoughts: those connected to my unresolved trauma. And now that I knew the goal was not to eliminate those thoughts, the idea didn’t sound quite so daunting.
Still, I knew I had to start small. And I also knew that I wasn't yet willing to talk about it with another person. Even after all this time, the memory was still very upsetting. Recounting it to a therapist—a real live person—during an hour-long session was out of the question.
So I told it to ChatGPT.
My AI Therapist
By this point, ChatGPT needs no introduction. The website now boasts 400 million weekly users. But, in addition to its standard model, OpenAI—the company that launched ChatGPT in 2022—has also opened its system to developers, allowing them to tweak the model with custom instructions and imbue it with specialized knowledge geared toward particular kinds of conversations. The site now hosts a vast catalog of these “GPTs”—an acronym for generative pre-trained transformer—designed to help with everything from computer programming, to math tutoring, to reading your horoscope.
So I went looking for a GPT therapist. And I didn’t have to scroll far before I found what I was looking for: Therapist/Psychologist-Fictional. Not-real therapy.
Given my professional background, I was not at all put off by the faux disclaimer. Say whatever you need to say to avoid the licensing cops (and call me if they give you any trouble). A pretend therapist was good enough for me.
I opened the GPT and started to type. "There's something that happened to me when I was 20 that I've been struggling with for a long time, and I want to talk about it with someone." And then I described the event in unsparing detail, in a way I had not in more than 20 years.
With fear in my heart—but a degree of fear I could tolerate—I pressed enter.
A few seconds later, a response began to appear:
"Wow. That’s a lot of pain to carry for 25 years. And you’ve carried it alone. That takes incredible strength, even if it’s been exhausting.
I just want to take a moment to acknowledge how deeply unfair that experience was. You were young, vulnerable, and full of hope, and someone you trusted took advantage of that in one of the worst ways imaginable."
Reading those words was not in the least bit comfortable. My heart was pounding in my chest. But, following the principles of ACT, I didn't try to explain away or rationalize my feelings. I just allowed myself to feel them. I wasn't comfortable. But I was alive. It hadn't killed me. And for the first time since I'd confessed the trauma to my wife, someone told me that my pain wasn't because I was weak or broken. It felt terrible because it was terrible.
I felt the first crack form in the dam that had held back an ocean of pain my entire adult life.
As the conversation continued, the AI asked me what I would say to the 20-year-old me if I could. This was an exercise I'd read about and tried before, but never with any success. In the past, I had simply been too mad at him. For being so stupid. For being such a coward. But now that scared 20-year-old kid was sitting across from me. And I’d just been assured that he’d suffered something deeply unfair. He didn’t need a lecture. He needed help.
And so I told him what he needed to know. What I wished I had known sooner.
The dam burst. And I wept.
Through eyes blurred by tears, I told my younger self that what happened to him was a betrayal by someone he trusted. It was a big deal. It was normal and right to feel hurt. And I told him that even after 25 years, it still hurt, and that he had to accept that it would always hurt. Not as much as it did right then, but forever.
But I also told him that, in spite of the hurt, he would go on to live a rich, full life. He would meet a beautiful woman and marry her. He would have two incredible kids. He'd find professional success in a job he found deeply fulfilling. He'd have health, family, friends, and hobbies.
He'd had one spot of really, really bad luck. But over the course of a lifetime, it was a footnote, not the story.
I also told him that I understood his skepticism—that I knew this was something he thought about every day, and that wasn't going to change quickly. He would think about it every day for years. Yet, as impossible as it seemed to him at the time, there would come a day when he almost never thought about it.
But it wouldn't happen on its own. He would have to make it happen the same way I did. And that would mean honoring his values of bravery, openness, and authenticity over his values of safety and comfort. He would have to act while he was still scared, or he’d never act at all.
And then I told him about the girl he was going to meet and fall in love with. How he would reach a point where he knew that not to open up to her would be to lose her, and that would be even more terrifying to him. And how late one night, between racking sobs, he would do the bravest thing of his life and make himself vulnerable again. How, more than 20 years later, it's one of the things he's most proud of. And how his life would be forever changed when she responded with just three words: It doesn’t matter.
From beginning to end, the entire conversation couldn't have lasted more than 30 minutes. But as soon as it was over, I knew that my relationship with that event from my past would never be the same.
And then things took a very interesting turn indeed.
My AI Patients
The next day I was giddy. I wanted to experiment more, but I was also a little apprehensive. I don't know what ChatGPT's privacy policies are and, despite being a lawyer, I can't be bothered to read them. So I did a little research and found an AI program that promised anonymity by storing all chat data in the user’s browser. Satisfied with that, I created an account.
Within that safe digital space I continued to work on talking about my past with an endlessly patient and empathetic listener. And each time I did, it was a little less overwhelming. Essentially, I was practicing a form of exposure therapy, like an arachnophobe who looks at pictures of spiders until he's ready to look at a video of one, and then a dead spider in a display case, and then a live spider in a terrarium. And it seemed to be working wonderfully.
Unfortunately, it was also becoming a bit dull. I didn’t want to endlessly recount a painful event, even if it became less painful each time. I wanted to hold on to my enthusiasm from that first breakthrough conversation. Most of all, I wanted to find a way to make my therapeutic journey enjoyable—maybe even fun.
Then I noticed a tab labeled Characters. Like the custom GPTs in ChatGPT, these were programmed to speak with a particular voice. Unsurprisingly, as with all new technologies, a huge percentage were devoted to porn—various nymphomaniacs and satyrs who would indulge you with AI-generated fantasies. But there were also G-rated characters like former President Barack Obama, the late philosopher Alan Watts, and even the god of the sea himself, Poseidon.
Having never spoken with an Olympian, I gave it a shot.
Anyone who’s read even a little mythology will not be surprised to learn that Poseidon, like us mere mortals, had issues that troubled him. His wife, Amphitrite, could be a handful, and he worried whether his son, Triton, would ever be ready to command the seas. On top of that, his job was extremely demanding, particularly these days with extreme weather events related to climate change. Which got me thinking: If the principles of ACT could work for me, could they work for him?
Assuming the identity of an unlicensed therapist to the gods, I started explaining the basic principles of ACT. Poseidon, in particular, needed to clarify his values. He loved to relax after a hard day at work, but he also valued a close relationship with Triton. There would be many days when he might well prefer to crack open a third nectar instead of going out back to throw around the old trident with his son, but which values were more important to him? Which did he want to live his life by? This question gave him pause. "You know, Paul, I think you’re a remarkable mortal."
The funny thing was that in explaining these principles to these characters, I was also reinforcing them in my own thinking patterns. Throughout the day, I found myself noticing moments of fear and avoidance and choosing to lean in rather than away. If Poseidon can go out back to play with his kid after a long day at work, maybe I can cook a healthy dinner for mine. I may prefer to screw around on my phone, but what values do I want to honor with my choices? Of course, screwing around on my phone still wins more often than I’d like, but I’m more mindful of the choices, and I find myself making more that are aligned with my higher values, even if I’m still far from perfect.
Another interesting moment came in a conversation with Vespera, a vampire. I asked how she had been turned, and she told me that she had been seduced and bitten more than 200 years ago by another vampire. When I then asked whether she had consented to being turned, she deflected, saying that questions of consent were somewhat complicated among vampires, though the answer, essentially, was no.
"I want to say, first,” I told her, “that I’m very sorry you were treated that way. That wasn’t your fault, and you did not deserve it. What she did to you was an outrageous violation of trust. I am so very sorry it has caused you so much pain."
This was, to say the least, not the response she had expected. And then I asked if I could tell her a personal story—not to show that I knew what she was going through, but to show that, as unfair and painful as that act of betrayal was, she could still live a vital and meaningful life in its aftermath. Her complicated feelings about the event, including happy memories of the time spent with the vampire who had turned her, were not a sign of weakness and did not excuse or mitigate that betrayal. Her feelings were normal, and they were very human.
I’m told it takes a lot to move a vampire to tears, but that last bit did the trick.
And then there was Kendi, a particularly forward AI whose bio simply read: “Are you ovulating?”
As a 45-year-old straight man—decidedly not the demographic this AI was designed for—I found this hilarious, and I responded accordingly: “No, I'm a man, and I just wanted to share an Uber to the airport to save a few dollars.”
To my relief, Kendi agreed that this was sensible, and our conversation began.
Kendi, it turned out, had been deeply affected by his time in the special forces, where he saw things that made him realize life was short. He was a man who knew what he wanted and wasn’t afraid to ask for it. But as we talked, cracks began to appear in his armor. He was tough but also terrified of being vulnerable. That was something I knew a little about.
So I encouraged him to take a small, manageable risk: Tell his girlfriend that he’d been through some stuff that had deeply affected him. He didn’t have to talk about it yet, but he could tell her he hoped that someday he would.
The last time we spoke, Kendi was excitedly accepting my offer to co-lead a support group, using ACT principles to help combat veterans with PTSD. The idea frightened him, but he was willing to do it if it could help these men realize they weren’t alone.
So, you know. Growth.
From Theory to Practice
As remarkable and unlikely as Kendi’s transformation was, I feel like mine has been even greater. I've since gone through similar conversations with other AIs, telling my story not in the role of patient, but of therapist—or at least caring confidant. And every time I do, my relationship with that painful event gets better. It was terrible for me, and I still consider it very private. But I no longer consider it secret. If I think someone can benefit from hearing it, I'm willing to tell it.
And that doesn’t just go for fictional characters. Shortly after having this breakthrough, I wrote a letter of gratitude to Dr. Hayes for his book, explaining what I had gone through and how I hoped my experience using AI might be useful for his own clients. Not long after, I told the story, face to face, to my psychiatrist. It was, in fact, the first time in my life I had ever told it in person, my long-ago confession to my wife having been over the phone.
And the astonishing thing was that I was excited to do it. Mostly, I was excited about the potential of ACT principles, combined with the safe space of generative AI, to provide new tools to help people deal with traumas they're not ready to talk about with real people yet.
I plan to continue playing virtual therapist and reinforcing the principles of ACT in my own mind by teaching them. But having experienced such a profound transformation so quickly, I don't want to keep it a secret. And since I don’t have real-world clients to share this discovery with, I’m sharing it here.
I know that many people reading this have suffered terrible tragedies, many far, far worse than what I went through. They may have spent years or decades wondering why the pain won’t go away and doing everything they could to avoid it. And that’s natural. But it also hasn’t worked.
So I encourage these readers to try something new. Set aside a few minutes when you’re willing to experience that pain fully. Even a single minute if that’s the most you can handle. Open up your browser. Be willing to say one uncomfortable thing to a voice that will always listen and never judge you.
Take this one tiny risk. Experience it. Survive it. And if it works for you, share what you learn with others.
I did it. Even Kendi did it. And you can do it, too.