20 Comments
Bruce Rogers-Vaughn

I have been a psychotherapist for almost 40 years, as well as an academic in the field. This essay is superb and deserves to be widely read. I am a frequent speaker on what I tend to call “the crisis in psychotherapy,” and this essay gets to the heart of most of my concerns, including that most of the suffering I encounter these days has deep roots in social and economic practices and cultural assumptions. (See my book Caring for Souls in a Neoliberal Age.) I would enjoy a conversation with this author, if an opportunity ever comes our way. Thank you so much for your time and work on this essay!

Selda Koydemir

Hi Bruce, thanks for your comment. It would be great to connect!

Bruce Rogers-Vaughn

I’m about to go into a therapy session, and I’m forgetting at the moment how to send a direct message through Substack. Shall I send an email using the email address on your website? I’d love to connect and share information!

Selda Koydemir

Sure.

Liza Debevec

Thank you for the detailed post; I appreciate what you say. I have found that I can sometimes get some useful feedback from AI between my therapy sessions, and I know that some of my coaching clients use it in a similar way (and I don't see it as a threat to my profession).

As an extra tool, I believe it is useful. Also, I keep training ChatGPT not to sugarcoat things, and I will often ask it to reconsider its answer, to see if it will double down or revise it. But I always look at the answers and study them to see how much bullshit is in there before considering the advice.

Dr. Samaiya Mushtaq

Oh I loved this. I hope more people read it. I know a few people who’ve started using AI as a therapist and it really is interesting (and a bit disturbing!) to see its effects anecdotally.

Selda Koydemir

Thank you, Samaiya!

Vanessa Scaringi, PhD

Love the thoughtfulness; such an important message. Thanks, Selda.

Amanda Bainbridge

Beautifully put

Selda Koydemir

Thanks, Amanda, much appreciated!

Marieb

I like it because you do the work with honesty and humility, and that "I hate you" to the AI made me laugh :)

Selda Koydemir

Thanks, Marieb! Yes, I did say that I hated it, because, well, that's something some clients might say in a session :)

Marieb

Hahaha, yes, I understood what you were trying to explain, but at the same time I found it kind of funny. And it's so good that you went deep and explained that to us as well, super important!

Lucas

Thanks so much for this post. Two things immediately come to mind.

1. Seeking out an AI instead of a therapist is itself a decision. I can see it being an economic one in some cases, but it seems to me at least plausible that it is *itself* a way of resisting or refusing to connect with other, real, humans. I suppose I wonder whether much good can come from a relationship that begins in such a place.

2. This also seems relevant to the question of 'self-analysis.' If it is possible for one, in principle, to 'work through' their own resistances and conflicts, then it stands to reason that it will also be possible with an AI (easier, even! Now we simply have someone to check our self-interpretations against, or to give us alternative ones).

And if it is impossible to work through such issues with an AI, then I imagine it will be for similar reasons that it will be impossible to work through such issues on one's lonesome. (I recognize my talking of 'possibility' here might be misleading). At the very least because AI might simply be used as a tool in the wider toolkit of self-analysis.

But this seems like a big pill to swallow. Is self-analysis in principle impossible because it is not a relationship with another person? If so, I think we need a stronger theoretical answer as to why this should be. Or is it a practical problem: that no person can reasonably be expected to look past their own resistances and ask the right questions, and so it is simply misguided or a bad idea to try to self-analyze?

Or is it a relationship that is required? If so, need it be a specifically therapeutic one? And is this *need* a practical one -- would it be that other kinds of relationships *could* lead to health, but are simply unlikely to because people lack training, or is the therapeutic relationship necessary in a deeper way?

Selda Koydemir

Hi Lucas. I appreciate your comment and these thought-provoking questions. I'd like to comment on the last bit. The relationship doesn't have to be therapeutic, in the sense that it doesn't have to be a relationship with a human mental health professional. The benefits of a relationship are not grounded in training. We can have a therapeutic relationship with a romantic partner, a friend, a sibling, or a parent. In fact, that's where everything starts anyway. There is something I always say: change happens in relationships; we evolve and grow in relationships. And I don't mean only the relationship in therapy, but relationships in our daily lives. That said, it's still a human-to-human relationship. With a machine (or AI) you encounter the same problem I mentioned: no conflict, no tension, no risk of judgment, constant reassurance and comfort.

AI can be helpful to many people in many respects in understanding themselves. But it's limited in the sense that it is what we want it to be, what we want it to say to us, or how we want it to treat us. It won't have its own issues, tensions, conflicts, or personality. And those are the ingredients that make a human-to-human relationship different from a relationship with AI.

Lucas

I see, yes, I like that answer a lot. It almost feels a bit romantic! If such healing relationships required, in principle, some kind of professional training, I would be a bit disappointed. Thanks for your reply :)

Dr. Linda L. Moore

Even asking that is idiotic…

bridget c.

This was such a terrific explanation and I can’t tell you how much I appreciate your naming what actual psychotherapy is. And how the robots aren’t it. 😬

Liz Brown

I was abused, exploited, and tormented by my human psychotherapist for 3 1/2 years, and every therapist I have gone to since won't shut the fuck up about themselves and their own problems, so I'm gonna stick with AI, thanks.

Selda Koydemir

Sorry to hear about your experience. Not sure if you have read the whole post, but I wrote that some psychotherapists do more harm than good, and what you shared is exactly what I meant. Meaningful or effective psychotherapy is different from what not only a machine but also many real-life therapists do. My wish is to help people and beginner therapists understand that. You were clearly unfortunate. If a therapist abused you, they should be punished and should not practice at all.
