
Artificial Intelligence, Artificial Therapy: Interview with InsideHook



There is a meme that men will do anything instead of going to therapy: rather than getting therapy for anxiety or going to trauma therapy, men will play video games, hang out with friends, or start a podcast. However, now that it is sort of possible to get psychotherapy from artificial intelligence, that might change.


The excitement around artificial intelligence (AI) has led many to explore the limits of its abilities, shoehorning it into every conceivable use case until interacting with it becomes an exercise in futility. Among other things, there is growing interest in using AI to resolve relationship disputes, to serve as a digital girlfriend, or to act as a digital psychotherapist, a use case in which it can be much worse than useless.

 



Recently, InsideHook reached out to Madison Park Psychotherapy’s founder and clinical director, Jordan Conrad, to discuss the ups and downs of AI therapy. In “Is AI Therapy Really a Good Idea?”, the outlet asked experts about AI’s growing role in the mental health landscape.

 

Jordan explains that there is room for AI in psychotherapeutic treatment programs. Stepped care is a treatment model that “works by matching a patient to the least costly and invasive treatment relevant to their needs, and then stepping up, as it were, to more intensive treatments as needed.” Many patients start with bibliotherapy – the directed reading of self-help books designed to alleviate symptoms – before moving to digital programs, group therapy, and individual therapy. “Digital mental health interventions, including AI therapists, could serve as an excellent addition to stepped care programs,” Jordan told InsideHook.

 

Others, however, suggest that AI could serve as a standalone treatment or as an adjunct to psychotherapy with a human therapist. Jordan is skeptical about that: “It’s an exciting time for AI and it is important not to get swept up in it at the expense of patient safety,” Jordan explains. “There are a lot of cases of digital programs causing a lot of harm to a lot of very vulnerable people, and people talking about all the upsides are generally omitting discussion of these severe downsides.”

 

"Right now there really isn't any reliable reporting on safety issues. One recent meta-analysis found that over a third of the studies evaluated didn't even include safety data and the remaining two thirds used inadequate or widely varying methods (which makes the data difficult to compare). Another meta-analysis found that the few studies assessing safety in mental health chatbots had a high risk of bias. That really needs to be addressed."

And, Jordan explains, these are not merely abstract concerns. There are documented cases of people receiving therapy for depression from an AI chatbot who ended their lives at the chatbot’s encouragement, of people seeking therapy for disordered eating being given advice on how to lose weight, and of people using AI to get therapy for anxiety becoming more isolated as a result. Although finding a therapist in New York City can be tough, the costs of AI therapy might be too high at the moment.


Even its use alongside human psychotherapists is more difficult than it appears. Professional organizations like the American Psychological Association and the National Association of Social Workers have not issued guidance on the permissible use of AI in treatment, and the government has not created an agency capable of, or prepared for, regulating these technologies.



 
 
