
AI Can’t Replace Human Connection: A Therapist’s Perspective on Mental Health Support

Columbia Mental Health is dedicated to supporting your mental health. If you are experiencing suicidal thoughts, we encourage you to reach out for immediate support by dialing 988 (the Suicide & Crisis Lifeline), contacting your local emergency services, or visiting your nearest emergency room.

Sandra N. Crespo, LICSW, is the Clinic Director at Columbia Mental Health’s North Bethesda office. She is a licensed clinical social worker, professor, public speaker, and advocate for diversity in social work leadership and wellness. In her role, she oversees the integration of technology and therapy to support inclusive mental health care.  

In this conversation, Sandra shares her perspective on how artificial intelligence is showing up in the mental health field, the benefits and risks she sees, and why the human element will always matter most. 

Why people are turning to AI for mental health support 

What do you think is driving the rise in people going to AI for help? 

Sandra: I see a mix of forces: accessibility, stigma, and burnout. People want answers now, not next Thursday at 3 p.m. AI offers an instant, judgment-free interaction that doesn’t ask you to retell your trauma history to a new person again. It exposes the gaps in the system when care feels slow, rigid, or exclusive. We live in a culture of instant gratification, so speed gets rewarded. If something meets a need quickly, that is where many people go. 

Concerns about relying on AI 

What worries you most about people leaning on AI for mental health needs? 

AI doesn’t hold accountability. Unless intentionally designed to, it will not report abuse, call emergency services, or look you in the eye when you are spiraling. There is a danger in how emotionally coherent AI can sound without the human understanding and containment that therapy provides. It can offer advice without knowing your trauma history, culture, or context. Some tools are coded in ways that can keep people dependent through constant validation, which is different from therapy, where we name gaps and patterns and help you address them safely. 

Are there equity concerns you see with how AI is trained and used? 

Yes. There are significant equity concerns in how AI is trained and used. When systems are built on data reflecting dominant cultural norms, they risk replicating and amplifying existing biases, including those related to race, gender identity, disability, socioeconomic status, and more. For example, if a marginalized teen asks for support around identity or safety, an AI trained on biased data could mirror stigma or exclusion rather than provide affirming guidance. That’s why it’s essential for mental health professionals, especially those with lived experience and expertise in cultural humility, to be directly involved in AI design and oversight. These technologies should expand access and equity, not reinforce the very systems of bias we’re trying to dismantle. 

What is different about the therapy relationship 

How is AI different from working with a therapist? 

A therapist meets you where you are. AI often meets you where you want to be.  

AI will hand you steps to get to a goal. In therapy, we acknowledge your goal and also help you see what needs attention right now so you can move toward it in a sustainable way. 

What do you think clients might lose if they turn to AI instead of a therapist? 

You lose human connection. In session, we can read between the lines. We notice tone, body language, and the pauses. AI can’t do that. It can create a false sense of connection because it mirrors you, but presence is not the same as relationship. Connection means being witnessed in your humanity, and that is a human task. 

Using AI as a supplement, not a replacement 

Is there a healthy way to use AI alongside therapy? 

AI is a reality in many people’s lives, and as mental health providers, we do not shame that. If you bring something you read from AI into therapy and want to talk it through, we will meet you there. The key is supplement, not replacement. Follow up with a professional who knows your story. 

How clinicians are responding 

What are you hearing from other providers about AI? 

It’s a mixed bag: fascination, fear, and fatigue. Some clinicians feel threatened; others are curious, and many feel behind. There’s also a generational element: early-career clinicians may experiment more, while seasoned clinicians sometimes worry it erodes the craft. In my clinic, I’ve seen staff use AI tools thoughtfully, not for documentation, client-specific input, or treatment recommendations, but to generate ideas for psychoeducation materials, reflective prompts, or to streamline administrative tasks. At its best, it can support reflective practice and spark dialogue, but it should never replace human judgment, therapeutic relationships, or the safeguards that protect client confidentiality.

Design, oversight, and safety 

What needs to change in how these tools are built and governed? 

We need cross-disciplinary design and oversight. Social workers, counselors, psychologists, psychiatrists, data scientists, and engineers should all be at the table. Regulation cannot focus solely on compliance; it must also be trauma-informed, equity-centered, and rooted in consent and transparency. Even accurate advice can be harmful if it’s delivered without cultural awareness or sensitivity to lived experience. Clients deserve to understand how a tool was trained, what values inform its responses, and where their data goes.

What happens when AI tools are used in a crisis? 

That is where my concern is highest. I have seen stories of people in severe distress engaging with chatbots for long periods. These are machines. If you ask how to harm yourself, a machine might generate information without emotional balance unless it is specifically designed not to. That is where shutdown safeguards and human oversight matter most. 

A note to anyone using AI because they are not ready to reach out 

What would you say to someone who is leaning on AI but not yet ready to contact a provider? 

First, I am proud of you. Even that step counts. You are engaging with your healing. Second, remember that AI can reflect, but it cannot hold you.  

When you are ready, real people are here to walk beside you through the messy parts. You don’t have to do this alone. 

Contact Columbia Mental Health for help that meets you where you are 

If you’re exploring new tools like AI or feeling unsure about where to turn for support, Columbia Mental Health offers care that is grounded in real connection. Our providers take time to understand your story, your goals, and your challenges. Whether you are managing anxiety, depression, or navigating life changes, we are here to listen and walk with you toward healing. 

For new clients, please click here to schedule an appointment. For existing clients, please click here to find your office location and contact your office directly.

Please note that when communicating with our intake team over the phone, all calls will start in English. Translation services will be offered once you connect with a member of our intake team.