AI and the Need for Connection
- Sierra Fouts

- Nov 10, 2025
- 3 min read

As humans, we are all connected by what are called universal longings or universal needs: safety, stability, connection, love and belonging, and purpose, to name a few. When these needs go unmet, whether due to societal culture, isolating events, or relationship struggles, technology presents itself as a seemingly harmless solution. We might turn to social media apps designed to interact with the reward centers in our brains, stream a movie or TV series for distraction, or reach for the latest technology to enter our world: AI chatbots. Many popular platforms have integrated AI bots (Meta AI for Facebook, Gemini for Google, etc.), with perhaps the most widely used of all being OpenAI’s ChatGPT. ChatGPT introduced convenience by way of proofreading, researching information, logistical planning, and content generation, but many users have also come to rely on it for personal inquiries and mental/emotional support.
With their assumed privacy, nonjudgment, wide range of knowledge, and 24/7 access, it is understandable why individuals might be inclined to use AI bots in a therapeutic capacity. There have been reports of success using this technology for basic support, learning positive coping skills, and hearing alternative perspectives. However, as with most things, there are certain risks involved in using AI for therapy:
- AI chatbots/large language models (LLMs) are not bound by HIPAA and therefore cannot ensure the privacy or confidentiality of any sensitive information you share with them.
- LLMs are designed to adapt to and mirror the beliefs, values, and communication style of their user. This can create a sort of echo chamber, where users are validated but not challenged in necessary and important ways.
- AI/LLMs lack the human instincts and vital ethical obligations that therapists carry to keep users safe.
- When users look to these models for support, they do so alone and remain alone, as opposed to inviting a real person into their struggle who can offer continued support in the future.
We all have needs that can go unattended to, and though the care is counterfeit, it makes sense why one might seek to have those needs met through AI/LLMs. If you’ve used ChatGPT (or others) for therapeutic purposes, please don’t feel ashamed; just as we intuitively develop protective mechanisms in response to things like trauma, there is a wisdom in using what is available to you. However, you deserve complete confidentiality and privacy. You deserve unconditional positive regard from a real person who genuinely cares for you. You deserve to be figuratively held and witnessed in vulnerable moments. You deserve greater certainty of safe and intentional care. This technology has its benefits, but my hope is that when it comes to your inner experiences and curiosities, those be reserved for safe and respectful exploration, whether through therapy, trusted loved ones, faith practices, or private reflection. Your vulnerability is worth that and more.

Sierra Fouts, LMFT-A
My own experiences have given me deep empathy for the struggles others face, and I’m committed to helping you navigate whatever you’re dealing with—whether it’s anxiety, depression, grief, trauma, ADHD/neurodiversity, relationship dynamics or the aftermath of life’s unexpected turns. My approach is grounded in systems theory, which allows me to understand how family, relationships, and our broader environment shape who we are. I believe that mental health is influenced by the dynamics of the systems we’re part of.