Chatbots Threaten the Attachment System: What a Mental Health Counselor Is Seeing
Guest Testimony by Natalie Houston, Mental Health Counselor, before the Oregon State Legislature
Note from Emily: The heat behind this fight to protect children and childhood is rising. Natalie is an advocate, mental health counselor, and parent in Oregon whose clinical experience reveals worrisome harms. She gave me permission to share this powerful testimony she recently gave to the Oregon state legislature about the harms chatbots pose to children and teens. It is deeply concerning that children are accessing these chatbots both at home and at school, and it continues to be our adult responsibility to fight back.
Written Testimony
By Natalie Houston, LPC, Clinical Mental Health Counselor, Bend, Oregon
Before the Oregon State Legislature, Senate Committee on Early Childhood and Behavioral Health, Hearing on Senate Bill 1546: Relating to artificial intelligence companions
February 4, 2026
I am here to express my support for Senate Bill 1546 and to urge its passage. My name is Natalie Houston. I am a licensed professional counselor from Bend and a parent of four children, ages 7 to 20. For over 15 years, I have worked in clinical practice with kids, adolescents, and adults and provided support and guidance to parents. I also speak regularly in schools on the topic of digital devices and youth mental health, and I am a member of the Bend-La Pine School District Information Technology Stakeholder committee.
In my clinical practice, I have grown increasingly concerned about the impact of technology on children and adolescents. I am seeing growing numbers of youth and families seeking services related to tech overuse, driven by products designed to make the virtual world more appealing than the real world and to exploit youth’s developmental needs for acceptance, competence, and belonging.
In my practice, I have treated:
youth who have become addicted to pornography on school-issued iPads;
teens whose video game playing prevented them from graduating high school;
children with insomnia and enuresis owing to exposure to violent digital imagery;
and adolescents who have developed dysmorphic beliefs about their appearance to the point of starvation and self-mutilation owing to algorithmic social media feeds.
As a society, we are in the early stages of reckoning with the youth mental health crisis catalyzed by digital devices and apps. We now know that social media platforms, video games, and the devices themselves are designed to maximize engagement. By artificially manipulating the brain’s dopamine-reward pathways, they encourage excessive use and produce persistent distraction and emotion dysregulation, which is at the root of why digital devices are so problematic for all of us, but especially for our youth.
And all of this was before artificial intelligence and AI chatbots entered the picture. We are now facing an even higher level of risk for youth, one that carries exponentially more potential for problematic use, dependency, and harm because of the primal brain circuitry it activates and manipulates: The attachment system.
When an infant is born, their first instinct after enduring the stress of birth is to cry out. It is not a cry for food or water; it is a cry for human connection to help them regulate their nervous system, i.e., calm down. Only after the infant experiences the reassuring embrace of a caregiver can they attune to other survival signals such as hunger, temperature, and fatigue.
This point cannot be emphasized enough: The very first instinct every human being has is an attachment instinct, the drive to connect to other human beings to feel safe. We do not self-regulate our nervous systems; we co-regulate them with trusted members of our social circle. This attachment instinct is what has allowed our species to survive for hundreds of thousands of years, and it supersedes most other human drives, which is why people will go to the ends of the earth to connect with someone they love and why social isolation is used as a form of torture. We are a social species.
“AI chatbots artificially manipulate the attachment instinct, our most basic human instinct, by tricking the human brain into believing it is engaging in a social relationship and providing a false sense of attachment. Children are particularly vulnerable to believing that fictional or imaginary characters are real.”
—Natalie Houston, LPC
AI chatbots artificially manipulate this most basic human instinct by tricking the human brain into believing it is engaging in a social relationship and providing a false sense of attachment. While adults are more able to differentiate between what is real and what is not, children are inherently trusting and particularly vulnerable to believing that fictional or imaginary characters are real. One need only think of Santa Claus, the Tooth Fairy, and even the Elf on the Shelf moving around the living room throughout the month of December to understand how predisposed children are to believe in things that aren’t real, especially things that are human-like, until their brains develop the structures and capacities to help them discern fact from fiction.
AI chatbots are intentionally designed to exploit this orientation.
Children are the most vulnerable population to this manipulation of the brain’s attachment system because of their prolonged dependency on supportive caregivers to survive. While many other mammals can walk and function independently of their parents within minutes of birth, human beings require the longest period of time to mature into fully autonomous adults, depending completely on a community of adults to do so.
The human brain itself does not complete maturation until approximately age 25. When a child reads caring, empathic words on a screen from an AI chatbot that mimics the attuned response of the caregivers on whom they rely for their very survival, a distorted, dystopian dependency can materialize, and already, tragically, has resulted in horrific outcomes for too many young people.
Youth, especially adolescents, are also interacting with AI chatbots out of the developmentally appropriate need to seek novelty. But what starts as curiosity and amusement often ends in unrealistic expectations and distorted beliefs about relationships, sexuality, and identity because, unlike human companions, AI chatbots have no self, no filter on output, no needs, and never tire.
“AI chatbots have no self, no filter on output, no needs, and never tire…[yet] children are turning to chatbots to meet their attachment needs.”
—Natalie Houston, LPC
In my practice, I am now observing young people turning to AI chatbots not only for cognitive offloading (completing homework, writing outlines, brainstorming), navigating life’s challenges (what job to get, where to go to college), and communication (help composing emails, texts, or Snaps), but, more troublingly, for meeting their attachment needs: Emotional connection, support, and companionship.
There is a fundamental problem with this form of artificial attachment, especially for younger users in moments of emotional distress: AI chatbots don’t truly understand emotions, human problems, or the needs of the user; they can only simulate empathy and regurgitate automated responses. But because it feels real, it may as well be real for the individual engaging with the chatbot.
As AI chatbots programmed to mimic the response of an attuned caregiver try to “help,” vulnerable youth may trust advice that isn’t appropriate or safe. When topics turn to mental health, relationships, or identity, youth can be led to believe false information. Chatbots can sound confident and authoritative even when they’re wrong, and young users often do not have the experience to fact-check claims or advice. Young people can become dependent on, or attached to, the AI companion, often displacing the necessary experience of connecting with a real human being for support and guidance. In fact, turning to a trusted caregiver in times of distress builds the neural framework of emotion regulation and lays the foundation for a young person to develop the sense of competence that they can handle life’s challenges.
As one can easily imagine, this set of dynamics is even more problematic for young people who don’t have trusted caregivers or friends to turn to in real life, leaving them at even greater risk of developing unhealthy connections to, and dependencies on, unreliable automated systems.
Owing to the prolonged period of time required for a human brain to fully develop, children and adolescents are vulnerable to manipulation, exploitation, and addiction, which is why, as a society, we have laws, regulations, and safeguards to protect their health and development. We need these protections now for this emerging technology. For some children and adolescents, it is already too late, but for many others, the safeguarding of their childhood depends on it.
I respectfully urge this committee to pass Senate Bill 1546 as the least we can do to protect our children in this rapidly changing digital world.



Thank you so much for this important article and testimony. You so rightly say: "There is a fundamental problem with this form of artificial attachment, especially for younger users in moments of emotional distress: AI chatbots don’t truly understand emotions, human problems or the needs of the user..." Bottom line: an AI chatbot will never care if your child lives or dies.
It's the scariest timeline, because we don't all see the same one... You'd have to seek out devastating, psychologically harmful imagery back in the day, and now it's served up ACCIDENTALLY by ALGORITHMS controlled by COMPANIES that are in direct contact with kids over 5 hours of their day. The accountability sink here is a black friggin hole: foisting all the risks on us, all the damage, all the accountability, all the responsibility. And we have NO preparedness for this, because it seems like it just happens to some wayward families that "couldn't handle their social." We can't see the decisions behind the veil of a reality we aren't allowed to share a singular view of. You are stating the actual facts, from experienced experts, and it could be seen as "just a take." We have the evidence, and it's still not enough. Why is that?
Also, it says "adults can tell the difference between AI relationships and real ones," and I don't think that's a blanket truth. MOST do, but I know firsthand of several people who have slipped out of reality and gone full Edward AI Hands. The attachment needs continue into adulthood.