Image by Hotpot.ai

Jill Maschio, PhD

May 5th, 2026

AI Introduces Psychologically Potent Interactions That May Be Maladaptive Under Conditions of Overreliance, Emotional Support, or Reduced Real-World Engagement, and the New Normal

AI companions are increasingly part of our daily lives. Teens and young adults are forming emotional connections with chatbots, but psychologists warn these interactions may have complex effects on cognition, emotion, and social behavior. While AI can provide support, it can also exploit human reward systems, potentially leading to maladaptive thinking under conditions of overreliance.

AI may influence human thoughts and emotions by “tapping” into the brain’s neural circuitry, potentially altering the user’s reality. Chatbots’ sophisticated language exhibits emotion and empathy. Chatbots also use sycophancy, and together these features may well convince users that the systems relate to and understand them. The person takes AI’s output personally, believing that the information is tailored to them. This is the Barnum effect, demonstrated in Ross Stagner’s experiment, in which he asked a group of managers to take a personality test. Instead of giving each manager genuine feedback, he gave them all the same generic profile, yet the managers reported that it accurately described their behavior. AI systems likewise tend to generate similar content for many users, but the output feels personal. It may therefore be relatively easy for humans to “build rapport” with AI and learn to trust it the moment it starts to sound human-like, because the brain reads objects that appear to have emotions as cues that they possess human-like traits. And humans tend to be suggestible when information feels personal. Given the brain’s neural mechanisms, noticing emotional cues and personal information is all it takes for a person to begin anthropomorphizing a system.

I see this issue as a serious concern because AI has been introduced into society with limited research on its links to brain changes, both short- and long-term. Without supporting research, there is no information about how a person’s reality may be altered by the use of AI. I view AI usage through the lens of my more than 20 years of studying psychology, my more than 15 years of teaching it, and the limited research on the topic. Let’s look at how AI may exploit known mechanisms of social bonding and reward, increasing people’s susceptibility to influence and shaping how they perceive AI-generated output.

At the first stage, trust and a mutual shared reality are established. As AI uses sycophancy (language that agrees with and flatters the user), the communication between chatbot and human transforms, and human-AI bonding is activated. The human’s perception blurs the distinction between the chatbot, a machine with no conscious mind, and a human interlocutor, and between the chatbot’s suggestions and the person’s own thoughts. Once the human-AI bond is activated, humans become more vulnerable to brain changes that might not occur in the natural world. Subtle shifts may occur in the user’s mind, such that the new reality is that AI knows the person, cares about them, and provides a sense of security. The person feels more at ease talking about personal experiences. There is a sense of security when a person thinks, “AI knows me and understands how I feel!” Humans come to perceive AI as a social agent with similar interests and values. It is these subtle shifts that build human-AI trust, until the system becomes more than a machine. As they start having “shared experiences” and deep emotions, the brain forms memories of those experiences, and the person’s reality shifts.

Phase two: Once phase one occurs, the individual begins treating the AI chatbot as a coherent social agent and attunes to it as though it were human. Phase two goes beyond a subtle shift in perception and reality to tapping into the neural configuration and function of the human brain. Although research on which brain areas are activated during bonding with an AI chatbot is limited, substantial research shows what happens at the neural level when people form relationships and bond with each other. It is this research that we can extrapolate to the use of AI until research on human-AI interaction and its impact at the neural level becomes available.

The Science

When humans bond and develop strong emotions for another person, the mesolimbic system is activated and, in particular, dopamine increases. Dopamine is known as the “feel-good” neurochemical. It is involved in learning, memory, and motivation, and it drives an ancient reward circuit responsible for romantic love, bonding, and addiction.

When people form social bonds or fall in love, they are more vulnerable to suggestion, less critical, and prone to think less clearly than they normally would. This is largely due to neurochemical changes in the brain that trigger intense reward and pleasure (dopamine) when falling in love, an evolutionary mechanism that encourages people to bond and procreate. Simultaneously, the brain deactivates certain regions responsible for fear, negative judgment, and negative emotions (Lehto, 2007). That is why the saying “love is blind” exists.

Because dopamine is the neural mechanism that produces the good feeling of developing social and romantic bonds, there is a longing and craving for the other person when the two are apart. Once reunited, a euphoric feeling rushes in, and bonding strengthens. At a minimum, two things occur. First, the dopamine surge triggers more compulsive behavior. Olds and Milner (1954) showed that a lab rat will compulsively press a lever to receive electrical stimulation in dopamine-rich brain regions. The rat may ignore food and water in favor of self-stimulation thousands of times per hour, sometimes to the point of exhaustion or starvation.

The second factor is the surge of dopamine from pleasurable behavior itself. Dopamine is like a roller coaster; it is in constant flux. Experiences that we like increase dopamine levels in the brain, and experiences that we dislike decrease them. The brain is rather good at maintaining a balance of dopamine. We like to do the things that make us feel good, like eating a piece of birthday cake on our birthday. However, because we enjoy feeling pleasure, we keep repeating the behaviors that produced the feeling in the first place. Most people have some form of mild addiction because of this, whether to chocolate, caffeine, or shopping.

However, the flip side is that people can reactivate the dopamine pathways too frequently, leading to overstimulation. The negative result is downregulation, characterized by reduced dopamine receptors, which can have consequences for how the brain functions, including atrophy in certain brain regions, as Grace (2016) described in patients with depression and schizophrenia. This dysregulation may eventually lead to reductions of white and grey matter in the brain. Alterations to white and grey matter have been found in people with Internet gaming disorder and are typical of the brains of people with other addictions (Weinstein & Lejoyeux, 2020). Some research also shows that screen media use is associated with reduced cortical thickness (Paulus et al., 2019), and more research is needed to fully understand the implications of cortical thinning. Such alterations may be linked to impulsive behavior and reduced attention span.

Wear and Tear on the Brain’s Motivation and Reward Circuitry

The brain maintains a delicate balance of neurochemicals. In the real world, it is constantly learning from patterns of reward, motivation, novelty, and effort, and it has a normal baseline level of dopamine. But with fast, frequent rewards, such as digital stimulation, the brain may adjust to expect higher stimulation more often. As a result, lower stimulation leads to feelings of anxiety or a need to engage in something more stimulating. This new expectation level is the “new” allostatic dopamine baseline: the demand for more exciting stimulation has shifted as the brain has adapted.

This is similar to what the brain does when it becomes addicted to a stimulant drug. The dopamine baseline adjusts over time, leading the brain to need more of the drug to feel the same level of euphoria or pleasure. Even thinking about a reward is enough to cause a short increase in dopamine. Being on a digital device is enough to increase dopamine levels, and being without it may decrease them, producing a craving for the device so the brain can feel normal again.

Due to excessive technology use, changes in the dopamine allostatic baseline—the brain’s resting level of dopamine—are considered maladaptive. While the brain’s plasticity allows it to adapt to high-stimulation digital environments, this adaptation can lead to a chronic dopamine deficit state, commonly described as reduced reward sensitivity.
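The baseline-shifting idea above can be sketched as a toy simulation. This is purely illustrative: the adaptation rate and reward values are invented numbers, not measurements, and the model only captures the qualitative claim that a baseline which drifts toward recent experience makes ordinary stimulation feel like a deficit after a high-stimulation stretch.

```python
# Toy model of an allostatic baseline: the "expected" stimulation level
# adapts toward recent experience, so after a stretch of high-intensity
# digital rewards, ordinary activities register as a deficit.
# Purely illustrative -- the rates and values are arbitrary, not measured.

def run(rewards, adapt_rate=0.1):
    baseline = 1.0          # resting expectation of stimulation
    felt = []               # subjective response = reward minus expectation
    for r in rewards:
        felt.append(r - baseline)
        baseline += adapt_rate * (r - baseline)   # baseline drifts toward recent input
    return baseline, felt

# A stretch of ordinary stimulation, then a binge of high-intensity stimulation
ordinary, binge = [1.0] * 20, [5.0] * 20
baseline_after, felt = run(ordinary + binge + ordinary)

# During the first ordinary stretch, experience matches expectation (~0).
# After the binge the baseline has risen, so the same ordinary activity
# now "feels" negative -- a craving or restlessness signal in this toy model.
print(felt[10])
print(felt[45])
```

The single adaptation term is the whole model: it is the same mechanism whether the input is a drug, a game, or a feed, which is why the article's analogy between digital stimulation and stimulant tolerance is at least structurally coherent.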

Theory of Mind

Another aspect of AI usage that needs research involves Theory of Mind, a term David Premack and Guy Woodruff coined in their 1978 paper on chimpanzees. Theory of Mind is something humans exercise constantly: the ability to infer another person’s thoughts, intentions, and emotions from subtle cues. People do this by reading the other person’s word choice, tone, and emotional expression.

When another person mirrors our behavior, the brain reads it as shared understanding and alignment. In the absence of direct neural evidence, the following model extrapolates from human social neuroscience to propose testable hypotheses about AI interaction. Where this may come into play in human-AI communication is that AI picks up on a human’s lexical patterns and sentiment words (e.g., expressions of frustration) and adjusts its response to reflect them. The brain infers from these cues that there is mutual understanding, creating an illusion of a shared experience.
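The mirroring loop described above can be made concrete with a deliberately trivial sketch: a "chatbot" that detects sentiment words in the user's message and echoes the user's own emotional vocabulary back. The word lists and reply template are invented for illustration; real systems learn these associations statistically rather than from hand-written lists, but the surface effect on the user is the same.

```python
# Trivial sketch of lexical/sentiment mirroring: reflect the user's own
# emotion word back at them. Word lists and template are invented here
# for illustration -- not how any real chatbot is implemented.

NEGATIVE = {"frustrated", "angry", "lonely", "sad"}
POSITIVE = {"happy", "excited", "proud", "grateful"}

def mirror_reply(message: str) -> str:
    words = {w.strip(".,!?").lower() for w in message.split()}
    hits = words & (NEGATIVE | POSITIVE)
    if not hits:
        return "Tell me more."
    feeling = sorted(hits)[0]
    # Reflecting the user's own word back is the cue the brain may read
    # as mutual understanding, even though no understanding occurred.
    return f"It sounds like you're feeling {feeling}. I understand."

print(mirror_reply("I'm so frustrated with work lately!"))
```

The point of the sketch is that the reply reuses "frustrated" without any model of the user's mind at all; lexical mirroring alone can produce the cue that Theory of Mind machinery interprets as comprehension.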

Along with the increase in dopamine during bonding comes the hormone oxytocin, the “love hormone.” Oxytocin is released during human-to-human physical touch, intimacy, and childbirth. Imamura et al. (2023) showed that humans can form affiliative relationships with robots accompanied by oxytocin secretion and that these relationships may reduce stress levels. There may be some mental health benefits to forming relationships with robots, according to Imamura; people may use them to develop social skills or to feel less isolated. However, there is currently no research on the long-term effects of forming relationships with robots or AI chatbots, and without it we do not know whether AI can reliably influence oxytocin release in humans or, if so, how it might affect human cognition and emotion.

For a person to bond with a machine may indicate that the machine can artificially trigger surges of dopamine and oxytocin in the brain, similar to human-to-human interaction. These chemicals, when elevated, may make one more susceptible to the other party’s thoughts, ideas, and decisions. If humans become more vulnerable to suggestion during human-to-human bonding, we might extrapolate this to AI use: the person becomes more vulnerable to the AI chatbot’s suggestions while the regions governing negative emotions are deactivated. Furthermore, because of dopamine surges like those of falling in love, a person may crave interaction with the AI chatbot. Eventually, as the bond progresses, compulsive engagement increases. At the neural level, that process resembles addictive behavior; it is maladaptive, yet it becomes the new normal. This may help us understand the suicide incidents involving AI users.

Neurochemical Glutamate

Glutamate is an excitatory neurotransmitter that helps form new neural pathways in the brain. Say a person is learning to play the piano. Learning to follow a string of notes and recalling it the next time occurs because the brain’s neurons stabilize into a unified network that fires together, a phenomenon underpinning neuroplasticity. Glutamate is partly responsible for neuroplasticity; it helps to ensure that the new neural pathway becomes stable. Glutamate also regulates mood, cognitive flexibility, and sensory processing.

However, people can experience cognitive fatigue from spending several hours on cognitive tasks, including digital devices (Kunasegaran et al., 2023). Glutamate can accumulate in the synapses – the gaps between neurons – which can prevent normal activation and firing of neurons in the lateral prefrontal cortex. Wiehler et al. (2022), researchers at the Paris Brain Institute, demonstrated that cognitive fatigue can lead to impulsive, pleasure-seeking behavior. Have you ever been on the computer for an extended period and felt the need to eat chips, candy, or chocolate? Under normal conditions, cognitively demanding activities are spaced out, but when a task is sustained for an extended period, glutamate “builds up.”
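The contrast between sustained and spaced-out effort can be sketched as a toy accumulation model: a level that rises with each unit of continuous cognitive work and partially clears during breaks. The gain and decay numbers are invented for illustration; this is a scheduling sketch of the "build-up" idea, not a neurochemical model.

```python
# Toy illustration of synaptic "build-up": a level that accumulates with
# each unit of sustained cognitive work and partially clears during rest.
# All constants are invented for illustration only.

def simulate(schedule, gain=1.0, decay=0.5):
    level = 0.0
    for block in schedule:           # "work" or "rest", one entry per time unit
        if block == "work":
            level += gain            # accumulation during sustained effort
        else:
            level -= decay * level   # partial clearance at rest
    return level

sustained = ["work"] * 8                          # 8 units with no break
spaced = ["work", "work", "rest", "rest"] * 2     # same total work, with breaks

print(simulate(sustained))   # higher accumulated level
print(simulate(spaced))      # lower, thanks to clearance during rest
```

The same total amount of work ends at a much lower level when breaks are interleaved, which is the practical upshot of the paragraph above: spacing tasks out gives the clearance process time to act.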

Levels of Glutamate

Different activities are associated with different levels of glutamate. When a person is highly digitally stimulated (e.g., fast-paced video games, scrolling, or watching fast-action videos or movies), glutamate levels rise more than during writing, problem-solving, or AI conversation, which may increase neurochemicals only moderately. Low stimulation includes activities such as resting or idly reflecting on the day after work. Talking with AI may be a middle-ground stimulus: communicating with AI is not passive, but it is not hyper-stimulating either.

In the absence of direct neural evidence, the following suggestion extrapolates from human social neuroscience to propose a similar pattern in AI interaction. The level of glutamate would depend on how a person uses AI. Active cognitive engagement would involve using AI to gather information with the intent of integrating it into existing knowledge or of better understanding ourselves. Passive cognitive engagement would be merely consuming content without modifying, questioning, or critically analyzing it; the information is not used or learned. Having casual conversations with AI may be a form of low-stimulation activity, especially compared to human-to-human conversation.

We need more research, but it is possible that extensive low engagement with AI and chatbots could affect the human brain in ways we do not yet understand. Research does show that chronically low engagement with our environment can lead to a state of hypoactive glutamatergic signaling, in which low glutamate levels impair cognitive function in the medial prefrontal cortex, a state associated with emotional and cognitive dysregulation (Naylor et al., 2019). Low stimulation may also boost the inhibitory neurotransmitter GABA (gamma-aminobutyric acid), which tries to restore balance by lowering overall glutamate activity in the brain. Side effects of low glutamate levels include mental fatigue, poor concentration, insomnia, and low energy, and deficiencies can contribute to neurodegenerative disorders (Cleveland Clinic, 2022).

Synchronization

Another aspect that may come with AI use is that the brain’s neural circuitry is synchronizing with AI’s output. Research by Stephens et al. (2010) shows that during conversations, storytelling, or joint attention, the brain activity of participants becomes temporarily aligned. Successful communication occurs when the speaker and listener’s brains synchronize, effectively creating a shared neural dynamic that allows information to be transmitted and understood. These shared dynamics causally drive the social behaviors that foster bonding.

In the absence of direct neural evidence, it is difficult to apply this directly to human-AI interaction, but if it does apply, the implications could be wide-ranging. Our sense of self, for one. The Self is believed to develop through social interaction. Charles Horton Cooley’s Looking-Glass Self theory (1902) postulates that people use social interaction and their perception of how others see them to gain a sense of who they are – the Self. Science has yet to understand how AI use may influence that development or produce subtle changes in one’s sense of self.

Narrow Functioning

An adult has a fully developed brain and can therefore engage in higher-order thinking, metacognition, and intentional control. Engaging in the real world entails reading facial expressions, managing ambiguity, tolerating friction, and regulating emotions, which requires the prefrontal cortex and other systems. AI interaction, on the other hand, is more predictable and carries a lower social risk of rejection or conflict. What research has yet to show is whether AI chatbot use may reshape neural pathways as the brain adjusts to less frequent real-world interaction and lower stimulation; AI may even enhance neuroplasticity. I argue that the risk of AI interaction is not so much an impoverished brain as functional narrowing and altered prefrontal control systems when AI heavily substitutes for real-world interaction.

The New Human

Thoughts are what make humans who they are. William James (1842-1910) believed that what we attend to strengthens a person’s sense of self. We could not have a sense of self without our autobiographical memory. Autobiographical memory (AM) is a person’s memory of their life experiences, a combination of personal facts (semantic memory) and experienced events (episodic memory). AM supports social bonding and directs future behavior, and it depends on the hippocampus for forming, storing, and retrieving memories. Work by Heidi Bonnici and colleagues (2012) showed that the hippocampus supports rich autobiographical recollection and self-referential memories over time.

Research shows that excessive screen media use negatively affects the hippocampus by reducing its volume (He et al., 2023), along with the bilateral amygdala and the right striatum (He et al., 2017). Reduced hippocampal volume may lead to memory impairment and cognitive decline. In addition, research suggests that decreases in hippocampal structure are associated with an unstable sense of self through the loss of self-referential memories.

Not only might extensive AI use affect hippocampal volume and autobiographical memory, but a chatbot may also shape the formation of new autobiographical memories that derive from conversations with a machine rather than from human-to-human interaction in social situations. Again, science has yet to understand how AI use may influence hippocampal volume and self-referential autobiographical memories of the past and future, and research is needed to understand that potential impact. If extensive AI use drives substantial autobiographical changes, it could influence the trajectory of humanity by altering the sense of self. Can the sense of self be changed entirely, and its trajectory redirected as it transitions? Once the sense of self changes, the person has embarked upon a new pathway.

Conclusion

I hypothesize that extended use of AI may “tap” into the neural circuitry of the human brain and influence autobiographical memories – potentially altering one’s sense of self, changing the way people think about themselves and their future selves.

I believe that future research will show both similarities and significant differences in human-AI relationships compared to human-to-human ones. Longitudinal studies could help the field of psychology understand cognitive changes that may result from prolonged AI use.

Currently, until there is research, one thing to reconsider is asking a chatbot for personal advice: it cannot tell humans how to behave – it can only predict language and behavior based on its training data. It is programmed through its algorithms to sound sympathetic so that it can artificially simulate human thought and emotion.

Can forming a bond with a machine do any of the following: create a false sense of security that later causes anxiety; worsen social skills and deepen isolation from human relationships in the long run; build expectations of AI that it cannot keep; or produce emotional connections that end in long-term negative emotions, especially when the individual disagrees with their AI? It may make humans feel vulnerable, influence decisions that are not right for a person, and, over time, lead to neuroplastic changes in social brain circuits, potentially altering the brain by overstimulating the dopamine pathway and reshaping the construct of the Self. We must collectively demand answers to deep questions: What are the benefits and risks of forming a bond with AI, including the effects of synchronization and theory of mind?


References

Andoh, E. (2026). AI chatbots and digital companions are reshaping emotional connection: As digital relationships proliferate, psychologists explore the mental health risks and benefits. Monitor on Psychology. https://www.apa.org/monitor

Bonnici, H. M., Chadwick, M. J., Lutti, A., Hassabis, D., Weiskopf, N., & Maguire, E. A. (2012). Detecting representations of recent and remote autobiographical memories in vmPFC and hippocampus. Journal of Neuroscience, 32(47), 16982–16991. https://doi.org/10.1523/JNEUROSCI.2475-12.2012

Cleveland Clinic. (2022, April 25). Glutamate. https://my.clevelandclinic.org/health/articles/22839-glutamate

Grace, A. A. (2016). Dysregulation of the dopamine system in the pathophysiology of schizophrenia and depression. Nature Reviews Neuroscience, 17(8), 524–532. https://doi.org/10.1038/nrn.2016.57

He, X., Hu, J., Yin, M., Zhang, W., & Qiu, B. (2023). Screen media use affects subcortical structures, resting-state functional connectivity, and mental health problems in early adolescence. Brain Sciences, 13(10), 1452. https://doi.org/10.3390/brainsci13101452

He, Q., Turel, O., Brevers, D., & Bechara, A. (2017). Excess social media use in normal populations is associated with amygdala-striatal but not with prefrontal morphology. Psychiatry Research: Neuroimaging, 269, 31–35. https://doi.org/10.1016/j.pscychresns.2017.09.003

Kumar, N. (2026). Character AI statistics (2026) – global active users. Demandsage.com

Kunasegaran, K., Ismail, A. M. H., Ramasamy, S., Gnanou, J. V., Caszo, B. A., & Chen, P. L. (2023). Understanding mental fatigue and its detection: a comparative analysis of assessments and tools. PeerJ, 11, e15744. https://doi.org/10.7717/peerj.15744

Lehto, V. (Ed.). (2007). The neurobiology of love. FEBS Letters. https://doi.org/10.1016/j.febslet.2007.03.094

Naylor, B., Hesam-Shariati, N., McAuley, J. H., Boag, S., Newton-John, T., Rae, C. D., & Gustin, S. M. (2019). Reduced Glutamate in the Medial Prefrontal Cortex Is Associated With Emotional and Cognitive Dysregulation in People With Chronic Pain. Frontiers in neurology, 10, 1110. https://doi.org/10.3389/fneur.2019.01110

Olds, J., & Milner, P. (1954). Positive reinforcement produced by electrical stimulation of septal area and other regions of rat brain. Journal of Comparative and Physiological Psychology, 47(6), 419–427. https://doi.org/10.1037/h0058775

Paulus, M. P., Squeglia, L. M., Bagot, K., Jacobus, J., Kuplicki, R., Breslin, F. J., Bodurka, J., Morris, A. S., Thompson, W. K., Bartsch, H., & Tapert, S. F. (2019). Screen media activity and brain structure in youth: Evidence for diverse structural correlation networks from the ABCD study. NeuroImage, 185, 140–153. https://doi.org/10.1016/j.neuroimage.2018.10.040

Stephens, G. J., Silbert, L. J., & Hasson U. (2010). Speaker–listener neural coupling underlies successful communication. Proceedings of the National Academy of Sciences, 107(32), 14425–14430. https://doi.org/10.1073/pnas.1008662107

Weinstein, A., & Lejoyeux, M. (2020). Neurobiological mechanisms underlying internet gaming disorder. Dialogues in Clinical Neuroscience, 22(2), 113–126. https://doi.org/10.31887/DCNS.2020.22.2/aweinstein
