Social anxiety and loneliness may drive AI dependence among students
AI dependence among university students is driven less by routine use and more by the reasons students turn to the technology, with escape from daily stress and the search for social connection emerging as the strongest warning signs, according to a new study. The findings suggest that risk rises when artificial intelligence becomes an emotional refuge, a substitute for difficult social interaction or a tool for avoiding distress, rather than when it is used simply for study or productivity.
The study, titled "Using Machine-Learning and Network Analysis to Investigate the Risk Factors of AI Dependence: The Crucial Role of Escape and Social Motivation," was published in Behavioral Sciences and is based on responses from 1,258 Chinese university students. The research used machine-learning models and network analysis to show that escape motivation, social motivation and social anxiety were the strongest predictors of AI dependence, a term the authors use to describe excessive reliance on AI tools rather than a clinically defined addiction.
Escape and social motivation are the strongest predictors
The researchers examined a broad set of possible risk factors, including Big Five personality traits, self-efficacy, depression, social anxiety, adverse childhood experiences and several motivations (escape, social connection, instrumental use and entertainment) for using AI.
Across the analysis, escape motivation and social motivation stood out. Students who used AI to avoid daily problems, emotional pressure or uncomfortable situations were more likely to report dependence. Students who used AI for social reasons, including interaction and emotional connection, showed a similar pattern of elevated risk.
The result shifts attention away from the technology itself and toward the psychological role AI plays in users’ lives. AI tools can help with study, writing, research and task completion, but the findings indicate that dependence risk rises when students use AI as an emotional refuge or a substitute for difficult social interaction.
Machine-learning results consistently ranked escape and social motivation among the leading predictors. Social anxiety also emerged as a major factor. These results were supported by traditional regression analysis, which found that neuroticism, social anxiety, AI escape motivation, AI social motivation and AI instrumental motivation significantly predicted AI dependence.
The study used four machine-learning algorithms: Elastic Net, Random Forest, XGBoost and LightGBM. The tree-based models showed strong results on training data but weaker results on test data, suggesting overfitting. Elastic Net showed better generalization and was judged more stable for identifying predictors. On the test set, it explained a modest share of variance in AI dependence, which the authors treated as a more reliable result than the stronger but less stable tree-based outcomes.
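The overfitting pattern described above can be illustrated with a small sketch: a flexible tree-based model fits training data almost perfectly but generalizes worse than a regularized linear model. The data below are synthetic and the variable structure is invented for illustration; this is not the study's code or dataset, only a demonstration of how a train-test R-squared gap signals overfitting.

```python
# Illustration only: synthetic data, invented effect sizes.
# Shows how a train/test R^2 gap can reveal overfitting in a
# flexible model (Random Forest) versus a regularized one (Elastic Net).
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 1258  # matches the study's sample size; everything else is made up
X = rng.normal(size=(n, 8))  # stand-ins for motivations, anxiety, traits
# Outcome driven mainly by two predictors plus noise, loosely echoing
# the finding that two motivations dominate prediction.
y = 0.6 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("Elastic Net", ElasticNet(alpha=0.1)),
                    ("Random Forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    gap = (r2_score(y_tr, model.predict(X_tr))
           - r2_score(y_te, model.predict(X_te)))
    print(f"{name}: train-test R^2 gap = {gap:.2f}")
```

On data like these, the forest shows a much larger gap because it partly memorizes noise in the training set, which is the behavior the authors cite when preferring Elastic Net's smaller but more stable test-set performance.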
Escape motivation and social motivation were the most central variables in the network, meaning they were highly connected with other risk factors. The strongest link in the network was between social motivation and escape motivation, suggesting that students who use AI to escape may also use it to meet unmet social needs.
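Centrality in this kind of psychological network is often measured as "strength": the sum of the absolute weights of a node's edges. The sketch below uses invented edge weights (the study's actual estimates are not reproduced here) simply to show how a variable such as escape motivation ends up most central when it connects strongly to many others.

```python
# Illustration only: edge weights are invented, not the study's estimates.
# "Strength centrality" of a node = sum of absolute weights of its edges.
edges = {
    ("escape_motivation", "social_motivation"): 0.45,  # strongest link reported
    ("escape_motivation", "social_anxiety"): 0.25,
    ("escape_motivation", "ai_dependence"): 0.30,
    ("social_motivation", "ai_dependence"): 0.28,
    ("social_anxiety", "ai_dependence"): 0.20,
    ("neuroticism", "social_anxiety"): 0.22,
}

strength = {}
for (a, b), w in edges.items():
    strength[a] = strength.get(a, 0.0) + abs(w)
    strength[b] = strength.get(b, 0.0) + abs(w)

for node, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {s:.2f}")
```

With these illustrative weights, escape motivation ranks highest because it carries three substantial edges, mirroring the study's description of escape and social motivation as the most connected variables.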
This pattern suggests that problematic AI reliance may not begin with AI use alone. It may develop when emotional discomfort, loneliness, social anxiety and poor coping strategies combine with easy access to responsive AI systems.
Social anxiety and emotional vulnerability shape AI reliance
Students with higher social anxiety were more likely to show dependence on AI. This aligns with earlier research suggesting that socially anxious users may turn to conversational AI because it offers interaction without the uncertainty, rejection risk or pressure of real-world communication.
For some students, AI may feel easier than human interaction because it is always available, responsive, nonjudgmental and personalized. These very features can make it useful as a learning tool, but they may also increase risk for users who already struggle with social contact.
The researchers also found that neuroticism predicted AI dependence. Students higher in neuroticism are more likely to experience emotional instability, stress and worry. For them, AI may become a convenient tool for reassurance, distraction or avoidance. The study suggests that these personality and emotional factors should be treated as part of the risk environment surrounding AI dependence.
Depression was positively correlated with AI dependence, but it did not remain a key predictor in the same way as escape motivation, social motivation and social anxiety. This distinction is important. The findings indicate that mental distress alone may not be the most direct signal. The stronger indicator may be how students respond to that distress, especially whether they use AI to avoid problems or meet social needs.
Adverse childhood experiences were also examined because earlier research has linked them to other forms of behavioral dependence, including internet addiction. In this study, adverse childhood experiences showed a positive correlation with AI dependence but did not emerge as one of the strongest predictors in the final models.
Self-efficacy also did not play the role some previous research might have suggested. The study measured general self-efficacy, or a person’s broad belief in their ability to handle problems. It was not significantly linked to AI dependence in the main regression results and did not rank among the strongest machine-learning predictors. The authors suggest the relationship may be more complex and may depend on mediating factors such as academic stress.
Grade level also showed a negative predictive effect, suggesting students earlier in their academic path may rely more heavily on AI than more senior students. The study does not claim that younger students are inherently more vulnerable, but the result raises questions about AI literacy, academic confidence and the need for early guidance in responsible AI use.
Overall, AI dependence appears to be tied to a cluster of psychological and social pressures. Students who feel anxious, isolated or emotionally overwhelmed may be more likely to use AI not only as a tool but as a coping mechanism. That difference may determine whether AI use remains productive or begins to interfere with daily functioning.
Researchers urge support, not simple restrictions
The researchers argue that responses to AI dependence should not be limited to cutting screen time or discouraging AI use. Such approaches may miss the underlying problem.
If a student relies on AI because they feel lonely, anxious or unable to manage academic and social pressure, simply reducing access may not solve the issue. It could even remove a coping mechanism without replacing it with healthier support. The study instead points to interventions that strengthen social connection, emotional coping and critical thinking.
University AI literacy programs should go beyond academic integrity and plagiarism concerns. Students may need guidance on when AI support is useful, when it becomes excessive and how to recognize emotional reliance. Academic support systems should also address stress and confidence, particularly among students who use AI heavily for reassurance or problem avoidance.
Mental health services may also need to consider AI use patterns during assessment. Heavy AI use should not automatically be treated as dependence, but clinicians and counselors may need to ask why the student is using AI. Use for task support is different from use driven by escape, loneliness or fear of social interaction.
The findings also raise questions for AI designers. Conversational systems are increasingly built to be emotionally responsive, personalized and human-like. Those features can improve user experience, but they may also increase attachment among vulnerable users. The study suggests that AI platforms could include safeguards for unusually heavy or emotionally dependent use, such as prompts that encourage users to seek real-world social support when appropriate.
The authors note that the term "addiction" remains debated in the AI context and warn against overpathologizing normal engagement with emerging tools. Instead, they use "AI dependence" to describe potentially harmful overreliance that may affect mental health, critical thinking or daily functioning.
The study is not without constraints. To begin with, it relied on self-reported questionnaire responses, which can be affected by memory errors or social desirability. The sample was limited to university students in China and was predominantly female, which may affect how widely the findings apply. The design was cross-sectional, so it cannot prove that escape motivation or social anxiety causes AI dependence. It can identify strong associations and predictors, but not final causal pathways.
The researchers call for future studies that track real AI use behavior, include wider age groups and use longitudinal or experimental designs. Such work could test whether reducing loneliness, improving social confidence or strengthening critical thinking changes patterns of AI dependence over time.
FIRST PUBLISHED IN: Devdiscourse