Internet scams trigger long-term mental health crisis



CO-EDP, VisionRI | Updated: 18-06-2025 18:28 IST | Created: 18-06-2025 18:28 IST

New research warns that victims of internet scams suffer lasting emotional trauma and lack access to effective support systems. The findings, published in the International Journal of Environmental Research and Public Health (2025, Vol. 22, Article 938), call for human-like AI mental health solutions to bridge critical care gaps.

The study, titled “The Mental Health Impacts of Internet Scams,” sheds light on the widespread psychological devastation caused by internet scams. Beyond the well-documented financial consequences, the paper presents new evidence of long-term mental health impacts that include anxiety, depression, post-traumatic stress disorder (PTSD), and social isolation.

The research integrates a literature review with an intrinsic case study of a group of Australian victims targeted by an elaborate investment scam. These individuals experienced emotional consequences extending well beyond the initial loss. Insomnia, persistent anxiety, depressive symptoms, relationship breakdowns, and even suicidality were reported among the 25 participants. These effects often endured for months and, in many cases, years. Even those who recovered some of their losses continued to suffer psychological distress, indicating that the damage cannot be measured in financial terms alone.

According to the study, a major driver of trauma is the shame and embarrassment victims feel, which often discourages them from seeking help. The social stigma of falling for scams silences victims, delaying psychological recovery and exacerbating feelings of isolation. The result is a disturbing loop: victims feel both violated and ashamed, which prevents them from accessing the care and support systems that might otherwise aid their recovery.

How did the case study illustrate real-world harms?

The case study portion of the research is particularly compelling due to the author’s dual role as both investigator and victim. Balcombe conducted the study using archival data, interviews, and a questionnaire completed by scam victims from across Australia who were defrauded between 2023 and 2024. The investment scam used fake websites and advertisements to lure victims, impersonated financial advisors, and directed victims to deposit funds into phony accounts.

Participants described a pattern of emotional deterioration following the scam. Initial disbelief gave way to fear, loss of self-esteem, and guilt. Some reported using antidepressants or anxiolytics for over a year, with PTSD symptoms persisting regardless of the extent of financial loss. The psychological harm was not limited to the individual; many victims reported that the scam disrupted their social lives, leading to emotional withdrawal and, in some cases, the breakdown of intimate relationships.

The findings further reveal systemic inadequacies. Victims voiced dissatisfaction with the response from financial institutions and regulatory bodies like the Australian Financial Complaints Authority (AFCA), which they felt lacked empathy, accountability, and timeliness. Many reported that these interactions compounded their distress rather than alleviated it. Several also criticized the lack of follow-up and transparency from police investigations. This compounded the feeling of abandonment by institutions tasked with their protection.

Can artificial intelligence improve support for scam victims?

The study concludes with a call for urgent innovation in mental health support systems, particularly the integration of digital mental health (DMH) services and artificial intelligence. It highlights the potential of emotionally attuned, trauma-informed AI chatbots to serve as accessible and stigma-free companions for victims. These AI tools could help guide users through emotional regulation, promote resilience, and offer crisis support when traditional systems fall short.

While apps such as Wysa, Woebot, and Youper are already showing promise in managing anxiety and depression, the study emphasizes the importance of designing AI chatbots specifically tailored to scam victims. These tools would ideally combine memory-based emotional intelligence with lived experience-informed design, enabling them to offer personalized support in moments of acute emotional need.
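The paper does not prescribe an implementation, but a minimal sketch can make the idea of memory-based emotional support with human escalation concrete. Everything below is an illustrative assumption rather than a description of the study's design: the class name, the crisis keyword list, and the canned responses are hypothetical, and a real system would use far richer language understanding.

# Hypothetical sketch of a memory-based support chatbot of the kind the
# study calls for: it keeps a short emotional history per user and escalates
# to human care on crisis signals. Keywords and responses are illustrative.

from collections import deque

CRISIS_TERMS = {"suicide", "end my life", "can't go on"}  # assumed triggers

class SupportBot:
    def __init__(self, memory_turns: int = 5):
        self.memory = deque(maxlen=memory_turns)  # recent user messages

    def respond(self, message: str) -> str:
        lowered = message.lower()
        # Never handle a crisis alone: route to human care, as the study urges.
        if any(term in lowered for term in CRISIS_TERMS):
            return ("It sounds like you are in real distress. I am connecting "
                    "you with a human counsellor now (e.g. Lifeline 13 11 14).")
        self.memory.append(message)
        if len(self.memory) > 1:
            # Reference earlier disclosures so support feels continuous.
            return ("Earlier you mentioned: \"" + self.memory[0] + "\". "
                    "How are you feeling about that now?")
        return "I'm here to listen. What happened, in your own words?"

Even this toy version reflects the study's central design constraint: the bot remembers what the victim has shared, but hands off to a human the moment a crisis is signalled.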

AI can also assist in upstream prevention by flagging suspicious activity, educating users about scam tactics, and integrating with financial platforms to offer real-time alerts. However, the paper stresses that AI solutions must be designed ethically, incorporating safeguards against data loss, algorithmic bias, and hallucination. Importantly, such tools should not replace human care but rather complement it, especially in crisis scenarios requiring real-time intervention.
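To illustrate the kind of upstream safeguard described above, the sketch below shows a minimal rule-based check a financial platform might run before releasing a transfer. All thresholds, field names, and keyword lists are assumptions chosen to mirror the scam pattern the study's victims described (large first deposits to newly created accounts framed as investments); they are not taken from the paper.

# Hypothetical sketch: a minimal rule-based scam-alert check. Thresholds,
# payee fields, and keywords are illustrative assumptions, not from the study.

from dataclasses import dataclass

@dataclass
class Transfer:
    amount: float            # transfer amount in AUD
    payee_account: str       # destination account identifier
    payee_age_days: int      # days since the payee was first seen by the bank
    claimed_purpose: str     # free-text purpose entered by the customer

SUSPECT_KEYWORDS = {"investment", "crypto", "guaranteed return", "advisor"}
LARGE_FIRST_TRANSFER = 5_000.0   # assumed threshold for a first-time payee

def scam_risk_alerts(t: Transfer) -> list[str]:
    """Return human-readable alerts for a proposed transfer; empty if none."""
    alerts = []
    if t.payee_age_days < 30 and t.amount >= LARGE_FIRST_TRANSFER:
        alerts.append("Large transfer to a recently created payee account.")
    if any(k in t.claimed_purpose.lower() for k in SUSPECT_KEYWORDS):
        alerts.append("Purpose mentions investment terms often used in scams.")
    return alerts

if __name__ == "__main__":
    proposed = Transfer(amount=20_000, payee_account="AU-000123",
                        payee_age_days=4,
                        claimed_purpose="Investment with my new advisor")
    for msg in scam_risk_alerts(proposed):
        print("ALERT:", msg)   # a real platform would pause the transfer

A production system would rely on learned models rather than two hand-written rules, but even simple checks of this kind capture the "new payee, large amount, investment framing" combination that characterized the scam in the case study.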

The study asserts that building trustworthy, inclusive, and accessible digital mental health infrastructures is not just a technical challenge; it is a moral imperative. Scam victims deserve effective, compassionate support that recognizes the complexity of their trauma. AI tools can help fill that gap, but only if deployed with transparency, cultural sensitivity, and sustained institutional backing.

First published in: Devdiscourse