AI-driven emotion detection could revolutionize dementia diagnosis and care

CO-EDP, VisionRI | Updated: 09-10-2025 13:37 IST | Created: 09-10-2025 13:37 IST

A team of Italian researchers has unveiled an artificial intelligence approach that uses facial emotion analysis to support early and differential diagnosis of dementia. Their work, titled “AI-Based Facial Emotion Analysis for Early and Differential Diagnosis of Dementia” and published in Bioengineering, highlights a potentially cost-effective, non-invasive screening method that could help identify cognitive decline in its earliest stages and distinguish Alzheimer’s disease from other types of dementia.

The research, conducted by scientists from the Politecnico di Torino, the LINKS Foundation, and the University of Torino, points to an important development at a time when early detection is critical for deploying new disease-modifying treatments for Alzheimer’s disease. The findings also underscore the growing role of AI-powered tools in complementing traditional, often expensive and invasive, diagnostic methods.

Bridging the gap in early dementia detection

The study addresses one of the pressing questions in dementia care: whether AI-based tools can assist in detecting cognitive impairment before it progresses to overt dementia. The researchers also explored whether such tools can help differentiate Alzheimer’s disease from other dementia types, a challenge that often complicates clinical decision-making.

Data were collected from 64 participants: 28 healthy controls, 26 patients with mild cognitive impairment (MCI), and 10 patients with overt dementia; across the two patient groups, 17 had a confirmed diagnosis of Alzheimer’s disease. Participants were recorded as they watched an eight-minute video featuring standardized image–sound pairs that evoked moderate emotional responses. This approach provided a controlled, ethically sound way to capture facial reactions linked to emotional processing, an area often affected in dementia.

The AI pipeline extracted valence (positive or negative emotion) and arousal (emotional intensity) features from participants’ facial expressions using two convolutional neural networks trained on the large-scale AffectNet dataset and fine-tuned for the task. These emotion-derived features were then fed to three machine learning classifiers (K-Nearest Neighbors, Logistic Regression, and Support Vector Machine) to categorize participants by cognitive status. A nested cross-validation framework was used to limit optimistic bias in the performance estimates, an important safeguard given the relatively small dataset.
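The nested cross-validation setup described above can be sketched as follows. This is an illustrative example, not the authors’ code: the feature matrix, label split, classifier choice, and hyperparameter grid are all assumptions standing in for the study’s actual emotion-derived features and model configuration.

```python
# Sketch of nested cross-validation on hypothetical emotion-derived features.
# The inner loop tunes hyperparameters; the outer loop estimates performance
# on data never seen during tuning, limiting optimistic bias on small samples.
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical stand-in: 64 participants x 8 valence/arousal summary features.
X = rng.normal(size=(64, 8))
y = np.array([0] * 28 + [1] * 36)  # e.g., healthy controls vs. patients

inner_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
outer_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)

# One of the three classifier families mentioned in the study (SVM),
# with an assumed (illustrative) hyperparameter grid.
model = make_pipeline(StandardScaler(), SVC())
grid = GridSearchCV(
    model,
    param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01]},
    cv=inner_cv,
)

# Each outer fold refits the tuned model and scores it on held-out participants.
scores = cross_val_score(grid, X, y, cv=outer_cv)
print(f"nested-CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Because the data here are random noise, the reported accuracy hovers near chance; with real discriminative features the same scaffolding yields the kind of unbiased fold-averaged accuracies the study reports.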

Accuracy in differentiating cognitive decline

The study assessed how well the AI-based system could distinguish between different stages and types of cognitive impairment. The results showed encouraging performance:

  • 76.0% accuracy in distinguishing MCI from healthy controls
  • 73.6% accuracy in distinguishing dementia from healthy controls
  • 64.1% accuracy in three-class classification (MCI vs. dementia vs. healthy controls)
  • 75.4% accuracy in differentiating Alzheimer’s disease from other types of dementia

These findings are particularly significant for early detection. The system’s strong performance in identifying MCI highlights its potential for spotting cognitive decline at a stage when interventions can be most effective. Furthermore, the ability to distinguish Alzheimer’s disease from other dementia subtypes could guide tailored treatment strategies and improve patient outcomes.

The researchers also stress the importance of the emotion-eliciting protocol used in the study. By relying on standardized and internationally validated stimuli, the team ensured that the emotional responses measured were consistent, ethically appropriate, and clinically meaningful.

Toward accessible and non-invasive dementia screening

One of the key advantages of the approach is its non-invasive and cost-effective nature, making it a potential complement to existing diagnostic methods such as neuroimaging and biomarker testing. This accessibility is particularly important in resource-limited settings where advanced diagnostic tools may not be readily available.

The authors highlighted several next steps for improving the AI model’s utility. Expanding the dataset to include participants from more diverse demographic and ethnic backgrounds would help ensure generalizability. Addressing imbalances in sample sizes, particularly the relatively small number of overt dementia cases, could further refine the system’s accuracy. They also suggested exploring advanced AI architectures, such as visual transformers and hybrid BiLSTM models, to better capture temporal and dynamic aspects of facial expressions. Integrating multimodal features, including speech and other behavioral cues, could enhance the model’s ability to reflect the complex patterns associated with cognitive decline.

The study calls for longitudinal research to examine how facial emotion responses evolve over time in patients with different forms of cognitive impairment. Such data could strengthen the role of facial emotion analysis in continuous monitoring and risk stratification.

FIRST PUBLISHED IN: Devdiscourse