AI-Powered App Developed to Detect Depression by Facial Expressions

Sat Mar 02 2024

LONDON: In a significant breakthrough for mental health, scientists have introduced an AI-powered smartphone application called MoodCapture, designed to detect mood and early signs of depression by analyzing facial expressions and environmental cues.

Developed by researchers at Dartmouth, MoodCapture utilizes artificial intelligence and facial-image processing software to analyze facial expressions and surroundings captured by the smartphone’s front camera during regular phone use.

The app evaluates each image for clinical indicators associated with depression, including facial expressions, eye movements, head positioning, and muscle rigidity, as well as environmental features such as lighting, colors, and the presence of other people in the frame.
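To make these indicators concrete, the sketch below shows one way such per-image signals could be organized in software. It is purely illustrative: the field names are hypothetical assumptions for this article, not MoodCapture's actual feature set or code.

```python
from dataclasses import dataclass

@dataclass
class ImageIndicators:
    """Hypothetical per-image signals of the kind described above.

    Field names are illustrative assumptions, not MoodCapture's real schema.
    """
    gaze_direction: float      # where the eyes are looking (e.g. downward gaze)
    head_tilt_deg: float       # head positioning relative to upright
    expression_valence: float  # crude positive/negative facial-expression score
    muscle_rigidity: float     # how little the facial landmarks move
    brightness: float          # lighting of the surroundings
    dominant_hue: float        # overall color of the scene
    other_faces: int           # whether other people appear in the frame

def to_feature_vector(x: ImageIndicators) -> list[float]:
    """Flatten the indicators into a numeric vector a classifier could score."""
    return [
        x.gaze_direction, x.head_tilt_deg, x.expression_valence,
        x.muscle_rigidity, x.brightness, x.dominant_hue, float(x.other_faces),
    ]
```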

How MoodCapture Identifies Symptoms of Depression

In a study of 177 people diagnosed with major depressive disorder, MoodCapture identified early symptoms of depression with 75% accuracy. Researchers correlated participants' self-reports of feeling depressed with specific facial expressions and background details captured by the app.

MoodCapture operates in real time, analyzing a sequence of images each time the user unlocks their phone. Over time, the app builds personalized associations between expressions and background details, enabling it to identify user-specific features that indicate the onset of depression.
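This personalization step can be pictured as keeping a rolling baseline of each user's own feature vectors and flagging sustained drift from that baseline. The sketch below, which reuses the hypothetical feature vector from the earlier example, is a guess at the general idea rather than the study's actual model.

```python
import statistics
from collections import deque

class PersonalizedBaseline:
    """Toy illustration of per-user personalization: compare each new unlock's
    feature vector against that user's own recent history rather than a
    population-wide norm. Illustrative only; not the Dartmouth model."""

    def __init__(self, window: int = 200, min_samples: int = 20):
        self.history = deque(maxlen=window)  # recent per-unlock feature vectors
        self.min_samples = min_samples       # wait for enough data to personalize

    def drift_score(self, features: list[float]) -> float:
        """Average z-score of this unlock's features against the user's baseline.

        Higher values mean the current image looks unusual *for this user*;
        an app could apply a threshold to this score before suggesting a
        proactive step such as going outside or contacting a friend.
        """
        if len(self.history) < self.min_samples:
            self.history.append(features)
            return 0.0
        z_scores = []
        for i, value in enumerate(features):
            column = [vec[i] for vec in self.history]
            mean = statistics.fmean(column)
            spread = statistics.pstdev(column) or 1.0  # avoid division by zero
            z_scores.append(abs(value - mean) / spread)
        self.history.append(features)
        return statistics.fmean(z_scores)
```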

Rather than directly notifying individuals that they may be entering a depressive state, the application aims to recommend proactive steps, such as spending time outdoors or reaching out to a friend.

With further development, MoodCapture could be available to the public within the next five years, changing how individuals manage and address depressive symptoms through proactive detection and monitoring of mental health.
