An estimated 300 million people globally, roughly 4% of the world’s population, are affected by some form of depression. However, recognizing this mental health condition can be difficult, particularly when individuals do not express their feelings of distress to friends, family, or healthcare providers.
To address this challenge, Sang Won Bae, a professor at Stevens Institute of Technology, is developing several AI-powered smartphone applications designed to non-invasively alert individuals—and those around them—when they may be experiencing depression.
“Depression is a major challenge,” Bae states. “We want to help.” She emphasizes that since most people rely on smartphones daily, these devices could serve as valuable tools for early detection.
Eye Snapshots Reveal Mood Changes
One of the systems Bae is working on, called PupilSense, continuously captures snapshots and measurements of a user’s pupils through their smartphone.
“Research conducted over the past three decades has shown a consistent correlation between pupillary reflexes and depressive episodes,” Bae explains.
PupilSense calculates pupil diameters by comparing them to the surrounding iris during 10-second photo bursts taken when users unlock their phones or access various apps.
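The app's actual pipeline isn't detailed in the article, but the core idea of comparing pupil to iris can be sketched as below. Normalizing pupil size by iris diameter in the same frame cancels out camera-to-face distance, and averaging over a burst smooths blinks and per-frame noise; all function names here are hypothetical.

```python
# Hypothetical sketch of the pupil-to-iris measurement idea, not
# PupilSense's real implementation.

def pupil_iris_ratio(pupil_px: float, iris_px: float) -> float:
    """Return pupil diameter as a fraction of iris diameter.

    Both diameters are in pixels from the same photo, so the
    camera-to-face distance cancels out of the ratio.
    """
    if iris_px <= 0:
        raise ValueError("iris diameter must be positive")
    return pupil_px / iris_px

def burst_average(ratios: list[float]) -> float:
    """Average the ratio over a burst of frames (e.g. a 10-second
    capture) to smooth out blinks and single-frame noise."""
    if not ratios:
        raise ValueError("empty burst")
    return sum(ratios) / len(ratios)
```

For example, `burst_average([pupil_iris_ratio(42, 120), pupil_iris_ratio(40, 118)])` would yield one distance-independent measurement per phone unlock.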
In a preliminary study involving 25 volunteers over four weeks, the system analyzed around 16,000 phone interactions once the pupil-image data was gathered. After training an AI to distinguish between typical and atypical pupil responses, Bae and her team processed the data and correlated it with the participants’ self-reported moods.
The most effective version of PupilSense, known as TSF, which utilizes only selected high-quality data points, achieved 76% accuracy in identifying moments when users reported feeling depressed. This accuracy surpasses that of the leading smartphone-based depression detection system currently under development, called AWARE.
“We will continue to advance this technology now that we’ve established its potential,” says Bae, who has previously created smartphone systems aimed at predicting binge drinking and cannabis use.
The system was first presented at the International Conference on Activity and Behavior Computing in Japan earlier this year and is now available as an open-source project on GitHub.
Analyzing Facial Expressions for Insights
Bae and her team are also working on a second system called FacePsy, which analyzes facial expressions to provide insights into mood states.
“Numerous psychological studies suggest that depression can be identified through nonverbal cues like facial muscle movements and head gestures,” Bae points out.
FacePsy operates in the background of a phone, capturing facial images whenever a user opens their device or commonly used applications. Importantly, these facial images are deleted almost immediately after analysis to protect user privacy.
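The capture-then-delete pattern the article describes can be sketched as follows; the feature names and the `extractor` hook are stand-ins for whatever model FacePsy actually runs, and only the derived numbers outlive the raw frame.

```python
# Hedged sketch of an "analyze, then discard the image" step, assuming
# a pluggable feature extractor; not FacePsy's real code.
from dataclasses import dataclass
from typing import Callable

@dataclass
class FaceFeatures:
    smile_score: float   # hypothetical derived metric
    eye_openness: float  # hypothetical derived metric
    head_yaw: float      # side-to-side head rotation (hypothetical)

def analyze_and_discard(
    image: bytearray,
    extractor: Callable[[bytes], FaceFeatures],
) -> FaceFeatures:
    """Run the extractor, then zero and drop the raw frame so only
    the derived features persist."""
    try:
        features = extractor(bytes(image))
    finally:
        # Overwrite and empty the buffer: no raw pixels outlive analysis.
        for i in range(len(image)):
            image[i] = 0
        image.clear()
    return features
```

After the call, the caller's buffer is empty, so only the small feature record remains to be correlated with mood reports.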
“We weren’t initially certain which facial gestures or eye movements would correlate with self-reported depression,” Bae explains. “Some findings were expected, while others were quite surprising.”
For example, the pilot study indicated that increased smiling may signal not happiness but a depressed mood.
“This could be a coping mechanism—people putting on a ‘brave face’ for themselves and others while feeling down,” Bae suggests. “Alternatively, it could be an anomaly of the study. Further research is necessary.”
The preliminary data also highlighted other potential indicators of depression, such as reduced facial movements in the morning and specific patterns of eye and head movements. For instance, yawning or side-to-side head movements during the morning were strongly linked to increased depressive symptoms.
Interestingly, wider-open eyes during morning and evening hours were also associated with potential depression, suggesting that outward signs of alertness or happiness can sometimes mask deeper feelings of sadness.
“Existing AI systems for detecting depression often require wearing devices or multiple pieces of equipment,” Bae concludes. “We believe the FacePsy pilot study represents a promising step toward creating a compact, affordable, and user-friendly diagnostic tool.”