When Detecting Depression, the Eyes Have It
Stevens researcher develops two new AI-powered smartphone apps to spot warnings in our pupils, unconscious facial expressions and head movements
Hoboken, N.J., September 26, 2024 – An estimated 300 million people, or about 4% of the global population, are affected by some form of depression. But detecting it can be difficult, particularly when those affected don’t — or won’t — report negative feelings to friends, family or clinicians.
Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.
“Depression is a major challenge,” says Bae. “We want to help.”
Snapshot images of the eyes, revealing mood
One system Bae is developing, with Stevens doctoral candidate Rahul Islam, is called PupilSense. It works by constantly taking snapshots and measurements of a smartphone user’s pupils, then analyzing them.
“Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes,” explains Bae.
The system accurately calculates pupils’ diameters, measured relative to the surrounding irises, from 10-second “burst” photo streams captured while users are opening their phones or accessing social media applications.
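The idea of normalizing pupil size against the iris can be sketched as follows. This is an illustrative assumption, not PupilSense's actual code: the function names, pixel values and the simple averaging over a burst are hypothetical, and a real system would first need to locate the pupil and iris in each frame.

```python
# Hypothetical sketch: normalizing pupil diameter by iris diameter.
# Dividing by the iris size cancels out variation in camera distance,
# since both structures scale together as the phone moves nearer or farther.

def pupil_iris_ratio(pupil_diameter_px: float, iris_diameter_px: float) -> float:
    """Pupil diameter expressed as a fraction of iris diameter."""
    return pupil_diameter_px / iris_diameter_px

def burst_average_ratio(frames):
    """Average the ratio over a burst of (pupil_px, iris_px) measurements."""
    ratios = [pupil_iris_ratio(p, i) for p, i in frames]
    return sum(ratios) / len(ratios)

# Example burst of three frames with made-up pixel measurements.
burst = [(38.0, 110.0), (40.0, 112.0), (39.0, 111.0)]
print(round(burst_average_ratio(burst), 3))  # → 0.351
```

Averaging over the 10-second burst, rather than trusting any single frame, also smooths out blinks and momentary detection errors.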
In one early test, the system — embedded on the smartphones of 25 volunteers over a four-week period — collected pupil-image data from approximately 16,000 phone interactions. After teaching an AI to differentiate between “normal” pupillary responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers’ self-reported moods.
The best iteration of PupilSense — one known as TSF, which uses only selected, high-quality data points — proved 76% accurate at flagging times when people did indeed feel depressed. That’s better than the best smartphone-based systems currently being developed and tested for detecting depression.
“We will continue to develop this technology now that the concept has been proven,” adds Bae, who previously developed smartphone-based systems that can predict binge drinking without reading private documents, photos or other data.
The PupilSense system was first unveiled at the International Conference on Activity and Behavior Computing in Japan in spring 2024, and is now available to researchers open-source on the GitHub platform.
Facial expressions also tip depression's hand
Bae and Islam are also concurrently developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods.
"A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," notes Bae.
Like PupilSense, FacePsy runs in the background of a phone, capturing facial snapshots whenever a user unlocks the phone or opens commonly used applications. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)
"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, but some of them were surprising."
Increased smiling, for instance, appeared in the pilot study to correlate not with happiness but with potential signs of a depressed mood and affect.
"This could be a coping mechanism, for instance people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."
Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.
Interestingly, eyes detected as more wide open during the morning and evening hours also seemed to be associated with potential depression — suggesting that outward expressions of alertness or happiness can sometimes mask depressive feelings beneath a cheerful exterior.
"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," concludes Bae. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool that can help people learn when they're not feeling well and get help."
The FacePsy pilot study’s findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.
About Stevens Institute of Technology
Stevens Institute of Technology is a premier, private research university situated in Hoboken, New Jersey. Since our founding in 1870, technological innovation has been the hallmark of Stevens’ education and research. Within the university’s three schools and one college, 8,000 undergraduate and graduate students collaborate closely with faculty in an interdisciplinary, student-centric, entrepreneurial environment. Academic and research programs spanning business, computing, engineering, the arts and other disciplines actively advance the frontiers of science and leverage technology to confront our most pressing global challenges. The university continues to be consistently ranked among the nation’s leaders in career services, post-graduation salaries of alumni and return on tuition investment.
Stevens Media Contact
Kara Panzer
Director of Public and Media Relations
Division of University Relations
845-475-4594
[email protected]