Students Showcase Life-Changing Tech at 10th Annual Stevens Innovation Expo
Projects supporting health and wellness take center stage at the 2025 exhibition of senior design projects
At Stevens Institute of Technology’s 10th annual Innovation Expo on Friday, May 9, 2025, senior design teams will put their empathy, engineering and ingenuity on full display in projects that tackle real-world health challenges head-on.
Reflecting the university’s commitment to health and medicine research as one of its six foundational pillars, three of this year’s student innovations will focus on transforming lives through tech. From an at-home, AI-guided ear infection diagnostic tool to a brain-powered hand exoskeleton to a wearable navigation system for people who are visually impaired, these innovations demonstrate how Stevens students are designing with impact and humanity in mind.
Clear Ear: At-Home Ear Infection Diagnosis
Just about every parent and childcare provider has been there: unsure of what to do while watching a crying child who tugs at an ear but cannot explain what’s wrong. Is it an ear infection? Does the child need to go to the pediatrician or urgent care, which may mean the caregiver has to miss work for the appointment? Would antibiotics help, or would they simply contribute to antibiotic resistance? Could waiting too long lead to hearing loss or speech delays?
A mission to mitigate that family distress inspired five Class of 2025 Stevens biomedical engineering students to develop a way to diagnose ear infections from home. Their Innovation Expo project, Clear Ear, is an at-home diagnostic device for pediatric middle ear infections.
“I remember how painful my own frequent childhood ear infections were,” said Caitlyn Cianci, who will be working with Johnson & Johnson MedTech as an associate clinical account specialist after graduation. “An at-home device like Clear Ear would have alleviated the stress my parents felt.”
Since September, the team has been conducting extensive research into childhood ear anatomy and the limitations of existing diagnostic solutions.
“We spoke with an ear, nose and throat specialist who expressed concerns about the inaccuracy and hassle of existing at-home devices,” said Natalie Rofail, who plans to continue in the Stevens Accelerated Master’s Program in biomedical engineering before pursuing pediatrics in medical school.
The Clear Ear device integrates a digital otoscope (to see into the ear canal and eardrum) and an affordable tympanometer (to measure eardrum movement) into a single unit. The connected mobile app guides parents through testing and uses AI-driven analysis to classify the result as an infection, fluid buildup, a healthy eardrum or an earwax obstruction. It then recommends next steps, including connecting users to telehealth providers for treatment options.
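The team’s model and decision rules aren’t public, but the flow the students describe — classify the readings, then map each classification to guidance — can be sketched in Python. Everything below, from the compliance cutoff to the next-step wording, is an illustrative assumption rather than Clear Ear’s actual logic:

    # Illustrative sketch only: Clear Ear's AI analysis is not public.
    # This toy rule-based stand-in mirrors the described flow --
    # classify the ear readings, then recommend a next step.
    from enum import Enum

    class EarState(Enum):
        INFECTION = "possible middle ear infection"
        FLUID = "fluid buildup behind the eardrum"
        HEALTHY = "healthy eardrum"
        WAX = "earwax obstruction"

    # Hypothetical caregiver guidance keyed to each classification.
    NEXT_STEPS = {
        EarState.INFECTION: "Connect with a telehealth provider to discuss treatment.",
        EarState.FLUID: "Retest in a day or two; see a pediatrician if it persists.",
        EarState.HEALTHY: "No action needed.",
        EarState.WAX: "Address the earwax blockage, then retest.",
    }

    def classify(peak_compliance_ml: float, canal_blocked: bool,
                 eardrum_inflamed: bool) -> EarState:
        """Toy stand-in for the app's AI analysis. A flat tympanogram (very
        low peak compliance) suggests middle ear fluid; fluid plus a visibly
        inflamed eardrum suggests infection. The 0.2 ml cutoff is illustrative."""
        if canal_blocked:
            return EarState.WAX
        if peak_compliance_ml < 0.2:
            return EarState.INFECTION if eardrum_inflamed else EarState.FLUID
        return EarState.HEALTHY

    state = classify(peak_compliance_ml=0.1, canal_blocked=False, eardrum_inflamed=True)
    print(f"{state.value}: {NEXT_STEPS[state]}")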
“It has been fascinating learning about and building this device, and it has been so rewarding to see what has come from this project,” said Stephanie Cueva, who will be studying abroad before earning her master’s degree in public health. “Our Clear Ear solution will allow caretakers to have an in-depth, at-home assessment of their child’s ear health, a novel solution in this market. It can make a difference in a family’s life.”
This impressive technical achievement is also a reflection of the team’s commitment to empathy, innovation and real-world impact.
“These students have shown a tremendous amount of growth,” said Sally Shady, teaching associate professor and associate chair of undergraduate studies in the Department of Biomedical Engineering, and faculty advisor to the team. “They have a positive attitude and a passion for solving clinical problems. I am sure they will make the Stevens community proud.”
SYNC Designs Brain-Powered Hand Exoskeleton
Injuries and other medical conditions can impair something as simple — and essential — as the ability to use your hands. Inspired to action by how stroke-related impairments have affected their loved ones, the members of the Stevens SYNC Designs team set out to lend their own creative helping hand. Their wearable hand device responds to brain signals, helping stroke survivors regain manual control — one thought at a time.
The team includes Class of 2025 electrical engineering students Owen Deem and Cooper Foote and software engineering students Nick Accardo and David Frost.
“Other solutions also use electromyography (EMG) sensors to detect muscle movements,” noted Deem, who plans to pursue his Ph.D. after graduation. “We believe that just using brain signals will help users strengthen the mind-muscle connection, restoring the use of their hands.”
The setup is simple to use but relies on complex electrical and software engineering. Wearing an electroencephalography (EEG) headset that reads eight channels of brain signals, the user calibrates and begins to train the system by merely thinking about opening and closing the hand.
A connected pipeline of Python programs sorts, cleans and displays the real-time brain signal data, then sends those signals to a small Raspberry Pi computer on the wearable exoskeleton. As the user mentally pictures opening or closing the hand, the exoskeleton makes it happen in real life.
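The article doesn’t spell out the pipeline’s internals, so the Python sketch below is a minimal stand-in for one plausible stage of it: filter a window of eight-channel EEG data, decode an open-or-close intent, and ship the command to the Raspberry Pi. The sampling rate, mu-band feature, threshold and network transport are all assumptions, not SYNC Designs’ actual implementation:

    # Minimal sketch of one plausible EEG -> exoskeleton stage, under
    # assumed parameters; only the eight-channel count comes from the article.
    import socket

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    FS = 250          # assumed sampling rate in Hz
    N_CHANNELS = 8    # eight channels of brain signals, per the article

    def mu_band_power(window: np.ndarray) -> float:
        """Band-pass each channel to the 8-12 Hz mu rhythm and return the
        mean power across channels."""
        sos = butter(4, [8, 12], btype="bandpass", fs=FS, output="sos")
        filtered = sosfiltfilt(sos, window, axis=1)
        return float(np.mean(filtered ** 2))

    def classify_intent(window: np.ndarray, threshold: float) -> str:
        """Imagined movement suppresses mu power (event-related
        desynchronization), so low power is decoded here as 'close'. The
        threshold would be learned during the calibration step described above."""
        return "CLOSE" if mu_band_power(window) < threshold else "OPEN"

    def send_command(cmd: str, host: str = "raspberrypi.local", port: int = 5005) -> None:
        """Forward the decoded command to the Raspberry Pi on the exoskeleton
        over a plain TCP socket (an assumed transport, for illustration)."""
        with socket.create_connection((host, port), timeout=1.0) as sock:
            sock.sendall(cmd.encode() + b"\n")

    if __name__ == "__main__":
        # Stand-in for a one-second headset window: 8 channels x 250 samples.
        window = np.random.randn(N_CHANNELS, FS)
        print(classify_intent(window, threshold=0.5))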
This all-hands-on-deck project required significant teamwork. Fundamental challenges included creating the exoskeleton prototype with no mechanical engineers, and rapidly becoming familiar with advanced EEG software platforms.
“The first time we fully processed calibration data and saw the results, we were ecstatic,” said Accardo, who plans to work in software engineering after graduation. “We were in the Electrical and Computer Engineering lab, and when the output charts popped up on our monitor, we found other groups behind us rooting for our success. It was a turning point, because we saw how impactful this solution could be, and we realized that we could accomplish our goal.”
Along the way, the students leaned not only on what they learned in class, but also on each other — growing their skills, confidence and resilience with every conquered obstacle.
“I have been most impressed by the knowledge they’ve obtained from Stevens courses and their own outside interests, as well as their ability to adapt to problems and come up with innovative solutions,” said Bernard Yett, teaching assistant professor, Department of Electrical and Computer Engineering, and the team’s faculty advisor.
Helping others use their brains to overcome physical limitations has been a fitting exercise for the team, which has had to apply its own ingenuity to clear tangible hurdles.
“These challenges will only increase the quality of our product, and each challenge has been worth the reward of seeing us get one step closer to our goal,” said Frost, who will be working as a full stack software engineer after graduation. “We’re excited because we can see how our idea could help transform the daily lives of millions of people.”
C-ALL: Wearable Navigation System for the Visually Impaired
When Jules, a college student who is blind, described the difficulty of navigating unfamiliar, unseen spaces — even with the help of her beloved guide dog, Ruby — four Stevens students knew they had found their Innovation Expo project.
Their first-of-its-kind wearable navigation system combines powerful light detection and ranging (LiDAR) tech with gentle vibrations and a custom iPhone app. Dubbed C-ALL (Cognitive Assistance with LiDAR Localization), this smart tech can help users avoid obstacles and get where they’re going — safely and confidently.
“Guide dogs and canes are helpful for people who are blind or visually impaired, but a trained dog can be expensive to acquire and maintain, and a cane only alerts the user to things it can touch,” said Neeti Mistry, who is majoring in software engineering with a minor in computer science. “Many existing tech-based mobility solutions are either too costly or impractical for widespread use. We wanted to build an option that is affordable, adaptable and powered by technology that’s already in people’s hands to enable greater independence and safety.”
C-ALL consists of two main parts:
A mobile app for the iPhone 12 Pro and newer models, which already include LiDAR sensors. The user enters the destination in the app and wears the iPhone on a lanyard with the app open. As the user walks, the system collects real-time spatial data to detect obstacles — including face-level hazards that canes often miss — and continuously recalculates and adjusts its cues.
Two wearable gloves. One glove delivers vibration feedback that guides the user around obstacles to find the safest path, while the other directs the individual along the best route through GPS guidance. The vibration and distance settings are adjustable for a personalized experience.
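How the obstacle glove translates LiDAR readings into haptics isn’t detailed in the article, but the adjustable distance-to-vibration behavior it describes might look something like this Python sketch; the 3-meter range and linear ramp are assumptions for illustration, not C-ALL’s actual firmware:

    def vibration_intensity(distance_m: float, max_range_m: float = 3.0) -> float:
        """Map an obstacle's LiDAR distance to a haptic intensity in [0, 1].

        Closer obstacles vibrate harder; beyond max_range_m the motor stays
        off. max_range_m stands in for the user-adjustable distance setting
        the article mentions.
        """
        if distance_m >= max_range_m:
            return 0.0
        # Linear ramp: full strength at contact, fading to zero at the range limit.
        return 1.0 - (distance_m / max_range_m)

    # Example: an obstacle 0.75 m away with the default 3 m range -> 0.75 intensity.
    print(vibration_intensity(0.75))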
“We had to design and code the mobile app, build custom hardware and integrate them,” Mistry shared. “The most difficult part was making sure the software and the hardware spoke the same language. There was a lot of trial and error, but every breakthrough reminded us why we started this in the first place.”
Core to the entire project was the team’s commitment to a user-centered approach: the students incorporated feedback from Jules and other visually impaired individuals from the beginning.
“We didn’t want to overwhelm the user or add to any stigma,” Mistry explained. “It had to be subtle, effective and customizable.”
The team aims to publish the app to the iOS App Store and continue refining the system through user testing.
“These students tackled a real-world challenge with a user-focused, interdisciplinary approach,” said David Darian Muresan, teaching professor, Department of Systems and Enterprises, and faculty advisor to the team. “Their work seamlessly integrates electronics, mechanical design and software programming into a cohesive, functional system that’s both technically impressive and socially transformative. Their commitment to improving mobility and independence for visually impaired individuals highlights the power of human-centered design. They exemplify the Stevens spirit — blending technical excellence with meaningful impact.”