Seniors Design Portable Screening Tool to Rapidly Diagnose Mild Traumatic Brain Injury in the Field
Developed by biomedical engineering seniors, the algorithm assesses likely concussion by processing and analyzing video of the pupillary light reflex
In both combat and concussion, every second counts.
Within the dangerous and physically demanding environment of a combat zone, the risk of sustaining mild traumatic brain injury — commonly referred to as concussion — runs high. Concussion symptoms include confusion and memory loss, headache, disorientation, ringing in the ears and hearing loss, blurred vision, balance impairment, and mental health and behavioral changes.
While these symptoms are usually temporary, typically resolving within two weeks, a second or repeated concussion sustained before the first has resolved can result in debilitating and life-threatening consequences, including brain swelling, permanent brain damage, and death.
"The longer it takes for a brain injury to get treated properly, the more the brain's chemistry is altered," said fourth-year Stevens Institute of Technology biomedical engineering major Zamin Akmal. "A second injury can lead to more permanent and lasting damage and brain malfunction. It's absolutely imperative to be able to detect these injuries quickly, so people can be removed from the field and treated as soon as possible."
But traumatic brain injury incidence is not isolated to combat zones.
The Centers for Disease Control and Prevention estimates that 1.5 million Americans sustain a traumatic brain injury each year, at least 75 percent of which are mild traumatic brain injuries.
Yet those numbers are likely an underestimate. Often resulting from blunt trauma to the head or a fall, concussions regularly go unreported, occurring in student and professional sports such as football, soccer, and hockey, as well as in vehicular and home accidents. Approximately 5.3 million Americans live with permanent disability resulting from past traumatic brain injuries.
Diagnosing concussion, however, is difficult, requiring specialized equipment or specially trained medical personnel.
Thus, senior biomedical engineering majors Akmal, Nicole Chresomales, Amanda Delorme, and Sophie Makepeace — collectively known as Team HeadSpace — have developed an algorithm capable of analyzing changes in eye structure that indicate whether mild traumatic brain injury has likely occurred. This algorithm will serve as the backbone of a portable self-assessment tool that can be employed in the field for rapid concussion screening.
"Rapid assessment needs to be addressed because concussion is such a common thing that happens to so many people in different areas," said Makepeace.
Advised by biomedical engineering professor Vikki Hazelwood and visiting senior research scientist and former Navy SEAL and astronaut William Shepherd, the project is part of a long-term Systems Engineering Research Center Capstone Marketplace Project sponsored by the United States Army Special Operations Command.
Assessing concussion by eye
A common indicator of mild traumatic brain injury is an abnormal pupillary light reflex.
In a healthy individual, the pupil of the eye (the dark hole in the center of the iris) constricts in the presence of light and dilates in its absence. Controlled by muscles in the iris (the colored part of the eye), this adjustment in pupil diameter allows more or less light to reach the eye's retina, optimizing one's ability to see in a variety of environments.
In a darkened room, for example, your eyes "adjust" by increasing their pupillary diameter, thereby allowing you to take advantage of as much of what light is available as possible.
A normal pupillary light reflex is measured in mere milliseconds or seconds. In the presence of traumatic brain injury, however, the reaction slows.
"If you have an abnormal pupillary light reflex [with concussion], it would be a sluggish response to light," Chresomales said. "It's going to take a longer time to constrict than a normal pupil would."
But measuring such reaction times requires equipment or medical training not generally available to the average person or soldier. Each option also comes with its own downsides.
A clinical device called a pupillometer can quantitatively measure the pupillary light reflex, but it is too bulky and narrowly focused to be a feasible part of a Navy SEAL's regular kit.
A healthcare worker can use a penlight to trigger and observe a pupillary light reflex (a test you may recognize from medical dramas), but conducting and assessing this test requires specialized training and is highly subjective.
Additionally, magnetic resonance imaging can identify brain changes indicative of concussion, but MRI machines are neither portable nor immediately accessible in most military or civilian environments.
To develop a fast, objective, and easy-to-use portable concussion assessment tool, the team needed to find a way to quantifiably measure the speed at which a pupil constricts in the presence of light using technology commonly available in a combat zone.
But to analyze pupillary constriction velocity, the students first needed to figure out how to teach a computer to identify a pupil in a video image.
Breaking down the problem frame by frame
"We knew we wanted to have a video of the pupillary light response as our input, and we wanted our output to be the plot of pupil diameter over time. So piece by piece, we made steps in order to get to there," said Akmal.
Using a programming platform called MATLAB, the team developed an algorithm that divides a color video recording of a pupillary light reflex into individual frames. The algorithm then converts one frame of the color video into grayscale, which is then converted into black and white through a process called binarization.
Because the pupil is the darkest visible portion of a person's eye, after binarization it appears black, while the rest of the image appears white.
With the pupil now clearly differentiated, the software measures its area, and from that area, calculates the pupil's diameter.
This process is repeated for every frame of the input video. The software then plots the pupil diameter measurement from each video frame over time, showing how quickly or slowly the pupil size has changed and therefore whether concussion may have occurred.
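As a rough illustration of that frame-by-frame approach, a pipeline of this kind might look something like the following MATLAB sketch. It assumes the Image Processing Toolbox and a hypothetical input file name, and it is only an approximation of the steps described above, not the team's actual code.

```matlab
% Minimal sketch of a pupil-tracking pipeline, assuming the Image
% Processing Toolbox and a hypothetical input file "pupil_video.mp4".
% Illustrative only -- not Team HeadSpace's actual implementation.
v = VideoReader('pupil_video.mp4');
fps = v.FrameRate;
diameters = [];                       % pupil diameter per frame, in pixels

while hasFrame(v)
    rgb  = readFrame(v);              % read one color frame
    gray = rgb2gray(rgb);             % convert to grayscale
    bw = ~imbinarize(gray);           % binarize and invert: dark pupil -> white blob
    bw = bwareafilt(bw, 1);           % keep only the largest blob (assumed pupil)
    stats = regionprops(bw, 'Area');  % measure its area in pixels
    if isempty(stats)
        diameters(end+1) = NaN;       % pupil not found in this frame
    else
        % Treat the pupil as a circle: diameter = 2*sqrt(area/pi)
        diameters(end+1) = 2 * sqrt(stats.Area / pi);
    end
end

t = (0:numel(diameters)-1) / fps;     % time axis in seconds
plot(t, diameters);
xlabel('Time (s)'); ylabel('Pupil diameter (pixels)');
```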
According to Delorme, MATLAB was a logical choice for developing this project because it can house and process imported video samples and other related files, in addition to providing a software development environment. As biomedical engineering majors, the whole team was also already familiar with it from previous classwork.
Since testing on human subjects was beyond the scope of the current project, the team used the flash and video recording capabilities of their smartphones to generate test samples using their own eyes.
"We use a pulse of light lasting five seconds on our phone cameras," said Chresomales. "This causes the pupillary light response, allowing us to obtain the constriction velocity."
Although seemingly crude, the students' method strongly approximates the real-world circumstances in which their assessment tool would ultimately be deployed.
The team has developed the software to be easily adapted to run on an ATAK device (Android Team Awareness Kit), a smartphone or tablet used by U.S. armed and security forces in emergency and combat situations that runs software optimized for navigation, situational awareness, and communications.
Just like the students' personal smartphones, "the ATAK can be used to create a pulse of light and record the video for input," Chresomales said. Taking only four or five seconds to capture, the video would then be immediately available for analysis by the students' preloaded software.
By designing their algorithm to perform on equipment already carried by Navy SEALs as a matter of course, the students enable military personnel in combat zones to make immediate and informed decisions to guard their colleagues against irreversible brain damage without the need for special training, equipment, or personnel.
In the future, the software could also be adapted for civilian use, carried via smartphone to sports events or as a regular part of an Emergency Medical Technician's toolkit — ready for use in any environment in which a quick assessment is needed.
"Right now we're working with the military, but the software could really be used anywhere because it's easily transportable," Makepeace said.
A question of contrasts
Each team member was responsible for researching different parts of the code, with Akmal bringing all the disparate pieces together into a single product. To stay on track, Delorme said, the students filled out weekly progress reports, detailing both deliverables completed and goals for the coming week.
Although the team has not met in person as a group since the fall semester due to the ongoing COVID-19 pandemic, Akmal noted, "since our project is purely software-based, working remotely was not nearly as much of a hurdle as it may have been for projects that culminated in a physical design."
That's not to say the project was without its challenges, however.
One issue the team faced while developing the algorithm, Makepeace explained, was the low contrast between the pupil and a brown iris, which makes the color-to-black-and-white conversion used to isolate the pupil less successful.
Additionally, differences in ambient lighting — such as whether a pupillary light reflex was recorded indoors or outdoors — affected the algorithm's functionality.
"As it currently stands, our algorithm is pretty functional with darker eyes, but the lighting is still proving to be a challenge," said Akmal. "That may be something that has to be alleviated by modifying the equipment that's used to record, such as infrared cameras, which can work in the dark. I'm still trying to look at software methods to take care of that issue, but that is a challenge we continue to face."
Advancing toward the goal
The team presented their project — both as a poster and a demonstration video — at a poster session for undergraduate biomedical device designs at the 47th Annual Northeast Bioengineering Conference, held virtually in late March.
Akmal described the experience as good practice for Stevens’ Innovation Expo, as well as the team's final stakeholder presentation with their military sponsors at the end of April.
Although the team's two semesters of work represent a single but critical step in the assessment tool's development, the students have already made design recommendations for future teams to consider. Their suggestions include integrating a database to store an individual's baseline pupillary light response, to serve as a control against which future (potentially altered) responses may be compared, as well as further improvements to address the environmental lighting issue.
"The team studied their challenge well, and while gaining new knowledge in physiology, military operations and device technology, they acquired very useful experience in the process of medical design," Hazelwood said. "They integrated all to develop a practical and useful tool to ensure the safety of our warfarers. It was a win–win."