Research & Innovation

Stevens Innovations Impress at 2018 LifeSciences Summit

Emerging research in artificial intelligence and orthopedic robotics impresses experts and industry leaders

Stevens professor Koduvayur Subbalakshmi speaks with students Zongru (Doris) Shao and Harish Sista at the 2018 LifeSciences Summit. CREDIT: Shreya Parekh

The 2018 LifeSciences Summit, which took place earlier this summer, was an industry conference connecting academic researchers with investors and business partners in the hopes of moving healthcare innovations through clinical development. Stevens Institute of Technology impressed with two research projects from its newly launched Institute for Artificial Intelligence: an AI-driven app that can help patients self-diagnose Alzheimer’s disease, and two robotic devices that help patients with stroke and spinal muscular atrophy walk better.

Replacing a medical team with an "AI machine"

Alzheimer’s is the sixth-leading cause of death in the United States, according to the Alzheimer's Association. It has no cure and treatment options are minimal. The best chance an Alzheimer’s patient has at a relatively normal life is early, accurate diagnosis.

Enter the Core Cognitive Assessment (CoCoA) interactive chatbot.

CoCoA is an AI mobile application that uses verbal and visual commands to walk users through a self-administered diagnostic. It tests users’ visual perception and speech abilities in a way similar to the leading clinical diagnostic. The difference is that the app’s diagnostic can be completed anywhere, without medical supervision.

"I made an app that walks the patient through all those questions on their own," says app creator and Ph.D. student Harish Sista. He describes his goal in creating the app as "trying to replace a medical team with an AI machine. This test can be taken by anyone, anywhere in the world, at any time."

The app utilizes features people are already familiar with, like touch sensing and verbal cues, so it’s easy to adapt to. It also introduces features that are entirely new to the field, like video integration and AI-driven diagnostic algorithms.
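The article doesn’t describe CoCoA’s internals, but a self-administered screening app of this kind typically sequences a few tasks and aggregates their scores. Here is a hypothetical, much-simplified sketch of that flow; the task names, weights, and scoring rule are all invented for illustration and are not CoCoA’s actual algorithm.

```python
# Hypothetical sketch of how a self-administered screening app might
# sequence its tasks and aggregate scores. All names and thresholds
# here are illustrative assumptions, not CoCoA's real design.

TASKS = [
    ("name_objects", "verbal"),  # speech task: name pictured objects
    ("copy_shape",   "touch"),   # visual task: trace a shape on screen
    ("recall_words", "verbal"),  # memory task: repeat a word list
]

def run_screening(responses):
    """responses maps task name -> score in [0, 1]; returns overall result."""
    total = sum(responses.get(task, 0.0) for task, _ in TASKS)
    score = total / len(TASKS)
    # A real app would send this score to a clinician rather than
    # issue a diagnosis outright.
    return {"score": round(score, 2), "refer": score < 0.7}

result = run_screening({"name_objects": 0.9, "copy_shape": 0.8, "recall_words": 0.4})
print(result)  # -> {'score': 0.7, 'refer': False}
```

The point of the sketch is the structure: each modality (speech, touch) contributes a normalized score, and the app only flags whether a clinician should follow up.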

All of these features work together to help patients quickly and efficiently self-diagnose and seek treatment. Once Sista incorporates more rigorous audio and video diagnostic tools created by Stevens postdoctoral researcher Zongru (Doris) Shao, CoCoA will be a comprehensive, unprecedented tool for detecting early signs of Alzheimer’s and other neurodegenerative diseases.

"We’re trying to integrate both image and audio analysis into the same platform," Sista says. "It’s very challenging, using graphics to analyze the way a user touches the screen. That’s a totally new thing for a developer to do. It’s the most complicated technology I’ve worked with—but I like a challenge!"

Zongru (Doris) Shao and Harish Sista. CREDIT: Shreya Parekh

The app is available in a limited test version on the iOS platform right now. Sista is refining the app for a wide release this summer, as well as working on Android and open source versions for release later this year. He is also working on ways to allow patients to send test scores to doctors and have doctors recommend early intervention options.

The app is an extension of the machine learning research emerging from the Stevens Institute for Artificial Intelligence (SIAI). Led by professor and SIAI founding director Koduvayur Subbalakshmi as well as professor Rajarathnam Chandramouli, SIAI’s research combines machine learning techniques and natural language processing tools to create diagnostic tools for neurodegenerative diseases. CoCoA is also supported by a generous gift from LGS Innovations.

Replacing a medical team with a robotic orthotic

Nearly 800,000 people in the United States suffer a stroke each year, according to the National Institutes of Health. Many of those who survive have difficulty walking. Their best options for improvement are gait correction while connected to a machine, or having a team of therapists physically move their legs on a treadmill. Neither option is comfortable or affordable.

The Stevens Ankle-Foot Electromechanical (SAFE) orthosis is a simpler, safer option.

The device, built by Ph.D. student Yufeng Zhang, adds sensors and motors to an existing orthotic framework to help a patient correct their walk themselves. It is also lighter, more comfortable and easier to use than existing ankle rehabilitation devices.

"The SAFE device vibrates, giving patients instant feedback," Zhang says. "This helps them adjust movement and walk correctly in real-time."

The sensors track the force of each step, and if a patient isn’t applying enough force to complete the step, the device gently pushes them. Sophisticated algorithms use the sensor data to learn the phase of a patient’s gait cycle and predict the actions the robotic mechanics should take to compensate for any deficit. Zhang created the device’s control system, while '18 valedictorian Roger Kleinmann worked on the motors. "Everything runs in real-time as you walk with the orthosis," Zhang says.
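The control loop described above can be pictured as a simple sense-decide-act cycle. The following is a deliberately crude sketch under assumed names and thresholds; the real SAFE controller’s gait-phase estimation and actuation logic are not published in this article.

```python
# Hypothetical, simplified sketch of a gait-assist feedback loop like the
# one described for the SAFE orthosis. The threshold, phase heuristic and
# command names are illustrative assumptions only.

FORCE_THRESHOLD = 0.6  # assumed fraction of body weight needed to complete a step

def estimate_gait_phase(force_history):
    """Crude phase estimate: rising force -> stance, falling -> swing."""
    if len(force_history) < 2:
        return "unknown"
    return "stance" if force_history[-1] >= force_history[-2] else "swing"

def control_step(force_history):
    """Return the actuator command for the latest sensor reading."""
    phase = estimate_gait_phase(force_history)
    force = force_history[-1]
    if phase == "stance" and force < FORCE_THRESHOLD:
        return "assist"   # motor gently pushes to complete the step
    if phase == "swing":
        return "vibrate"  # haptic cue to adjust foot placement
    return "idle"

# Example: a weak stance-phase reading triggers motor assistance.
readings = [0.1, 0.3, 0.5]
print(control_step(readings))  # -> "assist"
```

In a real device this loop would run at a fixed control rate and the phase estimator would be a learned model rather than a two-sample comparison, but the feedback structure is the same.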

Right now, Zhang is working to refine and test the device on stroke patients at the Kessler Institute for Rehabilitation, one of the top rehabilitation hospitals in the nation. "We are excited about this opportunity and eager to improve the device based on this feedback," says Zhang’s advisor, assistant professor Damiano Zanotto.

Zhang is also working to make it more consumer-friendly. "I hope this device can be commercialized, and that many people can benefit from it," he says.

Helping a medical team with a smart insole

Spinal muscular atrophy (SMA) is a neurodegenerative disorder affecting about 9,000 people in the United States, according to the SMA Foundation. Patients with SMA lose motor neurons in the spinal cord, causing muscle atrophy and weakness that make walking difficult, particularly for elderly patients.

SportSole aims to change that.

The device, built by Ph.D. student Huanghe Zhang, is a lightweight wireless insole that gives patients feedback for correcting their walk. Using sensors to track different kinds of movement, SportSole is designed as a portable, affordable tool that helps clinicians treat patients by offering feedback on different therapies and predicting relevant improvements, with a particular focus on endurance and falls.

SportSole is also more comprehensive with its data—and more accurate—than other commercial systems.

"The insoles will work in any shoes," Zhang says. "They measure temporal, spatial, kinetic and inter-limb parameters at a frequency of 500 hertz." In other words, the insoles use complex algorithms to learn every aspect of a patient’s step for each foot, and sample quickly enough to capture data consistently. The sensors then predict corrective actions for the patient and their clinical team to take. "We have lots of data, and now we have to move into software development to make the most of it," he adds. "We found a problem and know how to solve it."
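To make the quoted parameters concrete, here is an illustrative sketch of how temporal and inter-limb gait measures could be extracted from 500 Hz insole force samples. The heel-strike detection and the symmetry measure below are assumptions for illustration, not SportSole’s actual algorithms.

```python
# Illustrative extraction of simple gait parameters from 500 Hz insole
# force samples, in the spirit of SportSole's temporal and inter-limb
# measures. Detection logic and names are assumptions.

SAMPLE_RATE_HZ = 500

def heel_strike_indices(force, threshold=0.5):
    """Indices where force rises through the threshold (heel strikes)."""
    return [i for i in range(1, len(force))
            if force[i - 1] < threshold <= force[i]]

def stride_times(force):
    """Seconds between consecutive heel strikes on one foot."""
    strikes = heel_strike_indices(force)
    return [(b - a) / SAMPLE_RATE_HZ for a, b in zip(strikes, strikes[1:])]

def symmetry_ratio(left_force, right_force):
    """Simple inter-limb measure: mean left stride time / mean right."""
    lt, rt = stride_times(left_force), stride_times(right_force)
    return (sum(lt) / len(lt)) / (sum(rt) / len(rt))
```

At 500 samples per second, each stride of roughly one second yields hundreds of data points per foot, which is why per-foot accuracy and left-right consistency, the problems Zhang mentions next, matter so much.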

Zhang "developed the whole system from zero to 100 percent" and is now working to improve the accuracy of the sensors, primarily in measuring data from the left and right feet as a system. He is also testing the insole with patients in conjunction with Columbia Medical Center, as well as developing a pediatric version of the insole and a version for runners.

Yufeng Zhang and Huanghe Zhang presenting at the summit. CREDIT: Shreya Parekh

Both orthotics build on the mechanical mobility research in Stevens’ Wearable Robotic Systems Laboratory. The lab, led by Zanotto, develops wearable robotic technologies to help people suffering from movement disorders.

A future driven by AI

Encouraging as these devices are, the responses to them at the LifeSciences Summit were even more encouraging. "Everyone had experience speaking with Alexa and other chat machines, but none of [the summit attendees] had experience with an app that spoke intelligently to them," Sista says. "Everyone was really excited by how different the experience was, and they were surprised AI could be used this way."

"People were very familiar with other commercial systems, but they were surprised and impressed by the accuracy of SportSole," Zhang says.

Still, there are challenges to overcome before releasing these devices to the public. "People tended to repeat themselves a lot when speaking with an AI application," Sista says. "They think they’re speaking with a real person, and they kept pushing the button rather than giving the application time to work. We had to teach them how to adjust to working with AI in an app."

Stevens is hoping to develop all of these technologies in light of connections made at the 2018 LifeSciences Summit.