Engineering professor uses machine learning to help people with paralysis regain independence

Zachary Danziger, an assistant professor in the Department of Biomedical Engineering

December 26, 2019 at 11:00am

Imagine a device that lets you communicate with a computer by recording your brain activity and decoding your intentions. This is known as a brain-computer interface: a machine learning system that translates brain patterns into instructions, so a personal computer, or even a powered wheelchair, can understand what task a person wants to achieve.

Machine learning is an application of artificial intelligence (AI) that gives systems the ability to automatically learn and improve from experience without being explicitly programmed. It focuses on developing computer programs that can access data and use it to learn for themselves.
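
As a toy illustration of that idea, the short Python sketch below, written against the widely used scikit-learn library with made-up data, fits a model from example inputs and outputs rather than from hand-written rules.

```python
# Toy machine learning example: the program is never told the rule
# y = 2x + 1; it infers the rule from example data.
from sklearn.linear_model import LinearRegression

X = [[0], [1], [2], [3]]  # example inputs
y = [1, 3, 5, 7]          # matching outputs (secretly y = 2x + 1)

model = LinearRegression().fit(X, y)  # "learn from experience"
print(model.predict([[10]]))          # ~[21.], applying a rule it was never given
```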

To help improve current brain-computer interfaces, Zachary Danziger, an assistant professor in the Department of Biomedical Engineering housed within the College of Engineering & Computing, was awarded a $1.6 million grant from the National Institutes of Health (NIH).

Danziger is creating a less invasive, more interactive model, called the joint angle brain-computer interface (jaBCI), to help improve brain-computer interfaces’ decoding algorithms before a device is ever implanted in the brain. The model is called the jaBCI because it is powered by people’s finger joint angles instead of neurons.
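
The article does not describe the jaBCI’s actual algorithms, but the idea can be sketched in a few lines of Python: the same kind of linear decoder that is normally fit to neural firing rates is instead fit to finger joint angles from a data glove. Every number and variable below is an illustrative assumption, not the lab’s implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for data-glove recordings: 500 samples of 14 finger joint
# angles (an assumed sensor count, purely illustrative).
joint_angles = rng.normal(size=(500, 14))

# Intended 2-D cursor velocities shown to the participant during a
# calibration session (synthetic here).
true_map = rng.normal(size=(14, 2))
cursor_velocity = joint_angles @ true_map + 0.1 * rng.normal(size=(500, 2))

# Fit a linear decoder by least squares, exactly as one would fit a
# decoder to neural firing rates; joint angles simply replace neurons.
decoder, *_ = np.linalg.lstsq(joint_angles, cursor_velocity, rcond=None)

# Decode a new glove sample into a cursor command.
new_sample = rng.normal(size=(1, 14))
print(new_sample @ decoder)  # estimated (vx, vy) for the cursor
```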

The system could help people who are paralyzed regain their independence and ability to communicate. Essentially, a brain-computer interface device sits inside a person’s brain and records the electrical activity of neurons, which is relayed to a computer. The computer then interprets those brain signals to determine what the person wants to do.
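
In rough pseudo-Python, the loop that paragraph describes, record activity, interpret it, act on it, might look like the sketch below; the channel count, decoder weights, and simulated data are placeholders, not the actual device.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed: 96 recording channels decoded into a 2-D movement intent.
decoder = rng.normal(size=(96, 2))

cursor = np.zeros(2)
for step in range(100):
    # 1. Record: firing rates from the implanted electrodes
    #    (simulated Poisson counts stand in for real neural data).
    firing_rates = rng.poisson(lam=5.0, size=96)

    # 2. Interpret: translate the brain signals into an intended velocity.
    velocity = firing_rates @ decoder * 1e-3

    # 3. Act: update the cursor (or wheelchair) with the decoded intent.
    cursor += velocity

print(cursor)
```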

“The cool thing about the jaBCI model is that it puts a neurotypical person in the position of having control and interacting directly with these computers trying to interpret their intentions,” Danziger says. “We are using that interaction to help optimize brain-computer interfaces.”

Danziger is using virtual reality hardware, like data gloves and body sensors, to let people interact directly with brain-computer interface algorithms. He is currently recruiting test subjects who don’t suffer from brain disorders to interact with computers in the Applied Neural Interfaces lab, which he supervises. He and his team will study the interactions between the participants and the computers to improve the brain decoding algorithms. The healthy volunteers are not the target users; rather, their sessions will create a foundation for determining how to implement the technology in the people it is meant to serve, such as those who experience paraplegia or a neurological disorder.

Once the model is deployed, scientists from around the world will be able to test the technology and use the model to optimize brain-computer interface algorithms. The model has drawn support from scientists because it simplifies the difficult process of recruiting human test participants: it skips the step of finding people who have had electrodes, which are thin, insulated wires, implanted in their brains. Today, such implants are limited to people with severe paralysis, like quadriplegia or advanced multiple sclerosis.

Danziger came up with the concept for the project while studying a game with human participants in his lab. The subjects wore virtual reality gloves and wiggled their fingers to move a cursor across a computer screen, in a test of how people perform complicated tasks. Danziger realized the setup was similar to a brain-computer interface and decided to build it into a new testing platform.

“I have always been interested in the philosophy of neuroscience, asking myself questions like, ‘How does the brain learn new and complicated tasks?’” Danziger says.

The NIH grant will support Danziger’s research by funding a postdoctoral researcher, graduate students, test participants, a laboratory technician and the purchase of virtual reality hardware. The grant will also fund a major collaboration between Danziger and Lee Miller, professor of physiology in the Department of Physical Medicine and Rehabilitation at Northwestern University. Miller is the co-investigator tasked with collecting brain data, which will then be studied at FIU to help construct the jaBCI model.

Danziger hopes the model will help researchers better understand how humans perform tasks, both simple and complicated, based on brain patterns and decoding algorithms. Another of his goals is a more efficient and useful platform for testing and improving current brain-computer interface algorithms without first implanting a device in someone’s brain. People who stand to benefit from the model could regain their mobility, communication and independence at a much faster rate than is possible today.