That’s how users without control of their limbs would operate a robotic wheelchair being developed by Lafayette students. The chair takes input from a neuro-headset that records the opening and closing of the eyes along with brain wave patterns, and translates them into signals a computer can act on. A gyroscope in the headset detects head turns.
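The control idea described here can be sketched in a few lines of code. This is an illustrative assumption, not the Lafayette team's actual implementation: the function name `interpret`, the reading format, and the threshold values are all hypothetical.

```python
def interpret(reading):
    """Map one headset reading to a wheelchair command (illustrative sketch).

    reading is a dict with:
      'alpha'    - normalized alpha-wave power (rises when the eyes close)
      'gyro_yaw' - head rotation rate from the headset gyroscope (deg/s)
    """
    EYES_CLOSED_ALPHA = 0.5  # per-user threshold, set during calibration
    TURN_RATE = 30.0         # deg/s of head rotation that counts as a turn

    if reading['alpha'] > EYES_CLOSED_ALPHA:
        return 'stop'        # sustained eye closure halts the chair
    if reading['gyro_yaw'] > TURN_RATE:
        return 'turn_right'  # head turned to the right
    if reading['gyro_yaw'] < -TURN_RATE:
        return 'turn_left'   # head turned to the left
    return 'forward'         # default: keep moving
```

In a sketch like this, eye closure maps to stopping and head turns map to steering, so the only physical ability the system requires is moving the head.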

It’s the latest example of how students and faculty are collaborating to shift the engineering paradigm from solving technical problems to improving the human condition.

“We created this wheelchair because there are a lot of people with disabilities who can’t move about by themselves or use traditional wheelchairs,” says Brandon Smith ’17. “If they can’t use their arms, then they can’t really use any kind of wheelchair. If they don’t have the muscle strength, they can’t turn wheels or find the dexterity to use a joystick. This one uses the brain-computer interface, so all it requires is the ability to move their head. They don’t have to rely on another person to take them everywhere.”

Smith came to Lafayette determined to make a difference in the field of designing prosthetics.

“When I was in high school, I read an article for a class that talked about a prosthetic that actually took input from neurological signals,” he says. “It was something that I found very interesting because I like technology, but I also want to help people, and this was one way I could do that.”

Smith started working with Yih-Choung Yu, associate professor of electrical and computer engineering, on brain-computer interface technology after his first year. The collaboration continued every summer during his time at Lafayette.

Yu is continuing to refine the wheelchair in collaboration with Lisa Gabel, associate professor of psychology and neuroscience, along with students they recruited to help make design improvements.

Current challenges involve regulating the wheelchair’s speed and installing a sensor as a safeguard for when the user fails to notice an obstacle, such as a nearby wall or another immovable object.

“Right now we’re using all of the options to operate the wheelchair,” Yu says. “But we must look at what we should use as an emergency stop, considering users don’t have their fingers or hands to quickly push a lever or button.”
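One way such a safeguard could work, assuming a forward-facing distance sensor, is to override the user's command whenever the chair gets too close to an obstacle. The `safeguard` function and the 0.5-meter cutoff below are hypothetical, offered only to illustrate the idea:

```python
STOP_DISTANCE_M = 0.5  # assumed minimum safe clearance, for illustration

def safeguard(command, distance_m):
    """Override the user's command if an obstacle is too close.

    command: the command derived from the headset (e.g. 'forward')
    distance_m: reading from a forward-facing distance sensor, in meters
    """
    if command == 'forward' and distance_m < STOP_DISTANCE_M:
        return 'stop'  # automatic emergency stop near an obstacle
    return command     # otherwise pass the user's command through
```

Because the stop is triggered by the sensor rather than by the user, it works even when the user cannot reach a lever or button.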

Determining the threshold of the user’s brain signals is another key consideration.

“This operation requires some sort of setup to determine the strength of your brain signal,” he says. “Your eye blinking, that’s an eye closure. So when you close your eyes, there’s an alpha wave in the back of your head that goes up. How high does that go? Everyone is quite different.”

Because typing in numbers to customize the brain-interface setup is inconvenient, software is under development that lets a user drag a bar up and down to make adjustments.
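The calibration idea might look something like the sketch below: record alpha-wave power with the eyes open and with the eyes closed, place the threshold between the two levels, and let a slider-style control shift it. The function name, sample format, and numbers are assumptions for illustration, not the team's software.

```python
def calibrate(eyes_open, eyes_closed, slider=0.5):
    """Return an eye-closure threshold between two measured alpha levels.

    eyes_open, eyes_closed: lists of alpha-power samples recorded during
    a calibration session. slider: a 0.0-1.0 adjustment standing in for
    the drag bar; 0.0 puts the threshold at the eyes-open level, 1.0 at
    the eyes-closed level.
    """
    lo = sum(eyes_open) / len(eyes_open)      # average eyes-open power
    hi = sum(eyes_closed) / len(eyes_closed)  # average eyes-closed power
    return lo + slider * (hi - lo)
```

Because, as Yu notes, the strength of the alpha response differs from person to person, the slider lets each user tune the threshold without entering numbers.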

“It’s not just research to show it works, to prove the concept; you want to make it a real, useful thing,” Yu adds. “There are many little things we have to put into consideration so it can be useful. Just like when you buy a product, you also have to consider whether it’s easy to use, like your cell phone. How easy is it to navigate your interface?”

Smith, an avionics software engineer for Rockwell Collins, hopes he can someday work in the biotech industry to develop products that can help people achieve greater levels of independence and mobility.

“For those who would eventually use the wheelchair, assistance will still be required from caregivers,” he says. “But the hope is that it’s a useful change.”