A new wearable system from the Massachusetts Institute of Technology will help blind users navigate through indoor environments.

Paul Parravano uses a combination of a white cane, third-party GPS alerts, and an electronic Braille display to navigate through the streets near his campus, the Massachusetts Institute of Technology.

When the MIT advisor moves from the outdoors to, say, the Marriott Hotel in Cambridge’s Kendall Square, however, he frequently must ask someone in the lobby to help him find a place to sit.

Researchers at the university’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a wearable technology to address a challenge that many blind people like Parravano face: detecting potential obstacles in indoor environments. The prototype system’s combination of imaging technology, vibrating motors, and a Braille display can alert wearers to nearby objects, including empty chairs.

The Wearable System

The components of the prototype system. (Image Credit: MIT)

GPS works well enough outdoors, but indoor spaces such as hotels or airports feature unpredictable obstructions that navigation technology cannot address.

“In my usual commute, I know where the stairs are. But if I travel, some of those obstacles are important to know,” said Parravano, co-director of MIT’s Office of Government and Community Relations. “If you’re walking through the airport, you’re not always going to know where the stairs are, and they can sneak up on you.”

To provide richer feedback for blind users, Daniela Rus, professor and director of CSAIL, and her students conceived and built a wearable technology: a combination of a 3D camera, a vibrating belt, and a Braille feedback device. The team presented their work last week at the International Conference on Robotics and Automation in Singapore.

The camera, worn like a pendant, uses object-recognition algorithms to spot items and communicate their presence through a belt worn on the abdomen. The belt carries five evenly spaced vibrating motors, and a reconfigurable Braille interface is attached at the user’s side.

Using a repository of millions of hand-labeled images, the computer vision algorithm is trained to recognize objects of interest – a chair, a couch, or a tree, for example – and even to tell whether a chair is empty.

To identify an empty chair, the algorithm uses pixel information to detect a surface, then checks whether that surface is parallel to the ground and falls within a prescribed range of heights.
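The paper’s exact computation isn’t spelled out here, but the check described above can be sketched roughly as follows. The inputs (a fitted surface normal and its height above the floor), the seat-height range, and the tilt tolerance are all illustrative assumptions, not the team’s actual parameters.

    import numpy as np

    # Illustrative sketch of the "empty chair" check described above. The inputs
    # (a fitted surface normal and its height above the floor) and the thresholds
    # are assumptions for illustration, not the values used in the MIT prototype.

    SEAT_HEIGHT_RANGE_M = (0.35, 0.60)  # plausible seat heights, in meters
    MAX_TILT_DEG = 10.0                 # how far from horizontal the surface may tilt

    def looks_like_empty_seat(surface_normal, surface_height_m):
        """Return True if a detected surface is roughly parallel to the ground
        and sits within a seat-like range of heights."""
        normal = np.asarray(surface_normal, dtype=float)
        normal /= np.linalg.norm(normal)

        # Angle between the surface normal and the vertical axis (0, 0, 1).
        tilt_deg = np.degrees(np.arccos(abs(normal @ np.array([0.0, 0.0, 1.0]))))

        low, high = SEAT_HEIGHT_RANGE_M
        return tilt_deg <= MAX_TILT_DEG and low <= surface_height_m <= high

    # Example: a nearly level surface 45 cm off the floor passes the check.
    print(looks_like_empty_seat((0.02, -0.01, 0.99), 0.45))  # True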

“The system can actually decode [the objects’] location and communicate that information to the user,” said Rus. “This is something that you would never be able to do with a walking stick.”

Finding a Way Through the Maze

To test the research prototype, Parravano and other visually impaired subjects walked through a cardboard maze set up by the MIT engineers; participants navigated first with a walking cane and then with the wearable technology. The vibrating motors – located to the left of, at, and to the right of the user’s belt buckle – indicated the presence of obstacles.

Walking directly toward a cardboard wall activated the center motor; a wall to the right similarly set off the right motor. The vibrations can also be configured to vary in intensity, frequency, and duration.
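As a rough illustration of that mapping – using the five-motor belt described earlier, and with the camera’s field of view, the distance cutoff, and the intensity scaling all chosen arbitrarily here rather than taken from the MIT prototype – the logic might look something like this:

    # Rough sketch of mapping an obstacle's direction and distance to belt feedback.
    # The 120-degree field of view, the 3-meter cutoff, and the intensity curve are
    # arbitrary illustrative choices, not the values used in the MIT prototype.

    NUM_MOTORS = 5             # evenly spaced across the front of the belt
    FIELD_OF_VIEW_DEG = 120.0  # assumed horizontal coverage of the depth camera
    MAX_RANGE_M = 3.0          # obstacles farther than this produce no vibration

    def belt_feedback(bearing_deg, distance_m):
        """Pick which motor to fire (0 = far left, 4 = far right) and how strongly
        (0.0-1.0) for an obstacle at the given bearing and distance."""
        if distance_m > MAX_RANGE_M:
            return None  # nothing close enough to report

        # Map the bearing (-60 to +60 degrees) onto a motor index.
        half_fov = FIELD_OF_VIEW_DEG / 2.0
        clamped = max(-half_fov, min(half_fov, bearing_deg))
        motor = int((clamped + half_fov) / FIELD_OF_VIEW_DEG * (NUM_MOTORS - 1) + 0.5)

        # Closer obstacles vibrate harder.
        intensity = 1.0 - distance_m / MAX_RANGE_M
        return motor, round(intensity, 2)

    print(belt_feedback(0.0, 0.5))   # (2, 0.83) -> center motor, strong vibration
    print(belt_feedback(45.0, 2.5))  # (4, 0.17) -> right motor, gentle vibration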

To get through the maze, a wearer had to turn until he or she found free space and felt no vibrations – a unique idea, but one that took some adjustment, according to Parravano.

“It took me a while to get used to that, but I got proficient at being able to walk around in that space essentially without my cane and just being able to get information from the vibration motors telling me that there was an opening,” Parravano said.

According to Rus, an additional test showed that blind participants had an 80-percent success rate when attempting to identify an empty chair in the space. In chair-finding mode, the system sends a “double pulse” that indicates the direction in which a chair with a vacant seat can be found.
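The chair-finding cue itself is easy to picture in code. As a minimal sketch – assuming a hypothetical motor-driver object with a vibrate(motor, strength, seconds) method, which is not something the researchers describe – it could be little more than:

    import time

    def double_pulse(driver, motor, strength=0.8, pulse_s=0.15, gap_s=0.15):
        """Fire two short bursts on one motor to signal 'empty chair this way'.
        The driver object and its vibrate() method are hypothetical stand-ins."""
        for _ in range(2):
            driver.vibrate(motor, strength, pulse_s)
            time.sleep(gap_s)

Presumably the double pulse rides on the same motors as the obstacle alerts, distinguished only by its rhythm.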

The system, said Rus, overcomes the limitations of assistive tools like walking canes and audio alerts from a GPS system. Canes often collide with passersby and cannot always tell a user the specific nature of an obstacle – whether a chair is empty or occupied, for instance. GPS alerts, meanwhile, are often unhelpful in busy, loud environments.

“The bottom line is it’s not desirable for the system to talk to the user and say, ‘There’s a chair; it’s ten meters in front of you,’” said Rus. “You can display that same information on this Braille buckle by programming the pins to essentially narrate the way the world looks.”

The electronic feedback device displays Braille lettering through raised and lowered pins, giving users abbreviated messages about nearby obstacles: perhaps “C” for chair, or “T” for table.
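A minimal sketch of that lookup might pair object labels with single letters and letters with six-dot cell patterns. The label-to-letter table below simply reuses the article’s examples, and the dot numbering follows standard six-dot Braille; the function name and data structures are illustrative, not part of the MIT system.

    # Sketch of driving the refreshable Braille display: map a detected object to a
    # single-letter abbreviation, then to the six-dot cell pattern whose pins should
    # be raised. The object-to-letter table reuses the article's own examples.

    BRAILLE_DOTS = {       # standard six-dot Braille, dots numbered 1-6
        "c": {1, 4},       # "C" for chair
        "t": {2, 3, 4, 5}, # "T" for table
    }

    OBJECT_ABBREVIATIONS = {"chair": "c", "table": "t"}

    def pins_for_object(label):
        """Return the set of pins (1-6) to raise for a detected object, or an
        empty set if the object has no abbreviation on the display."""
        letter = OBJECT_ABBREVIATIONS.get(label)
        return BRAILLE_DOTS[letter] if letter else set()

    print(pins_for_object("chair"))  # {1, 4}
    print(pins_for_object("table"))  # {2, 3, 4, 5}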

From Prototype to Product

In a follow-up paper, the researchers will demonstrate a version of the system augmented with laser scanners – a sensor arrangement similar to the setup of autonomous cars.

Rus and the team will next work to turn the research prototype into a more refined product. Parravano sees potential in the concept, especially once the hardware is redesigned into a more user-friendly, human-scale wearable form.

“I think we’d all agree that there’s lots of promise and possibility that a product like this might be very useful for a person who’s trying to be as independent as possible,” said Parravano.

Rus took the system to Milan in October 2016 for trials and said reactions to it were very enthusiastic.

“It was so extraordinary to see how people would put the system on and, without touching any walls, would be able to walk through cardboard mazes we made,” Rus said. One blind man approached her at the end of the demo.

“He said to me, ‘Now I just want to keep it and I want to go to Piazza del Duomo, find an empty bench, and sit down and feed the pigeons.’”

The tests took place at MIT’s Computer Science and Artificial Intelligence Laboratory. The research was conducted by Robert Katzschmann, a graduate student in mechanical engineering at MIT; fellow first author Hsueh-Cheng Wang, a former MIT postdoc and current assistant professor of electrical and computer engineering at National Chiao Tung University in Taiwan; Santani Teng, a postdoc at CSAIL; Brandon Araki, a graduate student in mechanical engineering; and Laura Giarré, a professor of electrical engineering at the University of Modena and Reggio Emilia in Italy.

What Do You Think?

Will this wearable system become a mainstream assistive technology for the visually impaired? Write your comments below.
