The Human-Machine Interaction & Innovation (HMI2) Lab at Santa Clara University is creating a versatile intelligent robot. (Credit: Clearpath Robotics)

The evolution of robotics and artificial intelligence (AI) has opened the door to the development of advanced, sophisticated assistive robots capable of adapting to dynamic environments. These robots may not only provide personalized assistance but also have the ability to learn from their interactions. Assistive robots can now be designed to support individuals with disabilities or other limitations in performing daily activities and could enhance independence, mobility and overall quality of life.

Developing assistive robots is a challenging research area, especially when integrating these systems into human environments such as homes and hospitals. To tackle these challenges, the Human-Machine Interaction & Innovation (HMI2) Lab at Santa Clara University is creating a versatile intelligent robot.

Alex: The Multi-Tasking Assistive Robot

A dynamic hospital environment simulated in Gazebo. (Credit: Clearpath Robotics)

The assistive robot, named ‘Alex’, is built on the Ridgeback, Clearpath Robotics’ versatile indoor omnidirectional platform. The Ridgeback uses its omni-drive to move manipulators and heavy payloads with ease. In this case, a Franka arm is mounted on top of the mobile base so the robot can guide people through crowded environments such as hospitals. The system also includes two cameras.

An Intel RealSense D405 camera is mounted on the end-effector of the robotic arm and is used to recognize objects in the environment and refine grasp points. A second camera, an Intel RealSense D455, is mounted on the mobile base to recognize objects in the environment and improve navigation.
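How a depth camera turns a detected object into a graspable target can be sketched with the standard pinhole camera model: a pixel plus its measured depth is back-projected into a 3D point in the camera frame. The intrinsics below (fx, fy, cx, cy) are placeholder values for illustration, not the actual calibration of the lab's cameras.

```python
# Sketch: back-projecting a depth-camera pixel into a 3D point (camera frame)
# using the pinhole model. Intrinsics here are invented placeholder values.

def pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth `depth_m` (meters) to camera-frame XYZ."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the image center maps straight down the optical axis.
point = pixel_to_point(320, 240, 0.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point)  # (0.0, 0.0, 0.5)
```

A grasp planner would transform such camera-frame points into the arm's base frame before commanding the end-effector.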

Adaptive AI At Its Best

As of now, the team has tested the robot in two very different but equally dynamic environments to showcase the robot’s versatility. They focused on two main tasks: walking assistance in crowded areas such as hospitals, and collaborative cooking at home.

For the first task, Alex adeptly guides patients through a crowded hospital corridor while avoiding collisions. To accomplish this, the team developed a framework based on deep reinforcement learning that facilitates co-navigation between patients and the robot with both static and dynamic obstacle avoidance. The framework is implemented and evaluated in Gazebo, and both the framework and the simulated environment are built on the open-source Robot Operating System (ROS).
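The core idea of reinforcement-learning navigation, learning a policy from trial-and-error rewards rather than hand-coded rules, can be illustrated with a toy example. The team's actual framework uses deep RL trained in Gazebo on ROS; the tabular Q-learning agent, grid world, rewards, and hyperparameters below are invented purely for illustration.

```python
# Sketch: tabular Q-learning on a small grid with one static obstacle,
# a simplified stand-in for learned obstacle-avoiding navigation.
import random

GRID = 5
START, GOAL, OBSTACLE = (0, 0), (4, 4), (2, 2)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(state, action):
    r, c = state
    nr = max(0, min(GRID - 1, r + action[0]))
    nc = max(0, min(GRID - 1, c + action[1]))
    nxt = (nr, nc)
    if nxt == OBSTACLE:
        return state, -10.0, False   # bumping the obstacle: penalty, stay put
    if nxt == GOAL:
        return nxt, 10.0, True       # reached the goal
    return nxt, -1.0, False          # ordinary move costs one step

def train(episodes=2000, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {}
    for _ in range(episodes):
        state, done = START, False
        for _ in range(100):
            if done:
                break
            if rng.random() < epsilon:           # explore
                a = rng.randrange(4)
            else:                                # exploit current estimate
                a = max(range(4), key=lambda i: q.get((state, i), 0.0))
            nxt, reward, done = step(state, ACTIONS[a])
            best_next = max(q.get((nxt, i), 0.0) for i in range(4))
            old = q.get((state, a), 0.0)
            q[(state, a)] = old + alpha * (reward + gamma * best_next - old)
            state = nxt
    return q

def greedy_path(q):
    state, path = START, [START]
    for _ in range(2 * GRID * GRID):
        a = max(range(4), key=lambda i: q.get((state, i), 0.0))
        state, _, done = step(state, ACTIONS[a])
        path.append(state)
        if done:
            break
    return path

q = train()
path = greedy_path(q)
print(path[-1] == GOAL)  # the learned greedy policy reaches the goal
```

A deep-RL system replaces the Q-table with a neural network and the grid with continuous sensor observations, but the learning loop follows the same reward-driven pattern.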

Alex is built on the Ridgeback, Clearpath Robotics’ versatile indoor omnidirectional platform. (Credit: Clearpath Robotics)

In the second task, Alex demonstrates the ability to comprehend spoken instructions from humans and fetch the items needed for collaborative cooking. Spoken language is a natural way for humans to communicate with their companions, and with recent breakthroughs in Natural Language Processing (NLP) such as ChatGPT, humans and robots can now hold natural, unstructured conversations. Leveraging these tools, the team introduced a framework called Speech2Action that takes spoken language as input and automatically generates robot actions from the spoken instructions for a cooking task.
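The mapping at the heart of such a pipeline, from a transcribed utterance to a discrete robot action, can be sketched in miniature. The article indicates Speech2Action leverages modern NLP models; the rule-based parser below is a deliberately simplified stand-in, and its verb and object vocabularies are invented for illustration.

```python
# Sketch: mapping a transcribed cooking instruction to an (action, object)
# pair. A real system would use an NLP model; this keyword parser only
# illustrates the input-to-action structure. Vocabularies are hypothetical.

VERBS = {"bring": "fetch", "fetch": "fetch", "get": "fetch",
         "pass": "handover", "hand": "handover"}
OBJECTS = {"salt", "pepper", "bowl", "spoon", "knife"}

def speech_to_action(utterance):
    """Return (action, object) for a recognized instruction, or None."""
    words = utterance.lower().replace(",", "").split()
    action = next((VERBS[w] for w in words if w in VERBS), None)
    target = next((w for w in words if w in OBJECTS), None)
    return (action, target) if action and target else None

print(speech_to_action("Alex, please bring me the salt"))  # ('fetch', 'salt')
print(speech_to_action("hand me that spoon"))              # ('handover', 'spoon')
```

In a full system, the resulting action tuple would be dispatched to navigation and manipulation planners that locate and retrieve the requested item.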

The team conducted a study with 30 participants to evaluate the framework. Participants issued both structured and unstructured commands in a collaborative cooking task. The study examined how robot errors influence participants’ perception of the robot’s utility and shape their overall view of these interactions. Contrary to expectations, the majority of participants favored an unscripted narrative, preferring robots that engage in spontaneous conversation and meaningful dialogue. This fosters a more human-like and adaptable exchange between humans and robots.

The successful implementation of these tasks not only demonstrates the robot’s remarkable versatility but also paves the way for new possibilities in the field of human-robot interaction and assistive robotics.

The team chose Ridgeback for its reliability and Clearpath’s extensive ROS support.

A Bright Future for Alex

The team plans to extend their frameworks to other tasks such as robotic assistance with getting ready for work. Furthermore, they are interested in developing solutions for people who are blind or visually impaired. They plan to evaluate their systems with end-users as it is important for the team to develop inclusive robots.

The team members involved in this project are Maria Kyrarini (Assistant Professor), Krishna Kodur (PhD student), Mazizheh Zand (PhD student), Aly Khater (MS student), Sofia Nedorosleva (MS student), Matthew Tognotti (undergraduate student), Aidan O’Hare (undergraduate student), and Julia Lang (undergraduate student).

This article was written by Sophia Munir, Clearpath Robotics. For more information, contact Maria Kyrarini or visit here. A video of the technology is available here.