Showing its capabilities as a body surrogate, a PR2 controlled remotely by an individual with profound motor deficits picks up a cup in a research laboratory at the Georgia Institute of Technology. (Credit: Phillip Grice, Georgia Tech)

An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion. The web-based interface displays a “robot’s eye view” of the surroundings to help users interact with the world through the machine.

The system could help make sophisticated robots more useful to people who do not have experience operating complex robotic systems. Study participants interacted with the robot interface using standard assistive computer access technologies — such as eye trackers and head trackers — that they were already using to control their personal computers.
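The article does not describe how the interface accepts this input, but since these devices emulate a standard mouse, a plausible reading is that the interface needs nothing beyond ordinary single-click events. A minimal TypeScript sketch under that assumption (the element id and the `/api/look-at` endpoint are hypothetical, not part of the published system):

```typescript
// A minimal sketch, assuming the interface only requires standard
// single-click pointer events, so any assistive device that emulates a
// mouse (eye tracker, head tracker) works without special integration.
// The element id and endpoint below are hypothetical.

const view = document.getElementById("robot-camera-view") as HTMLElement;

view.addEventListener("click", (event: MouseEvent) => {
  // Normalize the click to view-relative coordinates in [0, 1].
  const rect = view.getBoundingClientRect();
  const x = (event.clientX - rect.left) / rect.width;
  const y = (event.clientY - rect.top) / rect.height;

  // Forward the click to the robot's web server; the actual wire
  // protocol used by the research system is not described in the article.
  fetch("/api/look-at", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ x, y }),
  });
});
```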

The two studies described below show how such “robotic body surrogates,” which can perform tasks similar to those of humans, could improve the quality of life for users. The work could provide a foundation for developing faster and more capable assistive robots.

The wheeled PR2 robot has 20 degrees of freedom, with two arms and a “head,” giving it the ability to manipulate objects such as water bottles, washcloths, hairbrushes and even an electric shaver.

In the first study, researchers made the PR2 available to a group of 15 participants with severe motor impairments. The participants learned to control the robot remotely, using their own assistive equipment to drive a mouse cursor through the interface and carry out a personal care task. Eighty percent of the participants were able to maneuver the robot to pick up a water bottle and bring it to the mouth of a mannequin.

In the second study, the researchers provided the PR2 and interface system to Henry Evans, a California man who has been helping Georgia Tech researchers study and improve assistive robotic systems since 2011. Evans, who has very limited control of his body, tested the robot in his home for seven days and not only completed assigned tasks but also devised novel uses that combined the operation of both robot arms at the same time, such as using one arm to hold a washcloth and the other to wield a brush.

The interface allowed Evans to care for himself in bed over an extended period of time. “The most helpful aspect of the interface system was that I could operate the robot completely independently, with only small head movements using an extremely intuitive graphical user interface,” Evans said.

The web-based interface shows users what the world looks like from cameras located in the robot’s head. Clickable controls overlaid on the view let users move the robot around a home or other environment and control its hands and arms. When users move the robot’s head, for instance, the screen displays the mouse cursor as a pair of eyeballs to show where the robot will look when the user clicks. Clicking on a disc surrounding the robotic hands allows users to select a motion. When the user drives the robot around a room, lines trailing the cursor on the interface indicate the direction the robot will travel.
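One way to picture this click-to-command mapping is a mode-based overlay, where the same click means different things depending on which control is active. The sketch below is illustrative only: the mode names, command format, and the `discSegmentAt` helper are assumptions, not the published system’s actual API.

```typescript
// Illustrative sketch of a mode-based overlay mapping clicks to robot
// commands. Modes, command names, and payloads are assumptions made for
// this example, not the research system's actual interface.

type Mode = "look" | "arm" | "drive";

interface RobotCommand {
  kind: string;
  payload: Record<string, number>;
}

let mode: Mode = "look";

function commandForClick(x: number, y: number): RobotCommand {
  switch (mode) {
    case "look":
      // Clicking the camera view points the robot's head (and cameras)
      // at the selected spot; the cursor is drawn as a pair of eyeballs.
      return { kind: "point-head", payload: { x, y } };
    case "arm":
      // Clicking a wedge of the disc drawn around a gripper selects one
      // of a fixed set of hand/arm motions.
      return { kind: "move-arm", payload: { segment: discSegmentAt(x, y) } };
    case "drive":
      // Clicking sets a drive target; a line trailing the cursor
      // previews the direction of travel before the click.
      return { kind: "drive", payload: { x, y } };
  }
}

// Hypothetical helper: returns which of eight wedges of the motion disc
// contains the point, assuming the disc is centered in the view.
function discSegmentAt(x: number, y: number): number {
  const cx = 0.5, cy = 0.5;
  const angle = Math.atan2(y - cy, x - cx);
  return Math.floor(((angle + Math.PI) / (2 * Math.PI)) * 8) % 8;
}
```

A design like this would keep the input vocabulary to a single click, which is consistent with the article’s point that eye trackers and head trackers emulating a mouse were sufficient to operate the robot.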
