Purdue industrial engineering graduate student Mithun Jacob uses a prototype robotic scrub nurse with graduate student Yu-Ting Li. Researchers are developing a system that recognizes hand gestures to control the robot or tell a computer to display medical images of the patient during an operation. (Purdue University photo/Mark Simons)

Kinect Lends a ‘Hand’ to Robotic Surgery

Surgeons routinely need to review medical images and records in the middle of an operation. A robotic scrub nurse may someday increase operating room efficiency by recognizing hand gestures and calling up the specific images the surgeon needs to view while operating. Because it keeps the surgeon’s hands off unsterile keyboards and controls, vision-based hand gesture recognition technology could help reduce not only the length of surgeries but also the potential for infection, according to Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.

The system under development at Purdue uses a camera and specialized algorithms to recognize hand gestures as commands that instruct a computer or robot. Further research is needed to enable computers to understand the context in which gestures are made, so that they can discriminate between intended and unintended gestures.
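
To make the idea concrete, here is a minimal Python sketch of how recognized gestures might be mapped to display commands, with a simple confidence gate standing in for the context awareness the researchers describe. The gesture labels, command names, and threshold are illustrative assumptions, not Purdue’s actual gesture vocabulary or algorithm.

```python
# Hypothetical sketch: mapping recognized gestures to imaging commands,
# with a confidence gate to reject motions that were likely unintended.
# Labels, commands, and the threshold are placeholder assumptions.

GESTURE_COMMANDS = {
    "swipe_left":  "previous_image",
    "swipe_right": "next_image",
    "spread":      "zoom_in",
    "pinch":       "zoom_out",
    "circle":      "rotate_image",
}

CONFIDENCE_THRESHOLD = 0.85  # below this, treat the motion as incidental

def dispatch(gesture: str, confidence: float) -> str | None:
    """Translate a recognized gesture into a display command, or ignore it."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # likely an unintended hand movement
    return GESTURE_COMMANDS.get(gesture)

# A confidently recognized swipe advances the display; a weak one is ignored.
assert dispatch("swipe_right", 0.93) == "next_image"
assert dispatch("swipe_right", 0.40) is None
```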

A prototype robotic scrub nurse is currently being tested with faculty in the university’s School of Veterinary Medicine. Meanwhile, researchers hope to refine the algorithms that isolate the hands and to apply “anthropometry,” predicting the position of the hands based on knowledge of where the surgeon’s head is. Tracking is achieved with a camera mounted over the screen used to display the images. The system uses a type of camera developed by Microsoft, called Kinect, that senses three-dimensional space. Eventually, the researchers plan to integrate voice recognition, but they will continue to focus primarily on gesture recognition.
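
The anthropometry step could look something like the following sketch, which uses a tracked head position (such as one reported by a Kinect skeleton) to bound the region in which hands are searched for. The coordinate conventions and reach offsets are placeholder assumptions, not the Purdue implementation.

```python
# Illustrative sketch of the "anthropometry" idea: use the tracked head
# position to bound the 3D region where hand detections are plausible.
# All offsets below are made-up placeholder values.

from dataclasses import dataclass

@dataclass
class Point3D:
    x: float  # meters, in camera coordinates
    y: float
    z: float

def hand_search_box(head: Point3D, arm_reach: float = 0.8):
    """Return (min_corner, max_corner) of a box where hands are plausible,
    derived from the head position and an assumed arm reach."""
    lo = Point3D(head.x - arm_reach, head.y - arm_reach, head.z - arm_reach / 2)
    hi = Point3D(head.x + arm_reach, head.y + 0.2, head.z + arm_reach / 2)
    return lo, hi

def in_box(p: Point3D, box) -> bool:
    """Discard candidate hand detections that fall outside the box."""
    lo, hi = box
    return lo.x <= p.x <= hi.x and lo.y <= p.y <= hi.y and lo.z <= p.z <= hi.z
```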

Incorporating a better sense of touch into robotic-assisted surgery systems is another application for the Microsoft Kinect. Electrical engineering graduate students at the University of Washington recently adapted the technology for use with surgical robots by writing code that directs the Kinect to map and react to environments in three dimensions. The system then sends spatial information about these environments back to the user. Such technology could give surgeons useful feedback about how far to move a tool; for example, the system could be programmed to define off-limits areas, such as vital organs, that instruments must not enter.
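
One way to picture the off-limits concept is to model protected anatomy as simple spheres and classify the tool tip’s position against them, as in the sketch below. The geometry, zone list, and safety margin are invented for illustration and are not the University of Washington code.

```python
# Minimal sketch of an "off-limits area" check: warn as the tool tip
# approaches a protected zone and block it once inside. Illustrative only.

import math

# Each zone is (center_x, center_y, center_z, radius), all in meters.
OFF_LIMITS_ZONES = [
    (0.10, 0.05, 0.30, 0.04),  # e.g., a sphere enclosing a vital organ
]

SAFETY_MARGIN = 0.01  # start pushing back 1 cm before the boundary

def check_tool(x: float, y: float, z: float) -> str:
    """Classify the tool tip position relative to the forbidden zones."""
    for cx, cy, cz, r in OFF_LIMITS_ZONES:
        d = math.dist((x, y, z), (cx, cy, cz))
        if d < r:
            return "inside_zone"       # block or retract the tool
        if d < r + SAFETY_MARGIN:
            return "approaching_zone"  # resist motion / alert the surgeon
    return "clear"
```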

Robots: Friends in Hostile Situations

PERL Research is working with the U.S. Army to develop the Dynamic Injury Severity Estimation System (DISE), an integrated system of intelligent software and sensors on a robot. (PERL Research)

A recent Georgia Institute of Technology study found that patients generally responded more positively to a robotic nurse’s touch when they believed the robot intended to clean their arm than when they believed the robot intended to comfort them. While robots may never compete with humans in displays of compassion, it is this same lack of humanity that makes them so useful in other situations, particularly in the line of fire.

Robots can be especially valuable in hostile or high-risk environments that endanger humans, such as search-and-rescue operations or the battlefield. Because medics become vulnerable whenever they are called upon to assess an injured soldier on the battlefield, scientists at PERL Research (Huntsville, AL; www.perlresearch.com) were awarded a contract from the U.S. Army Medical Research and Materiel Command’s Telemedicine and Advanced Technology Research Center to develop a system that provides real-time, continuous estimation of a patient’s health status: the Dynamic Injury Severity Estimation System (DISE).

“Our assessment technology is based on thermographic sensing,” explained PERL Research senior scientist Paul Cox. “With our sensor suite, we are able to remotely perform Glasgow Coma Scale (GCS) and establish triage categories. The robotic triage system is an integrated system of intelligent software and sensors that optimally integrates a medic’s remote assessment with an artificial intelligence decision algorithm to determine the status of the injured soldier remotely.”

The thermographic camera measures the person’s heart rate, respiration rate, and skin temperature. Also included are a spinal injury sensor and a handheld triage computer that can determine injury severity based on the person’s vital signs. The system integrates the medic’s assessment (based on remote video monitoring and audio interaction) with automated processing of the sensor data to determine the condition of the injured soldier.
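
As a rough illustration of how remotely sensed vitals might map to triage categories, the toy classifier below applies common field-triage heuristics. The thresholds are assumptions, and this is not the DISE decision algorithm, which also fuses in the medic’s remote assessment.

```python
# Illustrative only: a toy rule-based triage classifier driven by the
# kinds of vitals the article mentions. Thresholds loosely follow common
# field-triage heuristics and are NOT PERL's DISE algorithm.

def triage_category(heart_rate: float, resp_rate: float, responsive: bool) -> str:
    """Return a coarse triage category from remotely sensed vital signs."""
    if resp_rate == 0:
        return "expectant"                      # no respiration detected
    if resp_rate > 30 or heart_rate > 120 or not responsive:
        return "immediate"                      # unstable: treat first
    return "delayed"                            # stable enough to wait

print(triage_category(heart_rate=135, resp_rate=28, responsive=False))  # immediate
```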

For initial capabilities (remote triage and GCS), the system could be used in the field in less than two years; more detailed patient assessment capabilities, such as the detection of internal hemorrhaging, will require further research. PERL has completed initial testing with subjects, using a lower-body negative-pressure chamber to simulate the blood loss that can occur when a limb is severed or severely damaged. Early results indicate a correlation between parameters extracted from the thermographic camera and stroke volume data (the amount of blood pumped with each heartbeat).
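
The analysis described amounts to computing a correlation between a thermographically derived feature and measured stroke volume. A minimal sketch with placeholder data (not PERL’s results) follows.

```python
# Sketch of the analysis described: correlate a feature extracted from
# thermographic video with measured stroke volume. Placeholder data only.

import numpy as np

thermal_feature = np.array([0.92, 0.88, 0.81, 0.74, 0.69, 0.61])  # arbitrary units
stroke_volume   = np.array([78.0, 74.0, 66.0, 59.0, 55.0, 48.0])  # mL per beat

# Pearson correlation coefficient between the two series
r = np.corrcoef(thermal_feature, stroke_volume)[0, 1]
print(f"Pearson r = {r:.3f}")  # values near 1.0 suggest a strong linear relationship
```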

Stroke volume is an indicator of hemorrhaging but is not practical to measure in the field with current devices. With further research, this robotic technology may someday aid in the early detection of internal hemorrhaging in soldiers on the battlefield.