In October 2010, the first all-robotic surgery reportedly took place at Montreal General Hospital in Canada. Intuitive Surgical’s da Vinci surgical robot worked in conjunction with the McSleepy anesthesia robot to perform a prostatectomy.

A prototype of a miniature robotic fly, about 0.04 inches (roughly one millimeter) in diameter.
Robotic surgery may be most closely associated with prostatectomies, but researchers around the world are working to develop an impressive range of medical applications for robots. The medical robots of the future will serve functions ranging from highly visible roles (such as a robotic scrub nurse) to invisible ones (such as delivering drugs directly to tumors inside the body).

Unfortunately, prototypes may be slow to progress from the research stage into commercial development. To ensure that concepts translate into clinically feasible products, robotics engineers might consider learning more about medicine, according to Nobuhiko Hata, PhD, associate professor and director of the Surgical Navigation and Robotics Laboratory at Brigham and Women’s Hospital and Harvard Medical School.

“Professional training specifically tailored to medical robotics, but with enough exposure to the reality of medicine, is highly needed,” said Dr. Hata. He cites nanotechnology, material science, chemistry, tissue engineering, and artificial organs as some of the areas he would like to study more in depth as he proceeds with his work.

While robots won’t displace human surgeons anytime soon, they are gaining recognition as potentially viable aids in the operating room and other medical applications. What follows is a sample of concepts in development that demonstrate the bright future of robots in medicine.

Miniature Robot Fly

Researchers at the Technion — Israel Institute of Technology have developed a prototype of a miniature robotic fly measuring about a millimeter in diameter and 14 millimeters long. It fits on the tip of a finger and is made of biocompatible silicone and metal. In the future, researchers hope it will be able to enter the body to diagnose diseases and deliver drugs directly to tumors.

The robot is based on micro-electromechanical systems (MEMS) technology, and its micro legs are steered either by an external magnetic force (a magnet moved over the body from outside) or through an on-board actuation system. Researchers plan to add a tiny camera to the robot so it can potentially be used in brachytherapy (short-distance radiation therapy), which is commonly used to treat head, neck, and prostate cancer. Another next step is to scale the robot down to a tenth of its current diameter, about 100 microns.

Cardiac Therapy

The HeartLander, a miniature mobile robot that delivers minimally invasive therapy to the surface of the beating heart, is under development at the Robotics Institute at Carnegie Mellon University. The crawling robot features two body sections, each 5 mm tall, 8 mm wide, and 10 mm long. Locomotion is made possible by a wire transmission that runs through the tether to offboard motors. A graphical interface shows the exact location of the robot on the heart; real-time location is measured using a miniature magnetic tracking sensor (microBIRD, Ascension Technology) mounted on the front body of the robot.

The robot can be driven using a joystick, or it can automatically walk to a specific target location on the heart. It is inserted into the body through a skin incision directly below the sternum, providing direct access to the heart without requiring deflation of the left lung. The surgeon then makes another incision in the pericardium (the sac that encloses the heart) and places the robot directly on the surface of the heart.
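In concept, that semi-autonomous mode reduces to a sense-and-step loop: read the tracked position, aim at the target, take a step, repeat. The Python sketch below is purely illustrative and is not Carnegie Mellon’s software; the tracker and drive objects and all of their methods are hypothetical stand-ins for the magnetic tracking sensor and the wire-driven locomotion described above.

```python
import math

def step_toward_target(tracker, drive, target, tolerance_mm=2.0):
    """One iteration of a hypothetical sense-and-step loop: read the
    robot's tracked position on the heart surface, aim at the target,
    and take one inchworm step. Returns True once within tolerance."""
    x, y = tracker.read_position()          # from the magnetic tracking sensor
    dx, dy = target[0] - x, target[1] - y
    if math.hypot(dx, dy) <= tolerance_mm:
        return True                         # close enough; stop crawling
    drive.set_heading(math.atan2(dy, dx))   # steer the front body toward the target
    drive.advance_one_step()                # wire transmission pulls the rear body up
    return False
```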

The HeartLander uses suction to adhere to the epicardial (outer) surface of the heart, a technique used by FDA-approved medical devices that stabilize the heart. The vacuum pressure is monitored and controlled by the computer.
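A minimal sketch of that kind of monitoring, assuming a simple bang-bang rule, might look like the following Python. The callbacks and kilopascal thresholds are placeholders, not the HeartLander’s actual control logic.

```python
def regulate_suction(read_pressure_kpa, pump_on, pump_off,
                     strong=-50.0, weak=-30.0):
    """Hypothetical bang-bang regulation of the suction gripper. Pressures
    are gauge values in kPa (more negative = stronger vacuum); both
    thresholds are placeholders, not published HeartLander values."""
    p = read_pressure_kpa()
    if p > weak:         # vacuum too weak: robot may detach from the heart
        pump_on()
    elif p < strong:     # vacuum too strong: back off to protect tissue
        pump_off()
```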

Purdue industrial engineering graduate student Mithun Jacob uses a prototype robotic scrub nurse with graduate student Yu-Ting Li. Researchers are developing a system that recognizes hand gestures to control the robot or tell a computer to display medical images of the patient during an operation. (Purdue University photo/Mark Simons)

Kinect Lends a ‘Hand’ to Robotic Surgery

Since surgeons need to review medical images and records during surgery, a robotic scrub nurse may someday increase operating room efficiency by recognizing hand gestures and calling up the specific images that the surgeon needs to view while operating. Vision-based hand gesture recognition technology could help reduce not only the length of surgeries but also the potential for infection, according to Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.

The system under development at Purdue uses a camera and specialized algorithms to recognize hand gestures as commands that instruct a computer or robot. Further research is needed to enable computers to understand the context in which gestures are made, so that they can discriminate between intended and unintended gestures.
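The overall flow can be sketched in a few lines of Python. This is not Purdue’s code: classify_gesture, in_command_zone, the confidence cutoff, and the command table are all hypothetical stand-ins for the real recognition algorithms and context model.

```python
# Hypothetical command table: gesture label -> image-browser action.
COMMANDS = {
    "swipe_left": "previous_image",
    "swipe_right": "next_image",
    "spread": "zoom_in",
    "pinch": "zoom_out",
}

def interpret(frame, classify_gesture, in_command_zone, min_confidence=0.85):
    """Map a recognized gesture to a display command, but only when the
    classifier is confident and the hands are raised into a designated
    command zone: a crude stand-in for real context modeling."""
    gesture, confidence = classify_gesture(frame)   # hypothetical recognizer
    if confidence < min_confidence or not in_command_zone(frame):
        return None                                 # treat as unintended movement
    return COMMANDS.get(gesture)
```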

A prototype robotic scrub nurse is currently working with faculty in the university’s School of Veterinary Medicine. Meanwhile, researchers hope to refine the algorithms to isolate the hands and apply “anthropometry” — predicting the position of the hands based on knowledge of where the surgeon’s head is. The tracking is achieved through a camera mounted over the screen used for visualization of images. The system uses a type of camera developed by Microsoft, called Kinect, that senses three-dimensional space. Eventually, the researchers plan to integrate voice recognition, but will continue to focus primarily on gesture recognition research.
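As a rough illustration of the anthropometry idea, assuming the Kinect reports the head position in meters, a hand search region might be derived like this (the offsets are invented for the example):

```python
def hand_search_region(head_xyz, reach_m=0.8):
    """Given the tracked head position (meters, in the camera's frame),
    return a coarse 3-D box where the hands are likely to appear. The
    offsets are rough anthropometric guesses for illustration only."""
    x, y, z = head_xyz
    return {
        "x": (x - reach_m, x + reach_m),   # arm's reach to either side
        "y": (y - 1.0, y + 0.1),           # roughly hip height up to the head
        "z": (z - reach_m, z),             # hands extend toward the camera
    }
```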

Incorporating a better sense of touch into robotic-assisted surgery systems is another application for Microsoft Kinect. Electrical engineering grad students at the University of Washington recently adapted this technology for use with surgical robots by writing code that directs the Kinect to map and react to environments in three dimensions. The system then sends spatial information about these environments back to the user. Such technology could provide surgeons with useful feedback about how far to move the tool; for example, the system could be programmed to define off-limits areas such as vital organs.
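One simple way to represent such off-limits areas is as boxes in the camera’s coordinate frame that the tool tip must not enter. The Python sketch below illustrates only the general idea; it is not the University of Washington code, and the zone format and apply_resistance callback are invented for the example.

```python
def in_forbidden_zone(tool_tip, zones):
    """True if the tracked tool tip (x, y, z in the camera frame) lies
    inside any off-limits box, e.g., one drawn around a vital organ."""
    x, y, z = tool_tip
    return any(xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
               for xmin, xmax, ymin, ymax, zmin, zmax in zones)

def guard_motion(tool_tip, zones, apply_resistance):
    """Trigger feedback when the tool enters a forbidden region;
    apply_resistance() stands in for whatever cue the system sends."""
    if in_forbidden_zone(tool_tip, zones):
        apply_resistance()
```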

Robots: Friends in Hostile Situations

PERL Research is working with the U.S. Army to develop the Dynamic Injury Severity Estimation System (DISE), an integrated system of intelligent software and sensors on a robot. (PERL Research)

A recent Georgia Institute of Technology study found that patients generally responded more positively to a robotic nurse’s touch when they believed the robot intended to clean their arm than when they believed it intended to comfort them. Robots may never be able to compete with humans in displays of compassion, but that same lack of humanity is what makes them so useful in other situations, particularly in the line of fire.

Robots can be especially valuable in hostile or high-risk environments that endanger humans, such as search and rescue operations or the battlefield. Because medics put themselves at risk whenever they are called on to assess an injured soldier on the battlefield, scientists at PERL Research (Huntsville, AL; www.perlresearch.com) were awarded a contract from the U.S. Army Medical Research and Materiel Command’s Telemedicine and Advanced Technology Research Center to develop a system that provides real-time, continuous estimation of a patient’s health status: the Dynamic Injury Severity Estimation System (DISE).

“Our assessment technology is based on thermographic sensing,” explained PERL Research senior scientist Paul Cox. “With our sensor suite, we are able to remotely perform Glasgow Coma Scale (GCS) and establish triage categories. The robotic triage system is an integrated system of intelligent software and sensors that optimally integrates a medic's remote assessment with an artificial intelligence decision algorithm to determine the status of the injured soldier remotely.”

The thermographic camera measures the person’s heart rate, respiration rate, and skin temperature. Also included are a spinal injury sensor and handheld triage computer that can determine injury severity based on the person’s vital signs. The system integrates the medic’s assessment (based on remote video monitoring and audio interaction) along with automated processing of the sensor data to determine the condition of the injured soldier.
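As a rough illustration of how such vital signs might map to triage categories, consider the following Python sketch. The thresholds are invented for the example and are not the rules DISE applies.

```python
def triage_category(heart_rate_bpm, resp_rate_bpm, skin_temp_c):
    """Assign a coarse triage category from remotely sensed vital signs.
    These thresholds are invented for illustration; they are not the
    rules DISE actually applies."""
    if resp_rate_bpm == 0:
        return "expectant"                    # no respiration detected
    if resp_rate_bpm > 30 or heart_rate_bpm > 120 or skin_temp_c < 34.0:
        return "immediate"                    # signs consistent with shock
    if resp_rate_bpm > 24 or heart_rate_bpm > 100:
        return "delayed"
    return "minimal"
```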

For initial capabilities (remote triage and GCS), the system could be used in the field in less than two years. For more detailed patient assessment capabilities, such as the detection of internal hemorrhaging, more research is needed. PERL has completed initial testing with subjects using a lower body negative pressure chamber to simulate blood loss that can occur when a limb is severed or severely damaged. Early results indicate a correlation between the parameters extracted from the thermographic camera and stroke volume data (the amount of blood being pumped with each heartbeat).
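Quantifying that kind of relationship typically comes down to a correlation coefficient over paired measurements. The short Python sketch below shows the calculation on made-up numbers; none of it is PERL’s data.

```python
from statistics import correlation  # Python 3.10+

# Made-up paired samples for illustration: a feature extracted from the
# thermographic camera and reference stroke volume (mL) recorded during
# the same lower-body-negative-pressure run. Not PERL's data.
thermal_feature  = [0.92, 0.88, 0.81, 0.74, 0.69, 0.63]
stroke_volume_ml = [78.0, 74.0, 66.0, 59.0, 55.0, 48.0]

r = correlation(thermal_feature, stroke_volume_ml)
print(f"Pearson r = {r:.2f}")  # near +1 would indicate a strong linear link
```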

Stroke volume is an indicator of hemorrhaging but is not practical to measure in the field with current devices. With further research, this robotic technology may someday aid in the early detection of internal hemorrhaging in soldiers on the battlefield.




This article first appeared in the May 2011 issue of Medical Design Briefs Magazine (Vol. 1 No. 4).
