Researchers at Purdue University and the Indiana University School of Medicine are developing an "augmented reality telementoring" system to provide effective support to surgeons on the battlefield from specialists located thousands of miles away.

In telementoring, a surgeon performing an operation receives guidance remotely from an expert using telecommunications. However, current systems require the surgeon to shift focus to a nearby apparatus called a "telestrator," diverting attention from the operating table, said Juan Wachs, an associate professor of industrial engineering at Purdue.

"Telementoring is widely used, but it's still primitive in that it has not kept pace with advances in information technology and computer graphics," he said. "It is usually done with a telestrator displaying a video of the surgery overlaid with graphical annotations, which requires the surgeon to look away from the operating table while receiving mentoring advice."

The new System for Telementoring with Augmented Reality (STAR) harnesses technologies such as transparent displays and sensors to improve the quality of communication between mentor and surgeon.

"It is a step toward overcoming current limitations in telementoring by using so-called augmented reality to enhance the sense of 'co-presence,' " Wachs said.

The system integrates the mentor's annotations and illustrations directly into the field of view of the surgeon using a transparent display, eliminating the need to shift focus away from the operation, said Voicu S. Popescu, a Purdue associate professor of computer science. The transparent display is placed between the surgeon and the operating field.

"The surgeon sees the operating field, the instruments, and their hands as if the display were not there, yet the operating field is enhanced with the mentor's graphical annotations," he said.

Researchers have been evaluating the system, and findings are detailed in a paper that appeared online earlier this year in The Visual Computer. The paper will also appear in an upcoming print issue of the journal.

"Optimal surgery and trauma treatment integrates different surgical skills frequently unavailable in rural and field hospitals, and telementoring can provide the missing expertise," said Gerardo Gomez, a professor of surgery and director of the Division of Trauma, Surgical Critical Care and Emergency Surgical Services at the IU School of Medicine. "Two primary applications are surgeries in the battlefield and in rural regions where specialists might not be available."

The researchers tested the STAR platform in a simulated "Role 2" military medical facility context, which consists of a forward surgical team with limited resources.

"STAR may be useful in providing instructional simulations to support doctors serving in Iraq and Afghanistan in traumatic care on the battlefield," said Brian Mullis, an associate professor of orthopedic surgery at the IU School of Medicine and chief of Orthopaedic Trauma at Eskenazi Health in Indianapolis. Mullis is a Commander in the U.S. Navy who has served in combat in Afghanistan.

Augmented reality refers to enhancing the visual perception of the real world with instructive annotations, such as a line showing the surgeon where to make an incision, virtual instruments, and icons for hand gestures that instruct the surgeon to carry out specific actions. The transparent display is implemented with a tablet positioned between the surgeon and the operating field and held in place either with a robotic arm or a mechanical holder controlled by a surgical assistant. The tablet acquires a video stream of the operating field as the surgery proceeds. The video stream is sent to the mentor, who enhances it with annotations, which are sent back to the surgery site, where they appear on the transparent display.
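For illustration only, that round trip can be sketched in a few lines of Python. The queue-based transport and all function names below are placeholders, not the actual STAR software:

```python
# Toy sketch of the capture/annotate/overlay round trip described above.
import queue

to_mentor = queue.Queue()    # video frames travelling to the mentor site
to_surgeon = queue.Queue()   # annotations returning to the surgery site

def surgeon_site_send(frame):
    """The tablet acquires a frame of the operating field and streams it out."""
    to_mentor.put(frame)

def mentor_site_annotate():
    """The mentor reviews the latest frame and returns graphical annotations."""
    frame = to_mentor.get()
    # In practice the mentor draws on the received frame; here, a fixed incision line.
    to_surgeon.put([{"type": "incision_line", "points": [(100, 120), (180, 125)]}])

def surgeon_site_display(frame):
    """The transparent display shows the field plus the mentor's annotations."""
    overlay = to_surgeon.get()
    return {"frame": frame, "overlay": overlay}

# One round trip with a placeholder frame.
surgeon_site_send("frame_0")
mentor_site_annotate()
print(surgeon_site_display("frame_0"))
```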

Researchers are testing the system while performing procedures commonly carried out in operating rooms: cricothyrotomy, in which a tube is inserted into the throat to establish an airway; laparotomy, in which an incision is made in the abdomen to examine internal organs and structures; and fasciotomy, a limb-saving procedure that involves cutting the fascia, a layer of fibrous connective tissue surrounding muscles, to relieve pressure.

"We want to know whether the system as a whole is better than the state-of-the -art in telementoring," Wachs said.

The system uses computer vision algorithms to keep the annotations aligned with the quickly changing images of the surgery.

"If I take a picture now, two seconds from now the camera has changed position and the patient has changed position, so the system has to keep the annotations in the proper place by aligning specific points that exist in every image," he said.

The annotations are then said to be "anchored" from one frame to the next so that they remain in the same location no matter how much the patient and camera move.
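A minimal sketch of this kind of feature-based anchoring, assuming ORB keypoints and a RANSAC-estimated homography (the paper's actual algorithm may differ), might look like the following:

```python
# Assumed approach, for illustration only: match ORB keypoints between two
# frames, estimate a homography with RANSAC, and re-project the mentor's
# annotation points so they stay anchored to the same anatomy.
import cv2
import numpy as np

def anchor_annotations(prev_frame, curr_frame, annotation_pts):
    """Re-project 2D annotation points from prev_frame into curr_frame (BGR images)."""
    gray1 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(gray1, None)
    kp2, des2 = orb.detectAndCompute(gray2, None)
    if des1 is None or des2 is None:
        return annotation_pts  # not enough texture; leave annotations unchanged

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    if len(matches) < 4:
        return annotation_pts  # a homography needs at least four point pairs

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return annotation_pts

    pts = np.float32(annotation_pts).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```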

Researchers have tested the system thus far with animals and a manikin-like "synthetic patient simulator."

"The study provides a preliminary indication that the system allows trainees to follow some mentor instructions more accurately than existing telementoring systems," Wachs said. "Data suggest the system can provide meaningful improvements to the accuracy of surgical tasks."

The Visual Computer research paper was authored by Purdue graduate student Daniel Andersen; Popescu; Purdue graduate students Maria Cabrera and Aditya Shanghavi; Gomez; Sherri Marley, a surgical trainer at the IU School of Medicine; Mullis; and Wachs.

Ongoing research focuses on improving the robustness of annotation anchoring as the surgical field changes, increasing the anchoring frame rate, and improving the fidelity of the transparent display simulation. One complication is that because the surgeon's hands are between the camera and the surgical field, they sometimes momentarily obstruct the mentor's view. An algorithm could detect the obstruction and render the surgeon's hands semi-transparently, as sketched below.
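One way such an algorithm might work (an illustrative sketch, not the authors' method) is to treat roughly skin-toned pixels as a stand-in for hand detection and alpha-blend those regions with a previously stored, unobstructed view of the field:

```python
# Illustrative only: crude skin-color segmentation plus alpha blending so the
# hands appear semi-transparent in the video sent to the mentor.
import cv2

def render_hands_semitransparent(current_frame, clean_field, alpha=0.5):
    """Blend hand regions of current_frame with clean_field (same-size BGR images)."""
    ycrcb = cv2.cvtColor(current_frame, cv2.COLOR_BGR2YCrCb)
    # Crude skin mask in YCrCb space; the thresholds are illustrative only.
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    mask = cv2.medianBlur(mask, 7)                        # suppress speckle noise
    blended = cv2.addWeighted(current_frame, alpha, clean_field, 1.0 - alpha, 0)
    out = current_frame.copy()
    out[mask > 0] = blended[mask > 0]                     # blend only the hand pixels
    return out
```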

Another limitation is that the mentor's view of the surgical field is slightly different from the view from the tablet's camera. The researchers are working to create images that mimic the realism of looking through a window.

"The video acquired by the tablet will be warped to the view of the surgeon, which will require acquiring the operating field with a depth camera, similar to the Kinect camera, and will require tracking the surgeon's head," Popescu said.

The system requires only off-the-shelf equipment such as consumer electronics.

"We are using tablets that exist, robots that exist, we even did some experiments using Google Glass," Wachs said.

Such a system might one day be combined with sophisticated surgical robots, allowing specialists to actually perform procedures remotely.

This work was supported by the Office of the Assistant Secretary of Defense for Health Affairs under Award No. W81XWH-14-1-0042. Opinions, interpretations, conclusions and recommendations are those of the author and are not necessarily endorsed by the Department of Defense.

Written by: Emil Venere