The needle tip hovers.

All eyes watch Anthony Meyer, professor of surgery, attempt the biopsy again. Another swipe of the ultrasound reveals the target, a possibly cancerous tumor somewhere inside the patient’s liver.

Meyer narrates for a video camera as he tries again: “I can see through an aperture provided by digital reconstruction,” he says. “Okay, I’ve opened the probe. It’s armed.”

Meyer inserts the five-centimeter-long biopsy needle into the patient’s abdomen. Once the needle penetrates a suspect tumor, with a push of a button he can secure a tiny core sample.

“I’ll shoot the probe. Is anything there?” he asks. Assistants lean in as he wipes the probe on a piece of blue surgical cloth. “Yes, it’s green, there’s the grape!” Everyone cheers. The patient doesn’t even blink.

It’s a successful biopsy.

The diseased “liver” is really a bowl full of jello with a green grape standing in for the tumor. The patient is a plastic mannequin named Tall Paul. But the ultrasound, the probe, and the digital images making it possible are all very real.

The idea of augmenting one’s reality before surgery most definitely wouldn’t pass the Institutional Review Board without a hitch. But surgeons collaborating with computer scientists have brought augmented reality (AR) a step closer to the operating room. Like its more famous cousin virtual reality, AR renders three-dimensional images. The biggest difference rests in the headset. In virtual reality, the headset blocks out all other sight. An AR headset leaves room for the surgeon’s peripheral vision.

A team of computer scientists creates the images of organs a surgeon couldn’t otherwise see inside a patient. Meyer works closely with Henry Fuchs, professor of computer science; Andrei State, senior research associate; and several graduate students and research faculty. All are members of the Medical Image Display and Analysis Group (MIDAG).

“We’d like the doctor to have as much awareness of the real world as possible,” says Jeremy Ackerman, a third-year M.D.-Ph.D. student working on a biomedical engineering degree. “So we’re combining a three-dimensional image, acquired via computed tomography and ultrasound, with what the surgeon would normally see.”

In other words, X-ray vision.

For instance, when Meyer donned the AR headset for surgery, two liquid crystal displays—much like mini-laptop computers—perched like a pair of thick sunglasses before his nose. Glancing up, Meyer saw what the computer had drawn, a hole superimposed on the patient’s abdomen. A computed tomography slice, a CAT scan, showed the internal organs in position. When he moved his head, the hole and the organs inside moved, too. Right inside the cavity, an ultrasound image of the tumor gave Meyer a target. The headset conveniently overlaid all the images, including a bright green line showing his surgeon’s needle in relation to it all.

Like scavengers at a salvage yard, the scientists put together the headset from available parts found at medical and computer-equipment tradeshows. They also built some of it from scratch. On top of the headset, three light-emitting diodes—like those in flashing bicycle reflectors—flicker steadily. A tracking device above the operating table picks up the flashes and sends them to a computer. As Meyer’s headset moves like a boat—pitch, yaw, and roll—in three dimensions, the computer knows exactly where he is in relation to the patient.
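The tracking step can be sketched in code. This is a minimal sketch, not the MIDAG system itself: it assumes three non-collinear LEDs whose positions are known in the headset’s own coordinate frame (the layout and all numbers here are hypothetical) and whose flashes the overhead tracker locates in room coordinates. Comparing the two triangles recovers the headset’s rotation and translation relative to the room:

```python
import math

# Small 3-vector helpers (pure Python, no external libraries).
def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def frame(p1, p2, p3):
    """Orthonormal basis (rows e1, e2, e3) built from a triangle of points."""
    e1 = normalize(sub(p2, p1))
    u = sub(p3, p1)
    e2 = normalize(sub(u, [dot(u, e1) * c for c in e1]))  # Gram-Schmidt step
    e3 = cross(e1, e2)
    return [e1, e2, e3]

def estimate_pose(local_pts, world_pts):
    """Rotation R and translation t such that world = R @ local + t."""
    L = frame(*local_pts)   # LED triangle in headset coordinates
    W = frame(*world_pts)   # same triangle as seen by the tracker
    # The rigid motion maps the local basis onto the world basis: R = W^T L.
    R = [[sum(W[k][i] * L[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    Rl = [sum(R[i][j] * local_pts[0][j] for j in range(3)) for i in range(3)]
    t = sub(world_pts[0], Rl)
    return R, t
```

With the pose in hand, the renderer can place the CT and ultrasound imagery so it stays locked to the patient as the surgeon’s head pitches, yaws, and rolls.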

“The offset takes a little getting used to,” Ackerman says. “That’s the difference between what you see and what the headset shows you.”

The quirks about using augmented reality intrigue Ackerman the most. He’s studying whether AR improves surgery by coming up with simulated tasks that ferret out the technology’s weak points. “I want to know the key component. Is it because current technology—the displays and tracking—isn’t good enough, or do we not draw the pictures well enough?”

Some of the most minute problems may be the hardest to solve. Already AR is being used by architects who visualize structures before they’re built. Boeing fits aircraft workers with headsets so they can install miles of 747 wiring without having to look at a wiring diagram.

“In aircraft construction there are other ways to verify the work. But here you have someone’s life on the table,” Ackerman says, pointing to the operating table. “The stakes are much higher. We’re talking about operating on blood vessels where a millimeter or two in the wrong direction can be fatal. You have to be right the first time.”

“Dr. Fuchs says we need to take on the biggest challenge, so when we succeed, we’ve really accomplished something,” he says.

Fuchs would know. The computer scientist has racked up a number of awards, including the Satava Award for outstanding contribution to virtual reality as a tool for health care. He won the award at the 1997 Medicine Meets Virtual Reality conference, where physicians, health care providers, researchers, educators, and, yes, investors rub shoulders. While his college classmates sit astride silicon empires, Fuchs found success in academia.

“Why? Because here I have the latitude to do all sorts of wild and crazy things,” he says. “In a really good department like this one, they don’t ask you how well you do X, Y, or Z. They ask, ‘How well do you do the things that you’ve chosen to do?’ If I were asked to do a particular specific thing, I’d probably fail. I’m more self-directed than is probably good for me.”

Nearly 11 years ago, Fuchs watched his pregnant wife undergo her first amniocentesis, a test for possible abnormalities in their developing baby. The process looked awkward. What if an ultrasound image could go inside the patient? Fuchs thought. “It seemed pretty obvious,” he says. “If you could just put it on your head-mounted display and have it registered, you could see, ‘Where’s the baby?’” Now he sees other applications, from liver biopsies to televised 3-D surgery. Already, the AR headset has been used with success in two breast biopsies.

Part of Fuchs’ success depends on his free-wheeling research style. During the liver biopsy experiment, no one researcher called the shots. A graduate student like Ackerman traded suggestions with Fuchs and Meyer. “I love it that way,” Fuchs says. “A new student can suggest new things that will take us in a new direction. What helps is having a mix of people who can thrive in that kind of chaos.”

As he escorts Meyer to Sitterson Hall’s back door, Fuchs asks about the grape experiment. The door opens to a blustery winter afternoon. They can’t decide when next to meet, since Fuchs will be at a conference and Meyer’s schedule looks well booked. Meyer explains that with some organs, it’s much harder to tell by feel where a tumor is, so the imaging will be that much more important. Maybe, he says, they should come up with a softer target than a grape. Fuchs agrees. He looks at his watch. He’s got time, he says, to try one more idea, and he wants to run it by the others.

Christopher Hammond was a student contributor to Endeavors.