Virtually Real

by Alexander Gelfand

In the late 1950s, filmmaker Morton Heilig developed a remarkable machine called the Sensorama. Seated before it, a viewer watched and listened to a 3-D stereoscopic film of a motorcycle ride while his seat tilted or vibrated to simulate the feel of the moving bike and fans blew wind scented with outdoor smells in his face. It was the beginning of virtual reality.

It would take computers decades to become powerful enough to approach Heilig’s achievement. Today, drawing on that power, scientists and engineers are beginning to realize the long-hoped-for promise of virtual and augmented reality. At WPI, researchers are working at the forefront of this field, developing technology that lets users experience virtual worlds with all of their senses, helping surgeons use real-time medical imaging to look inside the body as they operate, and using virtual reality to help us see the real world in new ways.

Mirroring Real Life in Second Life

“I’m interested in game technology, but as an artist, I want to use it as a medium for exploring ideas,” says Joseph Farbrook, assistant professor in WPI’s Humanities and Arts and Interactive Media & Game Development programs and an artist who helps students learn to create provocative art using digital technology.

Thus was born Strata-Caster, a virtual art installation that allows viewers to navigate an artificial world using a controller that looks — and behaves — like a wheelchair.

Users sit in the chair and spin its wheels to move through a series of themed installation spaces — or rooms — that are projected on a large screen. The rooms were built in Second Life, the popular virtual-reality world where players can adopt identities, or avatars, of their choosing. Each space is filled with objects that Farbrook designed himself or bought within Second Life.

And therein lies the crux of the piece. “People are purchasing things in Second Life and making replicas of all the things we have in the real world,” Farbrook says. “We’re bringing our physical culture into a place where it doesn’t have any relevance. But we’re so programmed by our present culture that we can’t let go of it.”

The notion that we are unnecessarily importing real-world concepts into virtual-reality environments applies to more abstract baggage, as well. Conflict, social hierarchy, economic disparity . . . these things are also replicated in Second Life, Farbrook says. “People say that’s just the way things are. But it’s not necessarily the way things have to be. These ideas have just become so ingrained in our culture that we take them for granted.”

Hence the wheelchair. The idea, Farbrook explains, is to “de-familiarize” viewers by putting them in a setting they are not used to, and then have them roll through a series of environments that do much the same thing, thereby calling attention to “cultural constructions that are totally arbitrary, and that may not even be relevant to our time.”

Making high-tech, high-concept art isn’t easy. The undergraduate computer science and robotics majors who designed the wheelchair interface used light sensors to track the motion of the wheels, but Second Life accepts only arrow-key input or its letter-key equivalents. So after coding a virtual wheelchair that obeys the laws of physics, they had to translate all of the wheel data into a series of keystrokes.
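To get a feel for what that translation involves, here is a minimal Python sketch of one way wheel-rotation readings might be mapped to Second Life’s arrow keys. The thresholds, function names, and turning convention are illustrative assumptions, not the students’ actual code.

```python
# A hedged sketch of the wheel-to-keystroke translation described above.
# The sensor interface and thresholds are assumptions, not the real implementation.

def wheels_to_keys(left_delta: float, right_delta: float, threshold: float = 0.05):
    """Map per-frame wheel rotations (in revolutions) to Second Life arrow keys.

    Both wheels forward  -> move forward ("Up")
    Both wheels backward -> move backward ("Down")
    Wheels opposed       -> turn in place ("Left" / "Right")
    """
    keys = []
    if left_delta > threshold and right_delta > threshold:
        keys.append("Up")
    elif left_delta < -threshold and right_delta < -threshold:
        keys.append("Down")
    elif left_delta > threshold and right_delta < -threshold:
        keys.append("Right")   # pushing the left wheel forward turns the chair right
    elif left_delta < -threshold and right_delta > threshold:
        keys.append("Left")
    return keys

# Example readings from the light sensors on each wheel:
print(wheels_to_keys(0.3, 0.3))    # ['Up']   -- both wheels pushed forward
print(wheels_to_keys(0.4, -0.1))   # ['Right'] -- wheels opposed, chair turns
```

In a working system, each key name would then be fed to a keyboard-emulation layer so that Second Life sees ordinary arrow-key presses.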

It took a lot of work, but by the time Farbrook presented Strata-Caster at the 37th annual SIGGRAPH conference on computer graphics and interactive techniques in the summer of 2010, “the wheelchair really behaved like a wheelchair.”

The conference organizers built an enclosed space for him, veiled with big black curtains, and rigged a projector to avoid casting shadows on the 15-foot diagonal screen.

“It was a very immersive experience,” Farbrook says.

Gregory Fischer, right, and PhD candidate Hao Su with a robot designed to implant electrodes for deep-brain stimulation while inside an MRI scanner.

A Virtual Assist for Surgeons

Gregory Fischer is quick to point out the potential benefits of using robots and magnetic resonance imaging (MRI) to guide delicate surgical procedures like inserting electrodes deep in the brain or implanting tiny radioactive seeds in the prostate to kill cancerous tumors.

For one thing, MRI images can be continuously updated, allowing doctors to compensate for the way internal organs shift and swell when poked and prodded by needles and probes. “Instead of working from stale images on a light box in an operating room, we are focusing on using real-time, hi-res images,” says the assistant professor of mechanical engineering.

And using robotic devices to align and insert those needles and probes from inside the MRI scanner would obviate the need for doctors to work within the constraints of a tube that is roughly 5 feet long and 2 feet wide. There’s only one problem: MRI scanners wreak havoc on electronic devices, and electronic devices and metallic objects wreak havoc on MRI scanners.

MRI scanners work by bathing patients in a magnetic field 30,000 to 60,000 times as strong as the Earth’s (which measures only about 50 microtesla), bombarding them with radio waves, and interpreting the electromagnetic signals that the molecules within their bodies generate in response. That 1.5- to 3-tesla magnetic field has been known to suck in wheelchairs and hospital beds, so putting ferrous materials inside the scanner is clearly prohibited. Meanwhile, placing anything inside the scanner’s bore that generates its own electrical signals will create image-marring artifacts. That rules out just about all standard electromagnetic parts in robots, including actuators, sensors, and controllers.

The solution? Fischer is developing robots made without metal, using custom-built electronic components that will neither interfere with, nor be destroyed by, MRI technology. He and his team have so far experimented with pneumatic, hydraulic, and ceramic piezoelectric actuators. And he’s using fiber-optic force sensors to build remote-controlled robots that can provide haptic feedback: a doctor standing several feet from the scanner, guiding a robot arm as it inserts a needle in someone’s body, will feel the same resistance as the robot itself. “Doctors,” he says, “are reassured when the sense of touch is restored.”
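The force-reflection idea can be pictured with a small sketch: a reading from a fiber-optic force sensor on the robot is scaled and clamped, then echoed back through the surgeon’s hand controller. The scaling, the safety limit, and the function names are assumptions for illustration, not Fischer’s control code.

```python
# A simplified sketch of a force-reflecting haptic loop, under assumed parameters.

def reflect_force(sensed_newtons: float, scale: float = 1.0, limit: float = 10.0) -> float:
    """Return the force (N) the hand controller should push back with."""
    commanded = scale * sensed_newtons
    return max(-limit, min(limit, commanded))   # clamp to a safe range

# Each cycle of the (assumed) control loop echoes the needle-insertion force:
for sensed in [0.2, 1.5, 3.0]:                  # example fiber-optic sensor readings
    print(f"surgeon feels {reflect_force(sensed):.1f} N of resistance")
```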

A robotic needle-placement system customized for prostate work is currently being tested at Brigham and Women’s Hospital in Boston, and Fischer plans to begin testing a system tailored for deep-brain stimulation as soon as possible at the University of Massachusetts Medical School. In each case, he’s trying to build systems that doctors and nurses can maintain with minimal fuss. “We’re trying to make these systems very easy to use — simple and reliable,” he says.

After all, brain surgery is hard enough without having to reboot your robot helper.

So Real You Can Touch It

Imagine that you are standing inside a room in a virtual environment. You open a window, and feel a breeze caress your cheek. You hear a voice behind you, and as you shift your weight to walk toward it, you feel your right shoulder graze the windowsill.

Most virtual reality (VR) environments don’t feel quite so real. But Robert Lindeman, associate professor of computer science and director of the Human Interaction in Virtual Environments (HIVE) lab, is hoping to change that.

“In VR, researchers typically focus on one of the senses,” Lindeman says. “But in the real world, we use all of our senses together. So I think to be really effective, we need to look at the senses not in isolation, but in concert.”

That integrated, holistic approach has led Lindeman to investigate everything from sound and sight to touch and smell, all with an eye toward increasing a person’s sense of “presence” — the subjective feeling one has of being totally immersed in a virtual environment.

Inside the TactaCage, Robert Lindeman, associate professor of computer science, right, and PhD candidate Paulo de Barros outfit a mannequin with the TactaVest, a device that adds the sense of touch to virtual worlds. PhD candidate Jia Wang, seated, left, and master’s candidate Tonje Stolpestad run a student-designed aerial surfing simulation inspired by the Marvel Comics superhero Silver Surfer.

And those investigations have led to a series of inventions, like the TactaVest, a modular neoprene garment studded with minuscule pager motors of the sort that make cell phones and Sony PlayStation controllers vibrate. The TactaVest includes segments for the shoulders, elbows, and back, and individual motors can be made to vibrate with varying intensity when the person wearing the device brushes up against objects in a virtual world or is hit by a bullet in a first-person shooter game.
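As a rough illustration of how such a vest might be driven, the sketch below maps a collision in the virtual world to a vibration level on one vest segment. The segment names, the 0–255 intensity scale, and the send_to_motor placeholder are assumptions, not the TactaVest’s real interface.

```python
# A hedged sketch of mapping virtual collisions to pager-motor intensities.

SEGMENTS = ["left_shoulder", "right_shoulder", "left_elbow", "right_elbow", "back"]

def collision_to_intensity(segment: str, impact_speed: float, max_speed: float = 5.0) -> int:
    """Scale an impact speed (m/s) to an 8-bit vibration level for one vest segment."""
    if segment not in SEGMENTS:
        raise ValueError(f"unknown vest segment: {segment}")
    level = min(impact_speed / max_speed, 1.0)   # clamp to the motor's range
    return int(level * 255)

def send_to_motor(segment: str, intensity: int) -> None:
    # Placeholder for the hardware call that would drive the actual pager motor.
    print(f"{segment}: vibrate at {intensity}/255")

# A grazing touch on the shoulder versus a fast hit to the back:
send_to_motor("right_shoulder", collision_to_intensity("right_shoulder", 0.5))
send_to_motor("back", collision_to_intensity("back", 4.8))
```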

One of Lindeman’s students is adapting this “vibrotactile” approach to represent sensor data from teleoperated robot systems like military drone planes. The data from such robots is typically represented graphically on computer screens, forcing operators to interpret a great deal of visual information. Offloading some of that information to other senses gives them “an experience that’s closer to the real world, because it’s multimodal and multisensory.”

Another student is adapting the kind of balance board used in Nintendo’s Wii Fit exercise system to let users “surf” through virtual space simply by shifting their weight as they stand atop an elevated platform loaded with motion sensors. And Lindeman himself recently investigated the use of bone-conduction technology to generate directional sound in augmented and virtual reality environments. The technology transmits vibrations through the mastoid bone to the cochlea, allowing computer-generated sounds to mingle with real-world ones.
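One simple way to picture the balance-board interface described above is as a function from four load-cell readings to a forward speed and a turn rate, as in the hypothetical sketch below. The sensor layout and gain values are illustrative guesses, not the student’s implementation.

```python
# A rough sketch of turning balance-board load-cell readings into "surfing" motion,
# under assumed sensor placement and arbitrary gains.

def board_to_motion(front_left: float, front_right: float,
                    back_left: float, back_right: float):
    """Convert four load-cell weights (kg) into forward speed and turn rate."""
    total = front_left + front_right + back_left + back_right
    if total <= 0:
        return 0.0, 0.0                       # nobody on the board
    # Leaning forward drives you ahead; leaning sideways carves a turn.
    forward_bias = (front_left + front_right - back_left - back_right) / total
    side_bias = (front_right + back_right - front_left - back_left) / total
    speed = 3.0 * forward_bias                # gain chosen arbitrarily (m/s)
    turn_rate = 45.0 * side_bias              # degrees per second
    return speed, turn_rate

# Rider leaning forward and slightly to the right:
print(board_to_motion(25.0, 30.0, 10.0, 15.0))
```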

Now, Lindeman has combined the fruits of these various endeavors to create the TactaCage, an octagonal frame built out of PVC tubes and outfitted with cameras and PC fans. The cage’s occupant dons a VR headset, headphones, and a TactaVest, and stands on top of a balance board dubbed the “Silver Surfer.” He shifts his weight to move forward, and feels fan-driven air move against his body. He hears sounds with specific points of origin, and feels something in his back or elbow when he bumps into a virtual wall. Before long, a user will be able to don a motion capture suit festooned with light-emitting diodes while cameras in the cage capture his physical movements and represent them in the virtual environment. Imagine raising your hand in Second Life and seeing your avatar do the same.

“We’re trying to get closer to reality,” Lindeman says.