Meet Melvin, the Collaborative Robot

by Charles Rich

Charles Rich, right, and Aaron Holroyd, a senior majoring in computer science and robotics engineering, in the “play kitchen” with Melvin.

If human-like robots are ever going to move freely among us, we will need to understand how to program them to collaborate with us smoothly and naturally.

If you believe what you see in science fiction movies, human-like robots are just around the corner—if not already operational in secret research labs. The truth is, though robots are common in manufacturing and industry today, they do not interact directly with humans. For safety reasons, most robots operate inside cages or in restricted areas. (The turtle-like floor-cleaning robots developed by iRobot, a Boston-area company, are notable exceptions.)

If human-like robots are ever going to move freely among us in a broad range of situations (assisting the handicapped and elderly, conducting search-and-rescue operations, or playing roles in sales and entertainment), we will need to do more than make them safer; we will need to understand how to program them to collaborate with us smoothly and naturally.

Smile When You Say That

His designers gave Melvin an expressive face so he could mimic a range of human emotions. Can you guess which emotions he is expressing in these three photos? (Answer below.)

From top: surprised, angry, bewildered

When you work with another human, your interaction consists not only of what you say (verbal behavior), but also of how you move your body (nonverbal behavior). This intricately timed physical dance has tacit rules regarding, for example, where you look, when you nod your head, how you gesture with your hands, how you orient your body, and how long you wait for a response. A robot that does not correctly follow these nonverbal interaction rules will be difficult to work with.
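To make the idea of a nonverbal interaction rule concrete, here is a minimal sketch in Python of how one such timing rule might be written down. The event structure, the rule itself, and the 0.2 to 1.0 second nod window are illustrative assumptions, not the actual rules or software used with Melvin.

# Illustrative only: one possible encoding of a nonverbal timing rule.
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    kind: str        # e.g. "utterance_end", "gaze_at_partner", "nod"
    actor: str       # "human" or "robot"
    time: float      # seconds since the interaction started

def robot_should_nod(events, now):
    """Rule of thumb (assumed for illustration): acknowledge the partner with a
    nod shortly after they finish speaking, if the robot has not yet responded."""
    last_human_utterance = max(
        (e for e in events if e.kind == "utterance_end" and e.actor == "human"),
        key=lambda e: e.time,
        default=None,
    )
    if last_human_utterance is None:
        return False
    delay = now - last_human_utterance.time
    already_responded = any(
        e.actor == "robot" and e.time > last_human_utterance.time for e in events
    )
    # Fire only inside a short window (here, 0.2 to 1.0 seconds) after speech ends.
    return 0.2 <= delay <= 1.0 and not already_responded

# Example: the human stopped speaking 0.5 s ago and the robot has not reacted,
# so the rule says the robot should nod now.
history = [InteractionEvent("utterance_end", "human", time=10.0)]
print(robot_should_nod(history, now=10.5))   # True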

When I joined WPI last year, I brought with me a unique tool for this kind of research: a humanoid robot named Melvin (he is one of only two such units—the other is at Indiana University). Built by the University of Sherbrooke and RoboMotio Inc., Melvin has a moveable head and arms, and an expressive face—for a total of 19 degrees of freedom (including his two-wheeled mobile base). A speaker, microphone array, and stereo camera let him talk, hear, and see. Melvin is connected to several computers that run various kinds of artificial intelligence software, including programs for computer vision, natural language and speech understanding and generation, planning, and dialogue modeling. Together, these programs support his totally autonomous interaction with humans.
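For readers curious how these pieces fit together, the following is a highly simplified sketch in Python of a sense-decide-act loop connecting perception, dialogue, and motor modules. The class names and methods are placeholders invented for illustration; they do not reflect the actual interfaces of Melvin’s software.

# Illustrative only: a toy sense-decide-act loop for an autonomous humanoid.
import time

class Perception:
    """Stands in for computer vision and the microphone array: reports what
    the robot currently sees and hears."""
    def observe(self):
        return {"face_detected": True, "speech": None}

class DialogueManager:
    """Stands in for language understanding, planning, and dialogue modeling:
    decides what to say and do next, given the latest observations."""
    def decide(self, observation):
        if observation["face_detected"]:
            return {"say": "Hello!", "gaze": "partner", "gesture": "wave"}
        return {"say": None, "gaze": "scan_room", "gesture": None}

class Body:
    """Stands in for speech output and the motorized degrees of freedom
    (head, face, arms, mobile base)."""
    def act(self, action):
        print("robot action:", action)

def run(cycles=3, hz=10):
    perception, dialogue, body = Perception(), DialogueManager(), Body()
    for _ in range(cycles):
        body.act(dialogue.decide(perception.observe()))
        time.sleep(1 / hz)   # repeat at a fixed rate

run()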

The current focus of our research with Melvin is on two-participant collaborations. We are beginning by videotaping pairs of people (no robot) interacting in a “play kitchen” setup. By studying the tapes, we will develop a detailed catalog of the subjects’ nonverbal behaviors. Next, we will program Melvin to follow the nonverbal interaction rules we discover through this analysis. Finally, we will test our work by conducting formal user studies with Melvin and a human subject (in the same play kitchen) to see how easy it is for a human to collaborate with Melvin.
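As a rough illustration of what developing such a catalog might look like computationally, the short sketch below tallies hand-coded video annotations into per-behavior counts and average timings. The behavior labels and numbers are invented for the example; they are not data from our study.

# Illustrative only: summarizing hypothetical video annotations into a catalog.
from collections import defaultdict
from statistics import mean

# Each annotation: (behavior, seconds after the partner stopped speaking)
annotations = [
    ("nod", 0.4), ("nod", 0.7), ("gaze_at_partner", 0.1),
    ("gaze_at_object", 1.2), ("nod", 0.5),
]

catalog = defaultdict(list)
for behavior, delay in annotations:
    catalog[behavior].append(delay)

for behavior, delays in catalog.items():
    print(f"{behavior}: {len(delays)} occurrences, "
          f"mean delay {mean(delays):.2f} s after utterance end")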

Our future research plans include studying three-participant collaborations (two humans and one robot), which have much more complicated rules; introducing robot mobility, to see if Melvin can appropriately manage his body position, gaze, and gestures as he talks with someone while walking down a hall; and investigating the role of facial expressions of emotion (see the photos above) in effective collaboration.

As with all research at WPI, this project offers opportunities for both graduate and undergraduate student participation. It has already attracted students enrolled in WPI’s new, first-of-its-kind undergraduate degree program in robotics engineering.

Rich is a professor of computer science at WPI and a member of the Interactive Media and Game Development faculty. This work is supported, in part, by the National Science Foundation under award IIS-0811942.