So, Where’s My Robot?

Thoughts on Social Machine Learning

Inspirations from a Robot Actor


If you’re in the Boston area, you have two more chances to see the AUR robotic lamp in its theatrical debut. Guy Hoffman’s robot is performing in a play written by another Media Lab student, Rony Kubat.

The robot moves expressively, orienting its lamp head toward the actors in a social way and changing its light color, reminiscent of Pixar’s Luxo Jr. Most of the short play consists of the two actors taking turns talking with the robot. With only head orientation and color changes, the robot convincingly holds up its end of the conversation as an empathic listener.

Several of us went to the opening-night performance yesterday. For a robotics and HRI researcher, I think one of the most interesting aspects of the play is the truly natural interaction. Anyone who spends time around robots knows that most of the time they do not work, and that they are prone to failing at random even after long stretches of working. So it was surprising to see the actors touching the robot, putting their faces very close to it, and generally interacting with it in very human-like ways. I know that all of us roboticists in the audience were holding our breath, hoping there would be no loose connections or disastrous software glitches. But I think we were the only ones: the actors looked completely comfortable, and the audience seemed to find this kind of human-robot interaction completely normal…sure, robot actors, no big deal.

We can blame Hollywood for the larger-than-life expectations people have about robots. To me, this play pointed out a particular aspect of human-robot interaction that we don’t always address with the robots in our research labs: people are going to want to touch the robot and be close to it. So the robot has to be able to handle being touched (i.e., are your motors back-drivable?), and it has to move in a way that is safe for a human standing very close (i.e., what does your robot do when it hits something?).
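As a thought experiment on that second question, here is a minimal sketch of one common reaction strategy: watch for torque that the planned motion cannot explain, and switch into a compliant mode the moment it appears. Everything in it (the `SimulatedJoint` interface, the constants) is invented for illustration; it is not any particular robot’s API.

```python
import time

# Minimal sketch of "what does your robot do when it hits something?"
# SimulatedJoint is a made-up stand-in for a real motor driver; all
# names and numbers here are invented for illustration.

COLLISION_THRESHOLD_NM = 2.0   # torque deviation we treat as contact
LOOP_PERIOD_S = 0.01           # 100 Hz monitoring loop

class SimulatedJoint:
    """Stand-in for a motor interface that reports torque (e.g., from
    current sensing) and can drop into a compliant, back-drivable mode."""
    def __init__(self):
        self.t = 0.0

    def read_torque(self):
        self.t += LOOP_PERIOD_S
        return 0.5 if self.t < 0.3 else 4.0   # a "bump" arrives after 0.3 s

    def expected_torque(self):
        return 0.5                            # what the motion should need

    def stop(self):
        print("trajectory cancelled")

    def float_mode(self):
        print("switching to compliant, gravity-compensated mode")

def monitor_contact(joint):
    """If measured torque deviates from the model's prediction by more
    than a threshold, assume contact: stop the motion and go compliant."""
    while True:
        deviation = abs(joint.read_torque() - joint.expected_torque())
        if deviation > COLLISION_THRESHOLD_NM:
            joint.stop()
            joint.float_mode()
            return
        time.sleep(LOOP_PERIOD_S)

monitor_contact(SimulatedJoint())
```

A real controller would run this much faster and inside the torque loop, but the shape of the logic is the same: detect the unexpected, then yield rather than push.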

On that note, I’ll end with this video as food for thought. It’s a demonstration from the German Aerospace Center (DLR), showing a collision detection and reaction algorithm for safe physical human-robot interaction. I think I’ll wait for version 2.0; it still looks pretty painful. The video is long, so if you tire of the chest and arm collisions, fast-forward to the head collisions at the end (!)
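For the curious: one well-known way to detect collisions without skin sensors, common in the physical HRI literature, is a momentum-observer residual that estimates external torque from quantities the robot already measures. Below is a toy single-joint simulation in that spirit; I should stress this is my own sketch with invented constants, not the implementation from the video.

```python
# Toy 1-DOF simulation of a momentum-observer collision detector.
# All constants are invented for illustration; a real arm runs the
# same computation with its full multi-joint dynamics model.

I = 0.05        # link inertia [kg m^2]
B = 0.02        # viscous friction [N m s/rad]
K_OBS = 50.0    # observer gain: residual tracks contact torque at ~K_OBS rad/s
THRESH = 0.5    # residual magnitude we call a collision [N m]
DT = 0.001      # simulation step [s]

qdot = 0.0      # joint velocity
integral = 0.0  # running integral of the modeled torques (plus residual)
residual = 0.0  # estimate of external (contact) torque

for step in range(3000):
    t = step * DT
    tau_cmd = 0.1                        # constant drive torque
    tau_ext = -1.0 if t > 1.5 else 0.0   # a "collision" pushes back after 1.5 s

    # Plant: I * qddot = tau_cmd - B * qdot + tau_ext
    qdot += DT * (tau_cmd - B * qdot + tau_ext) / I

    # Observer: residual = K * (momentum - integral of modeled torques);
    # it converges to tau_ext without ever measuring acceleration.
    integral += DT * (tau_cmd - B * qdot + residual)
    residual = K_OBS * (I * qdot - integral)

    if abs(residual) > THRESH:
        print(f"collision detected at t = {t:.3f} s, "
              f"estimated contact torque ~ {residual:.2f} N m")
        break
```

The nice property is that the residual reacts within milliseconds of contact using only the sensors the robot already has, which is presumably part of what makes whole-body contact demos like this one possible.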
