Every so often there’s a list categorizing robots, the top ten X robots. Like this fun one that IEEE Spectrum recently did on humanoid robots, highlighting ASIMO and four more recently developed bots.
Alternatively, Heather Knight has been getting some press for the opposite kind of list. Rather than a top-10 she’s doing a robot census at CMU. Frank Tobe of the Robot Report also left some interesting stats in the comments, pointing to current robot usage numbers and projected future usage.
So far she’s tallied ~550 robots on campus, and is getting the word out to count robots more broadly. So log in and count yer bots!
As some may have noticed, this blog tends to go through a hiatus from time to time, for one reason or another over the years. The hardest part of a hiatus is starting up again with that first post. It feels like it needs to be so significant, given how much time has passed. So, to get over this first-post-back syndrome, I’ve decided that surely the most important thing to happen in robotics and HRI since December was that Barbie’s 126th career will be Computer Engineer! Which is pretty darn close to a roboticist…they were asking for accessory suggestions, and I put in my vote for a robot dog.
A fun video from Willow Garage last week. For the last couple of years they have been getting an army of interns over the summer. When I visited last summer, I was told that their staff doubled when the interns showed up! Not sure if that was true this summer as well, but it does look like the interns had some fun with the PR2 in the 3-day Intern Challenge.
The challenge was a waiter task: take a drink order, deliver the drink, pick up the mess. Both teams used a mixture of autonomous behavior and teleoperation, and it's not completely obvious from the video what is autonomous and what isn't. One of the teams created a more humorous interaction that was fun for the audience. I'm not sure if they thought of it this way, but others have suggested that self-deprecating humor can be a good interaction technique. It lowers the user's expectations in a way that is not disappointing.
It looks like the most challenging HRI task was the object handoff; it was awkward every time. The human didn't know if they were supposed to wait for the object to come to them, or meet the robot halfway. Larry Page was looking at the hand, and waiting, and then looking to a person near the camera, and then he finally got his drink. It looks like the robot should expect people to be helpful and meet it halfway (at least), especially if it's moving slowly. This and some simple force sensing to tell if the object in hand is being manipulated would be fun to try, to see if the handoff works a little better.
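To make the force-sensing idea concrete, here is a minimal sketch of what the release logic might look like. Everything here is hypothetical: the function names, the baseline grip force, and the thresholds are all made up for illustration, not taken from the PR2's actual API.

```python
# Hypothetical handoff-release logic: release the object when the
# measured grip/wrist force deviates from the steady-state baseline,
# suggesting the human partner has taken hold and is pulling.
# All names and numbers are illustrative assumptions.

def handoff_should_release(force_history, baseline, pull_threshold=2.0, samples=5):
    """Return True if the last `samples` force readings (newtons) all
    deviate from the baseline by more than pull_threshold, i.e. the
    object is probably being tugged by the partner (not just noise)."""
    recent = force_history[-samples:]
    if len(recent) < samples:
        return False
    return all(abs(f - baseline) > pull_threshold for f in recent)

# Steady grip at ~10 N, then the partner pulls and the reading jumps:
readings = [10.0, 10.1, 9.9, 10.0, 13.2, 13.5, 13.8, 14.0, 13.9]
print(handoff_should_release(readings, baseline=10.0))  # True once the pull persists
```

Requiring several consecutive out-of-band samples, rather than a single spike, is one simple way to avoid dropping the drink on sensor noise or an accidental bump.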
Previously, I’ve talked about approaching social robot behavior generation as an animation problem. I think this is a fun challenge because in many ways animating a social robot is similar to animating a Pixar character like Wall-E or Luxo Jr., but there are interesting ways in which it is different.
I’m thinking of this again since the Luxo Jr. animatronic recently made its debut at Disney’s Hollywood Studios! It is quite fun to see this classic character come to life, and interesting to see the differences between the robot character and the on-screen version. Its much slower motion, I think, creates a completely different personality. However, it is still quite expressive and fun.
Beki Grinter and other colleagues at Georgia Tech recently had a paper at Ubicomp on their ethnographic study of people’s Roomba usage. It collected quite a bit of press, an AP article, robots.net, and even inspired Comedy Central’s Colbert Report to feature robots as the #1 Threat Down…nice!
These Roomba studies always leave me wondering about the people that we are not hearing from. I personally know a lot of people who stopped using a Roomba: they either didn’t find that it cleaned very well, or got tired of “roomba-izing” their houses. It’s fascinating to learn about the Roomba fans, but as a counterpoint it would be great to see some interviews and analysis of the non-fans too.
They highlighted a couple of the big social robots themes, healthcare and eldercare. A highlight was definitely when Diane Sawyer was moving Domo’s arm around asking if it could “feel” it (compliant, force-sensitive manipulators were Aaron’s focus with Domo)…and when she pulled too hard Domo said “ouch.”
The robot moves expressively, orienting its lamp head in a social way with the actors and changing its light color, reminiscent of Pixar’s Luxo Jr. Most of the short play consists of the two actors taking turns talking with the robot. With only head orientations and color changes, the robot is really able to convincingly hold up its end of the conversation, as an empathic listener.
Several of us went to the opening night performance yesterday. I think one of the most interesting aspects of the play, for a robotics and HRI researcher, is the truly natural interaction. Any person who is around robots a lot knows that most of the time they do not work, and that they are subject to randomly not working after long periods of working. Thus, it was surprising to see the actors touching the robot, putting their faces very close to the robot, and generally having very human-like interactions with it. I know that all of us roboticists in the audience were holding our breath, hoping there would be no loose connections or disastrous software glitches. However, I think we were the only ones: the actors looked completely comfortable, and the audience seemed to find this kind of human-robot interaction completely normal…sure, robot actors, no big deal.
We can blame Hollywood for the greater-than-fiction expectations that people have about robots. To me, this play pointed out a particular aspect of human-robot interaction that we don’t always address with the robots in our research labs. People are going to want to touch and be close to the robot…. So, the robot has to be able to handle being touched (i.e., are your motors back-drivable?), and also has to move in a way that is safe for a human to be very close to (i.e., what does your robot do when it hits something?).
On that note, I’ll end with this video as food for thought. It’s a demonstration from the German Aerospace Center, showing a collision detection and reaction algorithm for safe physical human-robot interaction. I think I’ll wait for version 2.0, it still looks pretty painful. The video is long; if you tire of the chest and arm collisions, fast forward to the head collisions at the end (!)
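The core idea behind demos like this can be sketched very simply: compare the joint torques you measure to the torques your motion model predicts, and treat a large residual as unexpected contact. This is only a toy illustration of that residual-threshold idea; the function names, units, and threshold values are my own assumptions, not the DLR algorithm.

```python
# Toy sketch of residual-based collision detection for physical HRI.
# A real controller would use the robot's dynamics model to predict
# torques; here we just compare two lists. All numbers are illustrative.

def detect_collision(measured_torques, predicted_torques, threshold=1.5):
    """Flag a collision when any joint's torque residual (N*m) exceeds
    the threshold, i.e. the arm is pushing against something unmodeled."""
    residuals = [abs(m - p) for m, p in zip(measured_torques, predicted_torques)]
    return max(residuals) > threshold

def control_step(measured, predicted, command):
    # On detected contact, override the motion command with a safe stop.
    if detect_collision(measured, predicted):
        return [0.0] * len(command)  # halt all joints
    return command

# Joint 2's measured torque (4.2) is far from the predicted 0.6 -> stop:
print(control_step([0.5, 4.2], [0.4, 0.6], [1.0, 1.0]))  # [0.0, 0.0]
```

Stopping is the simplest reaction; the interesting design question the video raises is whether the robot should instead actively retract, and how fast it has to react before the collision hurts.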
Something light for Friday…. This little cartoon reminds me of a study that the People and Robots lab at CMU did a while back. One of the things they found was that its limitations were among the endearing qualities of the Roomba: people liked that it needed their help and assistance. I don’t think this means we should make robots that don’t work on purpose; people would presumably prefer a robot that works as advertised. But it does say something about how a robot can take advantage of the human in the loop, and argues for taking this into account during the design process. (Thanks SB.)
A little late due to my brief hiatus in Brazil, but the recent Keepon YouTube craze has reminded me to post about my favorite papers at HRI 2007 — all related to the importance of timing and rhythm in social robots.
M.P. Michalowski, S. Sabanovic, H. Kozima, “A dancing robot for rhythmic social interaction.” This work is motivated by the observation that timing and rhythm are a fundamental aspect of social interaction. While the field tends to focus on dialog or gestures or the content of interaction, this work aims to understand how robots can get the underlying timing of social interaction right. To test these ideas, they are working with the robot Keepon, giving it the ability to dance. They look, for example, at whether kids follow the beat of the music or the beat of Keepon when Keepon is not dancing to the music’s beat.
Recently Marek’s video got to the front page of YouTube, was a big hit, and got tons of response videos. – People – love – dancing – robots…
Guy Hoffman and Cynthia Breazeal, “Effects of Anticipatory Action on Human-Robot Teamwork: Efficiency, Fluency, and Perception of Team.” This is another unique perspective on social robots, and was the best student paper. It’s inspired by the idea that two people working together on a task get better with practice. The team speeds up as the timing of the joint activity is learned and each partner specializes their role. I think this is interesting work because people don’t often think about the behavior of a robot needing practice. It’s programmed and it’s done. But when we’re talking about Human-Robot Interaction and joint activity, the concept of practice and learning to anticipate the partner’s actions becomes fundamentally important.
Gil Weinberg and Scott Driscoll, “The Interactive Robotic Percussionist: New Developments in Form, Mechanics, Perception, and Interaction Design.” And finally, I thought the Haile robot was among the best work presented at HRI. This project is about machine listening: the robot has a single arm for playing the drums, and is able to listen to the drumming of a partner and play along. Like the above papers, this work acknowledges that much of the intelligence of social robots is in the timing of the interaction and anticipating your partner. Music is a prime example where, with bad timing, everything breaks down. But this concept of timing and rhythm transfers to all forms of social interaction.