Every so often there’s a list categorizing robots, the top ten X robots. Like this fun one that IEEE Spectrum recently did on humanoid robots, highlighting ASIMO and four more recently developed bots.
Alternatively, Heather Knight has been getting some press for the opposite kind of list. Rather than a top-10 she’s doing a robot census at CMU. Frank Tobe of the Robot Report also left some interesting stats in the comments, pointing to current robot usage numbers and projected future usage.
So far she’s tallied ~550 robots on campus, and is getting the word out to count robots more broadly. So log in and count yer bots!
Charlie Kemp and his students took Georgia Tech’s PR2 down to the CNN studios last week for a live demo! They showed off some RFID-assisted manipulation: the robot autonomously drove up and delivered a pill bottle to the newscaster. Their demo set up some comments from Willow Garage about the future of personal robotics, where robots are going to take over our repetitive tasks to free up our time for creative human endeavors. When asked when the PR2 or other such robots are going to be affordable for everyday folks, Keenan Wyrobek said it’s not 20 years out, but still a couple years away.
The electrical signals inside Lyric’s chips represent probabilities, instead of 1s and 0s. While the transistors of conventional chips are arranged into components called digital NAND gates, which can be used to implement all possible digital logic functions, those in a probability processor make building blocks known as Bayesian NAND gates. … Whereas a conventional NAND gate outputs a "1" if neither of its inputs match, the output of a Bayesian NAND gate represents the odds that the two input probabilities match. This makes it possible to perform calculations that use probabilities as their input and output.
Sounds like their initial impact will be in the flash memory market, making error-checking faster and more efficient. But I can definitely see how this kind of hardware could have a major impact in Machine Learning and Robotics. Most of statistical machine learning has its roots in the kind of math that this processor is designed for. This could make reasoning about larger (more real-world) problems increasingly feasible.
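To make the idea concrete, here’s a minimal sketch (my own illustration, not Lyric’s actual hardware or API) of the kind of "soft logic" such a chip would evaluate natively: each wire carries P(bit = 1) instead of a bit, and a gate combines whole probabilities. The soft parity check below is the core operation in soft-decision error correction of the sort used for flash memory.

```python
def soft_xor(p1, p2):
    """Probability that the XOR of two independent bits is 1,
    given each bit's probability of being 1."""
    return p1 * (1 - p2) + p2 * (1 - p1)

def soft_parity(probs):
    """Probability that a block of independent noisy bits has odd
    parity -- the building block of soft-decision error checking
    (e.g. LDPC decoding in flash memory controllers)."""
    p = 0.0  # an empty block has even parity with certainty
    for q in probs:
        p = soft_xor(p, q)
    return p

# Three noisy bit reads, each with its own confidence of being 1:
print(soft_parity([0.9, 0.8, 0.1]))  # 0.308
```

With certain inputs (probabilities of exactly 0 or 1) this reduces to ordinary digital logic; the interesting part is everything in between, which is exactly the arithmetic that belief propagation and most statistical machine learning lean on.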
Paro, an emotional companion in the nursing home, and Autom, a personal weight-loss coach, are the most prominently featured examples in this article, which explores the touchy subject of robot companions and what kinds of relationships we might let robots hold in our lives.
“Shimon is an autonomous marimba-playing robot designed to create interactions with human players that lead to novel musical outcomes. The robot combines music perception, interaction, and improvisation with the capacity to produce melodic and harmonic acoustic responses through choreographic gestures. We developed an anticipatory action framework, and a gesture-based behavior system, allowing the robot to play improvised Jazz with humans in synchrony, fluently, and without delay. In addition, we built an expressive non-humanoid head for musical social communication.”
The new head is particularly nice: a six-degree-of-freedom addition that aesthetically matches the industrial-looking mallet arms. A camera in the head lets Shimon perceive the other band members, and it times its head bobs to the music. As they say, the purpose of the head is to create the social responsiveness and reaction that the other humans in the jazz group expect from a teammate.
And I’m not the only one who thinks they did a nice job on the social cues; the local Atlanta news anchors liked it too (though they did list the article under the heading “bizarre”).
Update: This work just received a best paper award at ICRA 2010, congrats to Guy and Gil!
This recent article in the New York Times looks at the complicated work of service dogs and argues that there is more to dog intelligence than perhaps previously assumed. This reminds me of one of my favorite classes that I took as a graduate student at the MIT Media Lab. It was called “Cognitive Dog” and was taught by Bruce Blumberg (who is now at Blue Fang games, and teaches a version of this class at Harvard).
The premise of the class (and Bruce’s AI research) is that for “socially intelligent machines” perhaps we shouldn’t really be shooting for human-level intelligence; instead, what would it take to get dog-level intelligence? Dogs are interesting because they are so expertly capable of social interaction with humans: reading social and emotional cues from humans, learning skills/tasks from humans, and working collaboratively with humans. Sounds like everything I want a service robot to do!
I continue to find this idea of dog-level social intelligence inspirational from an HRI perspective because it forces you to admit that the problem is not about speech, language, or a common morphology for doing similar actions. Dogs don’t do any of these things, yet they accomplish so much in collaboration with humans. I think Dr. Blumberg is right, I want a robot that is as smart as my dog (…see fig. 1).
Their survey asked people about their willingness to have a robot in their home doing a variety of tasks, and older adults were more likely to say that tasks like “Warn about a danger in my home” and “Inform my doctor if I have a medical emergency” were important.
This goes against the common assumption that older adults are late adopters of technology. It is possible that the ability to live independently, rather than move to an assisted living center, will be a motivating factor strong enough to make older adults the early adopters of social robots in the home.