So, Where’s My Robot?

Thoughts on Social Machine Learning

Mr President, Let’s not call it AI

I had a unique opportunity yesterday: I was invited to participate in a workshop of PCAST (the President’s Council of Advisors on Science and Technology). The theme of the meeting was Bio/Info/Nano tech: what exciting opportunities are happening in these fields that will create jobs in the US, and what the government can do to spur innovation. I’ll probably have a couple of SWMR posts about the discussion, and thought I’d start off with one about my own contribution, since I was the only roboticist in the room.

They had a wide range of discussants. Several of us were early-career researchers, who I think were invited to share our “what’s new and exciting that’s going to create jobs” point of view. Another contingent of the group were more seasoned researchers and entrepreneurs, who had a from-the-trenches perspective on how the government’s support of basic research has changed over the years.

Each discussant had the opportunity to make a three-minute introductory statement to the council. Here’s a recap of what I said:

The technology opportunity I decided to highlight is service robotics, because these robots have the potential to dramatically impact such a diverse set of societal needs. Robots that are capable of working alongside people will revolutionize workplaces, for example in manufacturing.

Robotics represents perhaps our best opportunity to achieve higher levels of domestic manufacturing agility and overall productivity needed to retain high-value manufacturing jobs in the U.S., provided that the current state of the technology can be significantly advanced.

Today’s industrial robots lack the capabilities required to do more than just blindly execute pre-programmed instructions in structured environments.   This makes them expensive to deploy and unsafe for people to work alongside.

There is an opportunity to usher in a new era of agile and innovative manufacturing by developing service robots as co-workers in the manufacturing domain. These capable assistants would work safely in collaboration with, and in close proximity to, highly skilled workers. For example, they could provide logistical support: automatically fetching parts, packing and unpacking, loading, stacking boxes, emptying bins, and detecting and cleaning spills.

Very similar logistical robotic support could help streamline the operation of hospitals, driving healthcare costs down.

In order to realize this vision, we need to move beyond robots that operate only in relatively static, structured environments. This presents several research challenges, and I think the following three are the most critical to progress.

– Robots need advances in sensing and perception technology that allow them to keep track of a dynamically changing workplace.

– Manipulation is a key challenge as well: robots need the flexibility to pick up and use objects in the environment without tedious pre-programming of specialized skills.

– Finally, an important challenge in bringing these robots to fruition is advancing human-robot interaction. We need these robots to work safely and efficiently in collaboration with human workers. People can’t just be seen as obstacles for the robot to navigate around; the robot needs to reason about and understand people as interaction partners (a distinction sketched in the example after this list).
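To make that last distinction concrete, here’s a minimal sketch of the two views inside a navigation cost function. This is illustrative Python with hypothetical function names and constants, not code from any real planner:

```python
import math

def obstacle_cost(robot_xy, person_xy, radius=0.5):
    """Person as obstacle: infinite cost inside a collision radius,
    zero everywhere else. The person is just geometry to avoid."""
    dist = math.dist(robot_xy, person_xy)
    return float("inf") if dist < radius else 0.0

def social_cost(robot_xy, person_xy, person_heading, radius=0.5):
    """Person as interaction partner (hypothetical cost): besides
    collision avoidance, penalize intruding on personal space and
    cutting through the zone the person is facing."""
    dist = math.dist(robot_xy, person_xy)
    if dist < radius:
        return float("inf")
    # Proxemics term: discomfort decays smoothly with distance.
    proxemics = math.exp(-dist / 1.2)
    # Frontal term: extra penalty when the robot sits in the cone
    # the person faces, since that space carries social meaning.
    angle_to_robot = math.atan2(robot_xy[1] - person_xy[1],
                                robot_xy[0] - person_xy[0])
    frontal = max(0.0, math.cos(angle_to_robot - person_heading))
    return proxemics * (1.0 + frontal)
```

A planner minimizing the second cost keeps a comfortable distance and avoids crossing right in front of someone, instead of merely not hitting them.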

Recently, over 140 robotics experts from across the country have come together to articulate a national robotics initiative: a robotics research roadmap. This roadmap lays out the target areas where we think robotics research efforts need to be supported in order to bring about robot technology that will have the biggest impact on our economy and our society.

The comment I got from one of the council members was interesting. She said (I’m paraphrasing), “Aren’t you leaving out the challenge of the Sentience or AI needed?” I only had time for a short answer, and said something to the effect that yes, I think the notion of AI cuts across all three of the areas I mentioned, but particularly human-robot interaction. In order for a robot to work side-by-side with a human partner, it will need human-compatible intelligence capabilities.

But here on SWMR, I’ll give the longer answer… that no, I don’t think we need AI for service robots. Or at least, that’s not what we should call it. Yes, perception and manipulation and HRI and autonomy in general all fit under the big umbrella term of AI. But the term AI is so vague, and it makes people think of science fiction, which in turn makes robots in society feel like some pipe dream far in the future. So, particularly in settings like PCAST where people want to hear about concrete objectives and job creation, it does our field no good to just lump everything under the term AI.

If instead we talk about the specific intelligence challenges, suddenly it all seems much more achievable, and you can imagine some semi-autonomous form of service robot being deployed in the not-so-distant future. We see that, hey, sensing technology is getting better and better, and with all the academic and industrial partners working on the manipulation problem, that seems achievable too. And in terms of AI for human-robot interaction, yes, we need to make some significant advances in computational models of social intelligence before robots can truly interact with people in unstructured environments. But do we need to solve AI? I don’t think so.

June 23rd, 2010 Posted by | Conferences, HRI, Industry | no comments

Postdoc Position: Interactive Robot Learning

The Socially Intelligent Machines Lab at Georgia Tech is looking for a postdoc.

The work will focus on HRI for robots that learn interactively from human teachers. This will involve algorithm development, robot control implementation, and system evaluation with human subjects. The experience will include working with undergraduate, MS and PhD students, and with interdisciplinary faculty collaborators.

Applicants should have a Ph.D. in Computer Science or a field clearly related to the research area above.

Qualified applicants should provide the following materials:

  • Cover letter briefly describing your background (including PhD institution, dissertation title, and abstract) and career plans
  • Date of availability to start the postdoc
  • CV
  • Names and contact information for at least three references, including the PhD advisor
  • Link to a research web site

These documents should be submitted as a single PDF to Prof. Andrea L. Thomaz with the email subject line “Postdoc Candidate”.

The position is guaranteed for a year from the start date, with a possible extension for a second year. The position has been open since June 4; we are currently reviewing applications and hope to fill it as soon as possible.

June 15th, 2010 Posted by | Announcements, GT Lab Updates | no comments

Telepresence Robots

There’s been a recent flurry of announcements of telepresence robots. QB is now available from the CA-based Anybots (shipping Fall 2010). Texai is the Willow Garage platform (not yet for sale). And Vgo, from startup Vgo Communications, was recently covered in the Boston Globe.

Some visions of these platforms include telecommuters logging in to interact with their co-workers at the office, attending a meeting with co-workers in another city, or checking out a situation at the factory from the comfort of an office in another location. While you’re probably not going to buy one for personal use (QB has a $15K price tag, and Vgo is about $6K), you might just start to see them roaming around your office.

Vgo and Texai look more like laptops on wheels, but QB has a more anthropomorphic presence that I think is nice. Its adjustable height lets it get fairly tall, which is necessary for the telecommuter to get any reasonable view of the remote location, and allows it to participate in standing or sitting conversations. Yet it is small, skinny, and light, which makes it safe and easy to operate in close proximity to people. You drive it with the arrow keys on your keyboard to turn left or right and move forward or back.
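As an aside, driving a base this way is typically just a thin mapping from key events to velocity commands. Here’s a minimal sketch in Python; the key names, velocity values, and send_velocity callback are assumptions for illustration, not Anybots’ actual interface:

```python
# Hypothetical arrow-key teleop mapping; none of this is QB's real API.
KEY_TO_VELOCITY = {
    "up":    (0.3, 0.0),    # drive forward (m/s), no rotation
    "down":  (-0.3, 0.0),   # back up
    "left":  (0.0, 0.5),    # rotate left (rad/s)
    "right": (0.0, -0.5),   # rotate right
}

def on_key(key, send_velocity):
    """Translate one key event into a (linear, angular) velocity
    command for the base; any other key stops the robot."""
    linear, angular = KEY_TO_VELOCITY.get(key, (0.0, 0.0))
    send_velocity(linear, angular)
```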

One thing I think is missing from all of these telepresence robots, but particularly from an anthropomorphic one like QB, is a neck tilt. For utility purposes, I imagine it would be nice to scan up and down with the camera. But also for social purposes, it would be nice for the person to trigger head nods (and shakes, if we could add another DOF). If this robot is going to be standing in for people in conversations, it could be awkward if it doesn’t display any backchannel communication. People will get used to the robot not doing it, but its presence in the conversation would be much stronger with a backchannel. These are the little “yeahs” and “uh-huhs” that people mutter to let the speaker know “I’m with you, keep going.” Much of this happens over the speech channel, which QB will capture, but we also use body language to communicate this backchannel info to the speaker. Allowing QB to nod would let the telecommuter give both physical and verbal backchannel to the person they are speaking with.
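If a platform did have that tilt DOF, wiring a nod trigger into the teleop interface would be simple. Here’s a sketch under the same caveats as above, where set_neck_tilt stands in for whatever joint command such a robot might expose:

```python
import time

def nod(set_neck_tilt, amplitude=0.25, cycles=2, period=0.4):
    """Hypothetical backchannel nod: oscillate an assumed neck-tilt
    joint (radians) a couple of times, then return to neutral.
    No current telepresence robot exposes this command."""
    for _ in range(cycles):
        set_neck_tilt(-amplitude)   # dip the head forward
        time.sleep(period / 2)
        set_neck_tilt(0.0)          # return to neutral
        time.sleep(period / 2)
```

Bound to a single key, this would let the telecommuter fire off a quick nonverbal “uh-huh” without touching the audio channel.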

I’m excited to see how people start using their Anybots and Vgos; the water cooler will never be the same! Here’s a video of QB in action.

June 9th, 2010 Posted by | HRI, Industry | 3 comments

PR2 Graduation + GT Spotlight

Nice to see our GT team highlighted on the Willow Garage page today:

Robots like the PR2 may be able to help older adults stay in their homes longer with a high quality of life. The Georgia Tech team aims to make progress towards this long-held dream.  Rather than try to guess what seniors want, the team will work with older adults to better understand their needs and how robots can help. The team will also write code to make the PR2 perform helpful tasks at home. By working closely with seniors throughout the research process, the team hopes to better meet real needs and accelerate progress. To make everything more realistic, the robot will spend some of its time in a real, two-story house on the Georgia Tech campus, called the Aware Home.

They will be doing a spotlight for each of the eleven PR2s heading off to research labs this summer. The robots were sent off with quite the fanfare; this video (via IEEE Spectrum) captures the “graduation” event nicely, including a brief interview with someone from each team. The group as a whole is tackling a wide variety of personal robotics challenge problems!

June 7th, 2010 Posted by | GT Lab Updates, HRI, Industry | no comments