So, Where’s My Robot?

Thoughts on Social Machine Learning

Junior at AAAI

My lab participated in the Robot Workshop and Exhibition at AAAI this year. We presented a project that Maya Cakmak is working on with this little Bioloid robot we call Junior.

The project is about affordance learning: learning about the effects of your actions in the world. The system gathers a training dataset by playing with objects, collecting examples of the form (perceptual context, action performed, observed effects). It then trains SVM classifiers and is able to predict effects for given action-context pairs.
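To make the idea concrete, here is a minimal sketch of that pipeline in Python with scikit-learn. Everything specific in it is hypothetical: the features (roundness, size), the actions (poke, push), and the effect labels are illustrative stand-ins, not the representations Junior actually uses.

```python
# Hypothetical sketch of affordance learning: collect
# (perceptual context, action, observed effect) examples,
# then train an SVM to predict the effect of an action in a context.
# Features, actions, and labels are illustrative, not from the project.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Perceptual context: object roundness (0 = boxy, 1 = spherical) and size.
n = 200
roundness = rng.random(n)
size = rng.random(n)
# Action performed: 0 = poke, 1 = push.
action = rng.integers(0, 2, n)

# Each training example is a (context, action) pair.
X = np.column_stack([roundness, size, action])

# Toy ground truth for the observed effect: round objects
# roll when pushed (label 1, "rollable"); otherwise they just move (label 0).
rolled = (roundness > 0.6) & (action == 1)
y = np.where(rolled, 1, 0)

# Fit an SVM classifier on the play data.
clf = SVC(kernel="rbf").fit(X, y)

# Predict the effect for a new context-action pair:
# a round, medium-sized object being pushed.
print(clf.predict([[0.9, 0.5, 1]])[0])
```

In the real system each affordance (rollable, liftable, moveable) would get its own classifier over the robot's actual perceptual features; the point here is just the shape of the data and the prediction step.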

We are interested in what changes when the robot explores an environment by itself versus when a human teacher helps it explore. Our intuition is that a teacher should make the process faster and more efficient, and we just finished an experiment that looks in detail at exactly what differs between self and social learning. In the experiment, people helped Junior by placing objects in the workspace for him to play with, helping him learn affordances like "rollable," "liftable," and "moveable."

We're still analyzing and writing up the results, but we can already say that both types of data sets yield reasonable classifiers. The social data sets differ in some interesting ways, such as having a much higher proportion of positive examples. Junior was also able to consistently get people to help at just the right time, by using a gaze gesture when it couldn't quite reach an object.

Details to come, but you can read more about the project here.

July 22nd, 2008 Posted by | Conferences, HRI, Machine Learning | no comments