Degree Name

Master of Science


Department

Electrical and Computer Engineering

First Advisor

Alexander Stoytchev


Many tasks that humans perform on a daily basis require the use of a container. For example, toolboxes are used to store tools, carafes are used to serve beverages, and hampers are used to collect dirty clothes. One long-term goal for the field of robotics is to create robots that can help people perform similar tasks. Yet robots currently lack the ability to detect and use most containers. For a robot to gain these capabilities, it must first form an object category for containers.

This thesis describes a computational framework for learning a behavior-grounded object category for containers. The framework was motivated by the developmental progression of container learning in humans. The robot learns the category representation by interacting with objects and observing the resulting outcomes. It also learns a visual model for containers using the category labels from its behavior-grounded object category. This allows the robot to identify the category of a novel object using either interaction or passive observation.

This thesis makes two main contributions. The first is the new behavior-grounded computational framework for learning object categories. The second is that the visual model of an object category is acquired in the last step of this learning framework, after the robot has interacted with the objects. This is contrary to traditional approaches to object category learning, in which the visual model is learned first, before the robot has ever touched the object. Because the visual model is learned last, the robot can ascribe to a novel object the functional properties of its visually identified object category.


Copyright Owner

Shane David Griffith



File Size

117 pages