Task-Oriented Active Perception and Motion Planning for Manipulating Piles of Stuff

Project Abstract/Statement of Work:

We envision domestic assistance robots capable of performing many practical tasks, such as cooking, cleaning, and laundry, for elderly or disabled people. To be effective at these kinds of tasks, a robot must be able to perceive and manipulate various types of objects in dense clutter. These objects vary in size, from small cups to large chairs; they vary in structure, from rigid utensils to articulated household gadgets like can openers, and even fully deformable objects like clothing and blankets; they may be composed, such as stacks of cups; and they may commingle, such as a collection of plates, glasses, and utensils on a tray. The standard perceive-then-act approach, which registers CAD models of objects to sensor data before planning manipulation, is impractical in such densely cluttered scenarios: some objects may be unknown, some can change shape, and many will be partially or fully occluded. We instead propose to solve the problem of manipulating heterogeneous objects in dense clutter through task-oriented active perception and manipulation. The key insight of our approach is that task-oriented active perception allows us to probe only the task-relevant parts or properties of the environment, avoiding the complexity of fully segmenting and registering every object.

PI and Co-PI: