Animated human characters in everyday scenarios must interact with the
environment using their hands. Captured human motion can provide a
database of realistic examples. However, examples involving contact
are difficult to edit and retarget; realism can suffer when a grasp
does not appear secure or when an apparent impact does not disturb the
hand or the object. Physically based simulations can preserve
plausibility by simulating interaction forces. However, such
physical models must be driven by a controller, and creating effective
controllers for new motion tasks remains a challenge. In this paper,
we present a controller for physically based grasping that draws from
motion capture data. Our controller explicitly includes passive and
active components to achieve compliant yet controllable motion, and it
compensates for arm movement and for gravity so that the behavior of
the passive and active components is less dependent on the dynamics of
arm motion. Given a set of motion capture grasp examples,
our system automatically determines all but a small set of this
controller's parameters. We demonstrate results for tasks including
grasping and two-hand interaction and show that a controller derived
from a single motion capture example can be used to form grasps of
different object geometries.
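
For concreteness, the split between passive and active components with feedforward compensation can be read as a per-joint torque law of roughly the following form. This is only a minimal sketch in Python/NumPy under assumptions of our own: the function name, the gain parameters, and the precomputed `gravity_comp` and `arm_comp` feedforward terms are illustrative placeholders, not the controller's actual implementation.

```python
import numpy as np

def joint_torques(q, q_dot, q_set, k_p, k_d,
                  k_passive, d_passive, q_rest,
                  gravity_comp, arm_comp):
    """Hypothetical per-joint hand torque law: an active servo plus a
    passive compliance term, with feedforward compensation for gravity
    and arm motion. All arguments are arrays of length n_joints."""
    # Active component: PD servo toward a setpoint drawn from the
    # motion capture grasp example.
    tau_active = k_p * (q_set - q) - k_d * q_dot
    # Passive component: spring/damper about a rest pose, providing
    # compliance when the hand contacts an object.
    tau_passive = k_passive * (q_rest - q) - d_passive * q_dot
    # Feedforward terms (assumed precomputed) that cancel gravity and
    # arm-motion effects, so the passive/active behavior depends less
    # on the dynamics of the moving arm.
    return tau_active + tau_passive + gravity_comp + arm_comp
```

Keeping the passive spring/damper separate from the active servo, as in this sketch, is one way to let the hand deform compliantly on contact while the active term still drives it toward the captured grasp pose.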