Martin de Lasa:
email@example.com | www.dgp.toronto.edu/~mdelasa/
Martin was the recipient of the 2011 CAIAC Best Dissertation Award. Below is the abstract from his thesis, Feature-Based Control of Physics-Based Character Animation, completed at the Computer Science Department of the University of Toronto under the supervision of Aaron Hertzmann. You may also download a PDF copy of his full thesis.
As children we learn to move and interact with the world through trial and error.
Over time, clumsy motions become refined, until we are able to move with agility
and robustness. Designing algorithms and representations for motor control that
allow machines and synthetic characters to mimic this process is a long-standing
scientific challenge shared across many disciplines.
Unfortunately, current learning approaches and representations are ill-suited to the
continuous, high-dimensional spaces required for motor learning. At the same time,
a scientific understanding of the principles of locomotion control remains elusive,
even for ordinary tasks such as walking.
This thesis describes an approach to locomotion in which control is expressed in
terms of features. Each feature describes a high-level, physically-relevant property
of character state, such as center-of-mass, angular momentum, or end-effector
position. The key questions we address are:
What high-level features can be used to create locomotion?
How should these features be controlled?
How can we use the resulting representation for action selection/planning?
We show that locomotion control can be expressed in terms of a small, intuitive set
of features. Control of multiple features is coordinated by a novel optimization
scheme that solves the most important features first, then solves less important ones.
The resulting control has a number of benefits: human-like qualities such as arm-swing,
heel-off, and hip-shoulder counter-rotation emerge automatically; control is robust
to changes in body parameters and may be mapped onto entirely new bipeds of
different stature and weight; and a single controller is able to generate a continuum of
gaits, including walking, jumping, jogging, and transitions, while following high-level
user commands over challenging terrain at interactive rates. The method uses no
motion capture data or offline learning.
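The idea of solving important features first, then less important ones only where they do not interfere, can be illustrated with a generic sketch of prioritized (null-space projected) least squares. This is an illustrative assumption, not the thesis' actual formulation: the function name, the representation of each feature as a linear task (A, b), and the pseudoinverse-based solver are all choices made here for the sketch.

```python
import numpy as np

def prioritized_solve(tasks, n):
    """Solve a stack of linear tasks A_i x = b_i in strict priority order.

    Each lower-priority task is solved only within the null space of all
    higher-priority tasks, so it cannot disturb what was already achieved.
    (Illustrative sketch of lexicographic least squares, not the thesis'
    exact optimization scheme.)
    """
    x = np.zeros(n)
    N = np.eye(n)  # null-space projector of the tasks solved so far
    for A, b in tasks:
        A = np.atleast_2d(A)
        b = np.atleast_1d(b)
        AN = A @ N
        # Least-squares correction restricted to the remaining null space
        dz = np.linalg.pinv(AN) @ (b - A @ x)
        x = x + N @ dz
        # Shrink the null space so later tasks preserve this one
        N = N @ (np.eye(n) - np.linalg.pinv(AN) @ AN)
    return x

# Two tasks in R^3: the first pins x[0] = 1; the second asks for
# x[0] + x[1] = 0 and can only be met by adjusting x[1].
tasks = [
    (np.array([[1.0, 0.0, 0.0]]), np.array([1.0])),
    (np.array([[1.0, 1.0, 0.0]]), np.array([0.0])),
]
x = prioritized_solve(tasks, 3)  # → [1, -1, 0]
```

The higher-priority task is satisfied exactly, and the lower-priority one is met as well as the remaining null space allows, which mirrors the "most important first" ordering described above.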
Feature-based control has many potential applications. Flexible and controllable
characters that respect physical constraints would be of enormous value for
animation in interactive environments, such as urban simulations and video games.
This approach could also form the basis of bipedal robots or intelligent prostheses
that move with the dynamic capabilities of humans and help us understand the
biomechanics and motor control of human movement.