Real-time Physics-based 3D Biped Character

Animation Using an Inverted Pendulum Model

 

 

Yao-Yang Tsai1, Wen-Chieh Lin2, Kuang-You Cheng1

Jehee Lee3, Tong-Yee Lee1*

 

1National Cheng-Kung University, Tainan, Taiwan

2National Chiao-Tung University, Hsinchu, Taiwan

3Seoul National University, Seoul, Korea

*Corresponding Author

 

 

Real-time physics-based 3D character animation generated by our framework.

 

Abstract

We present a physics-based approach to generating 3D biped character animation that reacts to dynamic environments in real time. Our approach uses an inverted pendulum model to adjust, online, the desired motion trajectory derived from the input motion capture data. This online adjustment produces a physically plausible motion trajectory adapted to the dynamic environment, which the motion controllers then track in dynamics simulation. Rather than using proportional-derivative (PD) controllers, whose parameters are usually difficult to tune, our motion tracking adopts a velocity-driven method that computes joint torques from the desired joint angular velocities. Physically correct full-body motion of the 3D character is then computed in dynamics simulation using these torques and the dynamical model of the character. Our experiments demonstrate that motion capture data can be tracked while the character responds to its environment in real time. In addition, physically plausible motion style editing, automatic motion transition, and motion adaptation to different limb sizes can be generated without difficulty.
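
As a rough illustration of the two ingredients described in the abstract, the Python sketch below pairs one integration step of a point-mass inverted pendulum with a velocity-driven torque rule. This is a minimal sketch under assumptions of our own (a planar pendulum, explicit Euler integration, a single scalar gain k_v, and a finite-difference estimate of the desired joint angular velocity); the paper's actual formulation, gains, and integrator may differ.

    import numpy as np

    G = 9.81        # gravity (m/s^2)
    DT = 1.0 / 600  # simulation time step (s); illustrative value

    def ipm_step(theta, theta_dot, com_dist, dt=DT):
        # One explicit-Euler step of a planar point-mass inverted pendulum.
        # theta is the angle from vertical of the stance-point-to-COM line;
        # com_dist is the distance from the stance point to the COM.
        theta_ddot = (G / com_dist) * np.sin(theta)
        theta_dot += theta_ddot * dt
        return theta + theta_dot * dt, theta_dot

    def velocity_driven_torque(q_desired, q_current, omega_current, k_v, dt=DT):
        # Velocity-driven tracking: estimate the desired joint angular
        # velocity by finite differences along the adjusted trajectory,
        # then apply a torque proportional to the velocity error. Unlike
        # a PD controller, only the single gain k_v must be set.
        omega_desired = (q_desired - q_current) / dt
        return k_v * (omega_desired - omega_current)

In a full pipeline, the pendulum state would presumably be fitted to the character's COM in each support phase and used to displace the captured trajectory before the resulting torques drive the dynamics simulation.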

 

Paper

IEEE_TVCG_IPM.pdf

Videos

Main_Video_Demo

 

Side-by_side_comparison

 

Examples

 

         

Example 1: A character twists its upper body to pass through a narrow walkway. In this example, the input motion capture data is a straight-ahead walk; our method modifies the motion style by simply adjusting the orientation of the torso.
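
A style edit of this kind amounts to overwriting part of the desired pose before it is handed to the tracking controller. The snippet below is a hypothetical illustration (the flat key-value pose representation and the function name are ours, not the paper's):

    def edit_torso(desired_pose, yaw_offset_rad):
        # Rotate only the torso target; the tracking controller and the
        # dynamics simulation reconcile the rest of the body.
        edited = dict(desired_pose)
        edited["torso_yaw"] += yaw_offset_rad
        return edited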

 

 

Example 2: Snapshots of a character walking while stooping under a barrier. This example is generated by bending the back and lowering the center of mass (COM) of the character in motion capture data of normal walking. Note that the squatting motion of the lower body is generated automatically by our method.
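
The automatic squat can be understood from leg geometry alone: once the hip (and thus the COM) target is lowered while the foot stays planted, the knee must flex by an amount fixed by the law of cosines. The planar two-link sketch below is our own illustration; the segment lengths and function name are assumptions, not the paper's model.

    import numpy as np

    def knee_flexion_for_hip_height(hip_height, l_thigh=0.45, l_shank=0.45):
        # Planar two-link leg with the ankle planted: the law of cosines
        # gives the interior knee angle for a hip-to-ankle distance d, and
        # flexion is its complement to pi (0 rad = fully straight leg).
        d = np.clip(hip_height, 1e-6, l_thigh + l_shank)
        cos_interior = (l_thigh**2 + l_shank**2 - d**2) / (2 * l_thigh * l_shank)
        return np.pi - np.arccos(np.clip(cos_interior, -1.0, 1.0))

    # Lowering the hip from full leg extension (0.9 m) to 0.7 m forces
    # about 1.36 rad of knee flexion.
    print(knee_flexion_for_hip_height(0.9), knee_flexion_for_hip_height(0.7))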

 

 

 

 

 

Status

To appear in IEEE Transactions on Visualization and Computer Graphics, 2009 or 2010.

Acknowledgments

The authors would like to thank the anonymous reviewers for their helpful comments, which improved this paper. We are also grateful to KangKang Yin and Michiel van de Panne for helping us perform the experimental study with their work [6]. This work is supported in part by the Landmark Program of the NCKU Top University Project (Contract B0008) and by the National Science Council (Grants NSC-97-2628-E-006-125-MY3, NSC-96-2628-E-006-200-MY3, NSC-96-2221-E-006-244-MY2, and NSC-96-2221-E-009-152-MY3), Taiwan, Republic of China.