[sldev] Body motion and facial expression tracking, Microsoft did it

Jan Ciger jan.ciger at gmail.com
Fri Jun 5 08:39:40 PDT 2009



Argent Stonecutter wrote:

> That's why you don't try and solve it computationally. You don't  
> replace normal animation, you use this for minor adjustments to the  
> existing animation, and you limit the strength of the adjustment to  
> small angles and specific joints.

Ah, really? Did you actually try this? This idea has been around for a
while; I have former colleagues who built their PhDs on this type of
motion re-targeting using IK.

How, specifically, do you decide which joints are relevant for a given
case? And which angles are "small enough" before the keyframe animation
stops working?
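
To be concrete, the version of this idea I can imagine looks roughly like
the sketch below - take whatever correction the IK solver proposes per
joint, throw it away for joints that are not on a whitelist, and clamp
the rest to a small per-joint budget. The joint names, the budgets and
the whole structure are made up for illustration; this is not code from
any existing viewer:

// Minimal sketch (C++17): keep the keyframe pose dominant by only letting
// IK nudge a few whitelisted joints, and only by a few degrees.
#include <algorithm>
#include <iostream>
#include <map>
#include <string>

const double kDegToRad = 3.14159265358979323846 / 180.0;

struct JointDelta {
    std::string joint;
    double radians;   // correction proposed by the IK solver
};

// Per-joint correction budget; joints not listed get no correction at all.
const std::map<std::string, double> kMaxCorrection = {
    {"elbow_right",    10.0 * kDegToRad},
    {"shoulder_right",  5.0 * kDegToRad},
};

double clampCorrection(const JointDelta& d) {
    auto it = kMaxCorrection.find(d.joint);
    if (it == kMaxCorrection.end())
        return 0.0;                        // not whitelisted: ignore the IK output
    return std::clamp(d.radians, -it->second, it->second);
}

int main() {
    JointDelta deltas[] = {
        {"elbow_right",    0.30},   // ~17 deg, clamped down to 10 deg
        {"waist",          0.40},   // not whitelisted, dropped entirely
        {"shoulder_right", 0.05},   // within budget, passes through
    };
    for (const auto& d : deltas)
        std::cout << d.joint << " -> " << clampCorrection(d) << " rad\n";
    return 0;
}

Even in this trivial form somebody has to fill in that table for every
animation and every use case, which is exactly the part I don't see
working without an animator in the loop.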

>> E.g. in one case I have seen the solver keep the hands next to the
>> avatar's waist but stick the waist forward to reach a goal.
> 
> Wouldn't happen, unless the person selected the waist as the joint  
> that would move, and unless the waist was already close to the goal.

And what kind of interface would you like to give the user? A skeleton
of joints to pick from?

> 
>> IK is a nice tool, but extremely hard to use unless you have an  
>> animator
>> guiding it.
> 
> Which is the point.

Then I am a bit confused - I thought we were talking about real-time
automatic animation, not about producing better keyframe animation to be
played back later (which is of course doable).

Regards,

Jan



