[sldev] Body motion and facial expression tracking, Microsoft did it
jan.ciger at gmail.com
Fri Jun 5 08:39:40 PDT 2009
Argent Stonecutter wrote:
> That's why you don't try and solve it computationally. You don't
> replace normal animation, you use this for minor adjustments to the
> existing animation, and you limit the strength of the adjustment to
> small angles and specific joints.
Ah, really? Did you actually try this? This idea has been around for a
while; I actually have former colleagues who built their PhDs on this
type of motion retargeting using IK.
How, specifically, do you decide which joints are relevant for a given
case? And which angles are "small enough" before the keyframe animation
no longer looks right?
>> E.g. in one case I have seen the solver keep the hands next to the
>> avatar's waist but thrust the waist forward to reach a goal.
> Wouldn't happen, unless the person selected the waist as the joint
> that would move, and unless the waist was already close to the goal.
And what kind of interface would you give the user? A skeleton of
joints to pick from?
>> IK is a nice tool, but extremely hard to use unless you have an
>> animator guiding it.
> Which is the point.
Then I am a bit confused - I thought we were talking about real-time
automatic animation, not something to produce better keyframe animations
to be played back later (which is of course doable).
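For what it's worth, the "limit the strength of the adjustment" idea you describe could be sketched roughly like this. To be clear, this is my own illustration, not anything the viewer actually does: the joint names, the 10-degree cap, and the single-angle-per-joint simplification are all placeholders.

```python
import math

# Sketch: the IK solver only *adjusts* the keyframed pose, and the
# adjustment is clamped to small angles on an explicit joint whitelist.
MAX_ADJUST_RAD = math.radians(10.0)          # the "small angle" cap (assumption)
ADJUSTABLE_JOINTS = {"neck", "r_shoulder", "r_elbow"}  # illustrative whitelist

def blend_pose(keyframe_pose, ik_pose):
    """Return the keyframe pose plus a clamped IK correction per joint.

    Both poses map joint name -> angle in radians (one DOF per joint,
    purely for the sake of the example).
    """
    out = {}
    for joint, kf_angle in keyframe_pose.items():
        if joint in ADJUSTABLE_JOINTS:
            delta = ik_pose[joint] - kf_angle
            # limit the strength of the adjustment
            delta = max(-MAX_ADJUST_RAD, min(MAX_ADJUST_RAD, delta))
            out[joint] = kf_angle + delta
        else:
            # joints off the whitelist keep the keyframed angle untouched,
            # so the solver can never stick the waist out to reach a goal
            out[joint] = kf_angle
    return out
```

My point stands, though: someone still has to choose the whitelist and the cap per animation, which is exactly the "animator in the loop" that real-time automatic use doesn't have.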