Facial motion capture

For live:

  • Performance → Live → GameEngine

For Postproduction:

  • Performance → Recording Deck → Analyzer → parametric data → Retargeter (plug-in for Maya, MotionBuilder)

Rigging best practices for Faceware

For real-time rigging with Live, the best bet is to start with the basic shapes that Live streams. You can find a list of them here: creating-characters-for-live. The shapes at the top of the list are the most important, so really dial those in and make sure you're happy with them. Once you are, you can add the secondary shapes if you want.
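As a minimal sketch of how streamed shape values might be applied to a rig, the snippet below smooths incoming per-frame blendshape weights before they drive the character. The shape names, the 0..1 weight range, and the smoothing factor are illustrative assumptions, not Faceware's actual stream format:

```python
# Sketch: exponential smoothing of incoming blendshape weights to reduce
# jitter in a live stream. Names and value ranges are assumptions.

def smooth_shapes(previous, incoming, alpha=0.5):
    """Blend each incoming weight with the previous frame's weight."""
    smoothed = {}
    for name, value in incoming.items():
        prev = previous.get(name, 0.0)
        # Clamp to the assumed 0..1 weight range, then blend toward it.
        value = max(0.0, min(1.0, value))
        smoothed[name] = prev + alpha * (value - prev)
    return smoothed

frame0 = {"jawOpen": 0.0, "smile": 0.0}
frame1 = {"jawOpen": 0.8, "smile": 0.4}
print(smooth_shapes(frame0, frame1, alpha=0.5))
# {'jawOpen': 0.4, 'smile': 0.2}
```

In practice the smoothed weights would be written to the rig's blendshape channels each frame; a higher `alpha` tracks the performance more tightly, a lower one gives smoother but laggier motion.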

Second, use some of our example character models as guidelines for what you should aim for with the characters you're planning to build. Feel free to grab these and explore/deconstruct them.

Sample character models


Finally, the best way to learn the software is to download a free trial from the Faceware website and use its dedicated knowledge base of software guides, tutorial videos, and training assets to get up and running quickly. I'd encourage you and any of your students or faculty members to request a trial when you're ready.

Free trial




Faceware Tech: Faceware Analyzer and Retargeter for high-quality, non-real-time production animation, and Faceware Live for real-time, interactive animation. All work with single-lens RGB video cameras.




Motion LIVE: plug-in for iClone, unified motion capture for face, body, and hand

iPhone Face Mocap

The internal IAS solution for facial motion capture is currently FaceStream. The iPhone app streams the facemesh data over OSC to any other device (mainly Unity).
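To give a feel for what travels over the wire, here is a minimal sketch of decoding one binary OSC message using only the standard library. The `/facemesh` address and float payload are assumptions for illustration; FaceStream's actual address patterns and argument layout may differ:

```python
# Sketch: decode a single OSC message (address, type tags, arguments).
# OSC strings are NUL-terminated and padded to a multiple of 4 bytes;
# numeric arguments are big-endian.
import struct

def _read_padded_string(data, offset):
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip padding NULs
    return text, offset

def parse_osc_message(data):
    """Return (address, args) for one binary OSC message."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":  # 32-bit big-endian int
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        elif tag == "s":  # padded string
            value, offset = _read_padded_string(data, offset)
            args.append(value)
    return address, args

# Build a sample packet the way a sender would, then decode it.
packet = b"/facemesh\x00\x00\x00" + b",ff\x00" + struct.pack(">ff", 0.25, 0.5)
print(parse_osc_message(packet))
# ('/facemesh', [0.25, 0.5])
```

On the Unity side one would typically listen on a UDP socket and route decoded messages by address; an established OSC library covers bundles, time tags, and the remaining type tags that this sketch omits.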

Last updated by Sebastien Schiesser