
Week 7 - Faceware Analyser & Retargeting in Maya & UE5

Anna Lisa Vegter

This week was a follow-on from last week: we took the facial motion-capture footage into a piece of software called Faceware Analyser. The aim was to use this tracked facial data to animate the MetaHuman rig we created in the previous class. We began by finding a neutral facial expression; since we didn't have a true neutral frame, we chose the least animated one. Each tracking point then had to be added manually to the facial features (eyes, brows and mouth) by dragging the individual points onto the marked dots on the face.


Once that is complete, you move along the timeline to choose a different expression and repeat the process. A minimum of five keyframes is needed to produce successfully tracked footage. Each member of our group had a go at this process; I chose the mouth. Once every facial feature has been keyframed, you click 'Train' and then 'Track', and the software tracks the facial features across the whole video. I had to add a few extra keyframes for the mouth as it wasn't tracking very well to begin with, but once I did, that fixed the problem.
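To illustrate the idea behind this workflow, here is a minimal conceptual sketch in Python. It is not Faceware's actual algorithm (Analyser trains a tracking model from the labelled keyframes rather than interpolating), and all names here are hypothetical, but it shows the data flow: a handful of manually labelled landmark keyframes, from which landmark positions on in-between frames are estimated.

```python
# Conceptual sketch only (not Faceware's algorithm): store manually
# labelled landmark keyframes and linearly interpolate landmark
# positions for the frames in between.

def estimate_landmarks(keyframes, frame):
    """keyframes: dict {frame_number: {landmark_name: (x, y)}}.
    Returns estimated landmark positions for `frame`."""
    frames = sorted(keyframes)
    # clamp to the first/last labelled keyframe
    if frame <= frames[0]:
        return keyframes[frames[0]]
    if frame >= frames[-1]:
        return keyframes[frames[-1]]
    # find the two keyframes bracketing the requested frame
    for lo, hi in zip(frames, frames[1:]):
        if lo <= frame <= hi:
            t = (frame - lo) / (hi - lo)  # blend factor, 0..1
            return {
                name: (
                    (1 - t) * keyframes[lo][name][0] + t * keyframes[hi][name][0],
                    (1 - t) * keyframes[lo][name][1] + t * keyframes[hi][name][1],
                )
                for name in keyframes[lo]
            }

# two hand-labelled keyframes for one mouth landmark (hypothetical values)
keys = {
    0:  {"mouth_corner_l": (100.0, 200.0)},
    10: {"mouth_corner_l": (110.0, 220.0)},
}
print(estimate_landmarks(keys, 5))  # halfway between the two keyframes
```

Adding extra keyframes where the estimate drifts, as I did for the mouth, tightens the result in exactly the same way: more labelled frames means shorter gaps for the software to fill in.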










© 2024 ANNA LISA VEGTER
