
MetaHuman Animator: capture facial performances using an iPhone!


Epic Games unveils MetaHuman Animator, an upcoming feature set for their existing digital human tool, MetaHuman. Thanks to this new feature set, you will be able to reproduce the facial performance of a real actress or actor as high-fidelity animation on MetaHuman characters, all of this using just an iPhone or a stereo helmet-mounted camera (HMC).

Before we dig deeper into the technical details, here’s what you’ll be able to achieve:

To achieve this, you will need to mount your iPhone on a tripod, or use a professional vertical stereo HMC capture solution such as the ones sold by Technoprops, to capture the performance of your subject. The tool will then let you transfer that performance to a MetaHuman, which can then be used within Unreal Engine.

A vertical stereo HMC capture solution from Technoprops, the hardware division of ILM. Two cameras capture the face at different viewing angles.

Epic Games explains that they target AAA game developers and Hollywood filmmakers, but also hobbyists and indie studios.

This new feature set relies in part on the tech developed for Mesh to MetaHuman, a feature that allows you to create MetaHumans from a 3D scan or sculpt. Here, explains Epic Games, a small amount of captured footage is used to create a MetaHuman Identity, which has the same topology and rig as any other MetaHuman and is used to interpret the performance.
Once the performance has been interpreted, you’ll be able to check that things went well by comparing the captured footage and the animated MetaHuman frame by frame. If needed, you will be able to adjust the animation for artistic purposes.

MetaHuman Animator

Once this is done, you’ll be able to use this animation on any MetaHuman, since they share the same rig and topology.
Furthermore, timecodes are supported: the facial performance animation can be aligned with body motion capture and audio if needed. Even better, explains Epic Games: “it can even use the audio to produce convincing tongue animation”.
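
To give a rough idea of what timecode-based alignment involves in principle, here is a minimal sketch (this is not Epic’s implementation, and the Timecode struct and frameOffset function are hypothetical names): it converts SMPTE-style timecodes to absolute frame counts and computes the offset needed to line up a facial animation track with body mocap or audio recorded separately.

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical illustration only: SMPTE-style timecode (hours:minutes:seconds:frames)
// used to line up separately recorded tracks. Not Epic's or Unreal Engine's API.
struct Timecode {
    int hours, minutes, seconds, frames;
};

// Convert a timecode to an absolute frame count at a given frame rate.
int64_t toFrames(const Timecode& tc, int fps) {
    return ((static_cast<int64_t>(tc.hours) * 60 + tc.minutes) * 60 + tc.seconds) * fps + tc.frames;
}

// Offset (in frames) to shift the facial track so it starts in sync with the body track.
int64_t frameOffset(const Timecode& faceStart, const Timecode& bodyStart, int fps) {
    return toFrames(faceStart, fps) - toFrames(bodyStart, fps);
}

int main() {
    // Example: facial capture started 2 seconds and 12 frames after the body mocap, at 60 fps.
    Timecode faceStart{10, 30, 2, 12};
    Timecode bodyStart{10, 30, 0, 0};
    std::cout << "Shift the facial animation by " << frameOffset(faceStart, bodyStart, 60)
              << " frames to align it with the body capture.\n";
    return 0;
}
```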

MetaHuman Animator will be available as part of the MetaHuman Plugin for Unreal Engine, and will therefore be free to download. If you want to use an iPhone to capture the performance, you’ll also need the free Live Link Face app for iOS.
At this stage, Android support doesn’t seem to be on the way.

Epic Games hasn’t announced any launch date yet but states this new tech is “coming soon”. We will keep you updated: don’t forget to follow us on YouTube, Twitter, Instagram, LinkedIn and Facebook so that you don’t miss our upcoming videos and articles!

