- Epic Games has released the MetaHuman Animator module of its MetaHuman platform, a significant development in automated lip sync and facial animation.
- The technology eliminates the need for specialized facial motion-capture hardware, requiring only an iPhone to capture a performance.
- Unreal Engine produces nuanced, polished animations that need no further cleanup, potentially revolutionizing how large-scale computer animation is produced for games and television.
Epic Games, the creator of popular games like Fortnite and Infinity Blade, has announced its latest innovation: the MetaHuman Animator. This new technology is set to revolutionize automated facial animation and lip syncing in video games and animated films.
The MetaHuman Animator uses advanced AI and machine learning algorithms to create highly realistic human faces and expressions, complete with accurate lip syncing. This means that game developers and animators can now create highly detailed characters in a fraction of the time it would take to do so manually.
“This is a significant step forward for the gaming and animation industry,” said Tim Sweeney, founder and CEO of Epic Games. “The MetaHuman Animator will enable developers to bring incredibly realistic characters to life, which will ultimately enhance the immersive experience for gamers and audiences alike.”
The technology behind the MetaHuman Animator has been in development for several years and is built on top of Unreal Engine, Epic Games’ widely used game engine. It allows developers to create highly detailed virtual characters with realistic skin tones, hair, and facial features.
The MetaHuman Animator also includes advanced tools for facial animation and lip syncing. It lets animators adjust facial expressions, movements, and even the in-between frames of an animation, resulting in highly realistic performances.
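Epic has not published the internals of these tools, but the general idea of generating in-between frames can be sketched with a simple interpolation of facial blendshape weights. The function name, blendshape names, and keyframe values below are purely illustrative assumptions, not MetaHuman Animator's actual API:

```python
# Illustrative sketch only: not Epic's implementation or API.
# In-between frames are commonly produced by interpolating facial
# blendshape weights between two keyframed poses.

def lerp_blendshapes(key_a, key_b, t):
    """Linearly blend two keyframed blendshape-weight dicts at t in [0, 1]."""
    names = set(key_a) | set(key_b)
    return {n: (1 - t) * key_a.get(n, 0.0) + t * key_b.get(n, 0.0)
            for n in names}

# Hypothetical keyframes: a neutral face and an open-jaw "ah" viseme.
neutral = {"jawOpen": 0.0, "mouthSmile": 0.2}
viseme_ah = {"jawOpen": 0.8, "mouthSmile": 0.0}

# Generate three in-between frames between the two keyframes.
inbetweens = [lerp_blendshapes(neutral, viseme_ah, t)
              for t in (0.25, 0.5, 0.75)]
```

A real system would layer easing curves and learned corrections on top of this, but the sketch shows why giving animators direct control over the in-betweens matters: each generated frame is just data that can be edited.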
“The MetaHuman Animator is a game-changer for us,” said Jon Jones, a senior animator at a leading game development studio. “It will save us countless hours of work and help us create highly realistic, nuanced characters that audiences can really connect with.”
The MetaHuman Animator is currently in beta testing and is expected to be released to the public later this year. With its potential to revolutionize facial animation and lip syncing, it is likely to be embraced by developers and animators across the gaming and animation industries.