Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.
Epic Games showed off some new technology together with the developers at Microsoft's Ninja Theory for Senua's Saga: Hellblade II.
MetaHuman Animator is a tool that can be used to develop highly realistic animations, based on footage captured from human actors and almost instantly converted into an animated framework that can be applied to create 3D animations for video games and films.
It was one of the standout demos that Epic Games showed at its State of the Unreal event at the Game Developers Conference in San Francisco today.
>>Follow VentureBeat’s ongoing GDC 2023 coverage<<
Epic showed off the tech through the Ninja Theory game, a sequel to 2017's Hellblade: Senua's Sacrifice, a game with outstanding human animation. Melina Juergens, the motion capture expert and lead actor for the game, made an appearance to show how the same tech can now be used on an iPhone for MetaHuman animation. It works with the Live Link Face app on mobile devices.
The tech can generate a face model from a few captured pictures within a minute or so and convert it into something that can be used in computer-animated films or games. Ninja Theory also gave us a glimpse of what Senua will look like in the upcoming Hellblade II title.
Epic Games also unveiled new tech to make it easier for creators to build high-quality 3D animations.
NCSoft’s Songyee Yoon, president and chief strategy officer, showed off imagery from Project M, an upcoming game. It’s an action-adventure game with extremely realistic graphics and stellar human animations.
Yoon was on stage to introduce the company’s latest project.
In the trailer, a digital human version of NCSoft’s chief creative officer (CCO), Taekjin Kim, appeared on screen and guided the viewers through Project M’s world and core gameplay.
This digital human was developed using NCSoft's AI technology and its advanced art and graphics capabilities. The digital human's voice in the trailer was generated with the company's AI text-to-speech (TTS) synthesis technology, which translates text into natural human speech reflecting a specific person's voice, accent, and emotions.
The digital human's facial expressions and lip-sync were generated with the company's voice-to-face technology, an AI-based system that automatically produces facial animation matching a given text or voice. The AI technology, combined with the company's visual technologies, created the digital human's realistic facial appearance.
GamesBeat’s creed when covering the game industry is “where passion meets business.” What does this mean? We want to tell you how the news matters to you — not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings.