Imagine a game in which you could have intelligent, unscripted, and dynamic conversations with non-playable characters (NPCs) that have persistent, evolving personalities and accurate facial animations and expressions, all in your native tongue.
Generative AI technologies are making this a reality, and at COMPUTEX 2023 we announced the future of NPCs with NVIDIA Avatar Cloud Engine (ACE) for Games. NVIDIA ACE for Games is a custom AI model foundry service that aims to transform games by bringing intelligence to NPCs through AI-powered natural language interactions.
“Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games,” said John Spitzer, vice president of developer and performance technology at NVIDIA. “Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games.”
Developers of middleware, tools and games can use NVIDIA ACE for Games to build and deploy customized speech, conversation, and animation AI models in their software and games across the cloud and PC.
Optimized AI foundation models include:

- NVIDIA NeMo, for building and customizing the language models that power conversational AI
- NVIDIA Riva, for speech-to-text and text-to-speech capabilities
- NVIDIA Omniverse Audio2Face, for AI-powered facial animation generated from voice inputs
To showcase the power of NVIDIA ACE for Games, and preview how developers will build NPCs in the near future, we partnered with Convai, an NVIDIA Inception startup building a platform for creating and deploying AI characters in games and virtual worlds, to help optimize and integrate ACE modules into their platform.
Take a first look in the NVIDIA Kairos demo:
“With NVIDIA ACE for Games, Convai’s tools can achieve the latency and quality needed to make AI non-playable characters available to nearly every developer in a cost efficient way,” said Purnendu Mukherjee, founder and CEO at Convai.
The Kairos demo leveraged NVIDIA Riva for speech-to-text and text-to-speech capabilities, NVIDIA NeMo to power the conversational AI, and Audio2Face for AI-powered facial animation from voice inputs. These modules were integrated seamlessly into the Convai services platform and fed into Unreal Engine 5 and MetaHuman to bring Jin to life.
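The module chain described above (speech-to-text, conversational AI, text-to-speech, audio-driven facial animation) can be pictured as a simple per-turn pipeline. The sketch below is purely illustrative: the function names and data shapes are hypothetical stand-ins, not the actual Riva, NeMo, or Audio2Face APIs.

```python
from dataclasses import dataclass

# Hypothetical sketch of one conversational turn in a Kairos-style NPC
# pipeline. Each stage function is an illustrative stub standing in for
# the real service (Riva ASR/TTS, NeMo, Audio2Face).

@dataclass
class NpcTurn:
    transcript: str     # player speech recognized as text (Riva ASR)
    reply_text: str     # NPC response from the language model (NeMo)
    reply_audio: bytes  # synthesized NPC voice (Riva TTS)
    face_curves: list   # animation data derived from audio (Audio2Face)

def speech_to_text(audio: bytes) -> str:
    # Stand-in for speech recognition: treat the payload as UTF-8 text.
    return audio.decode("utf-8")

def generate_reply(transcript: str, persona: str) -> str:
    # Stand-in for the conversational model, conditioned on a persona.
    return f"[{persona}] You said: {transcript}"

def text_to_speech(text: str) -> bytes:
    # Stand-in for speech synthesis.
    return text.encode("utf-8")

def audio_to_face(audio: bytes) -> list:
    # Stand-in for audio-driven facial animation: one "frame" value
    # derived from the audio length.
    return [len(audio)]

def npc_turn(player_audio: bytes, persona: str) -> NpcTurn:
    # Chain the stages exactly as the demo pipeline does: the player's
    # voice is transcribed, answered, voiced, and animated.
    transcript = speech_to_text(player_audio)
    reply = generate_reply(transcript, persona)
    voice = text_to_speech(reply)
    curves = audio_to_face(voice)
    return NpcTurn(transcript, reply, voice, curves)
```

In a real integration, each stub would be a network call to the corresponding ACE service, and the animation output would drive a MetaHuman rig in Unreal Engine 5 rather than a plain list.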
Jin and his Ramen Shop scene were created by the NVIDIA Lightspeed Studios art team and rendered entirely in Unreal Engine 5, using NVIDIA RTX Direct Illumination (RTXDI) for ray traced lighting and shadows, and DLSS for the highest possible frame rates and image quality.
This real-time demonstration shows what is possible with GeForce RTX graphics cards, NVIDIA RTX technologies, and NVIDIA ACE for Games, and exemplifies the possibilities for upcoming games that leverage this game-changing technology.
Game developers and startups are already using NVIDIA generative AI technologies. For instance, GSC Game World is using Audio2Face in the much-anticipated S.T.A.L.K.E.R. 2: Heart of Chornobyl. And indie developer Fallen Leaf is using Audio2Face for character facial animation in Fort Solis, its third-person sci-fi thriller set on Mars. Additionally, Charisma.ai, a company enabling virtual characters through AI, is leveraging Audio2Face to power the animation in its conversation engine.
If you’d like additional information about these generative AI technologies, head over to our NVIDIA Developer blog. Subscribe to learn more about NVIDIA ACE for Games, future developments, and early access programs. And discover more uses of NVIDIA RTX technology and GeForce RTX GPUs in the latest games by heading to GeForce.com.