Bring digital avatars to life with generative AI.
Workloads: Generative AI
Industries: Gaming, Financial Services, Healthcare and Life Sciences, Media and Entertainment, Retail / Consumer Packaged Goods, Telecommunications, Robotics
Business Goal: Innovation
Products: NVIDIA ACE, NVIDIA Riva, NVIDIA Nemotron, NVIDIA Audio2Face
Developers are exploring how generative AI can accelerate their content pipelines and provide new experiences. One area they’re focusing on is the creation of digital humans for customer service, virtual factories, virtual presence, and gaming use cases. Traditionally, creating digital humans has posed numerous challenges. Achieving realism requires capturing intricate human anatomy, facial expressions, and movements. Animating characters with natural movements and behaviors demands sophisticated motion capture systems. And optimizing performance to ensure everything runs in real time is computationally intensive. To address these challenges, teams are turning to generative AI to find innovative ways of creating interactive digital humans.
NVIDIA ACE is a suite of technologies that help developers bring digital humans to life. Several ACE components are available as NVIDIA NIM™ microservices, which are easy to deploy and highly performant. These NIMs are optimized to run on NVIDIA Graphics Delivery Network (GDN), a global network of GPUs delivering low-latency digital human processing in 100 countries, or on NVIDIA RTX™ AI PCs. With individual NIMs, platform developers and system integrators can take just the portions of the ACE technology suite they need and incorporate them into their end-to-end platforms.
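As a rough illustration of how a single NIM slots into a larger platform, the sketch below sends a chat request to a locally deployed language-model NIM over its OpenAI-compatible HTTP interface. The endpoint URL and model identifier here are assumptions for illustration only; consult the documentation of the specific NIM you deploy.

```python
# Minimal sketch: querying a locally deployed Nemotron NIM through its
# OpenAI-compatible HTTP endpoint. The URL and model name are assumptions
# for illustration; check the documentation of the NIM container you deploy.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local NIM endpoint

payload = {
    "model": "nvidia/llama-3.1-nemotron-70b-instruct",  # hypothetical model identifier
    "messages": [
        {"role": "system", "content": "You are James, a friendly customer-service avatar."},
        {"role": "user", "content": "What are your store hours this weekend?"},
    ],
    "max_tokens": 128,
    "temperature": 0.7,
}

response = requests.post(NIM_URL, json=payload, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```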
Using an ACE-based customer-service workflow, users can interact with a 3D digital avatar that connects with people through emotion, humor, and more.
Built on top of NIM microservices, James is a virtual assistant that can provide contextually accurate responses.
Interact with James in real time at ai.nvidia.com.
Built on NVIDIA AI, graphics, and simulation technologies, NVIDIA ACE encompasses technology for every part of the digital human, from speech, translation, vision, and intelligence to realistic animation and behavior to lifelike appearance. A conceptual sketch of how these pieces work together follows the component list below.
NVIDIA® Riva
Technologies that enable digital humans to understand human language, translate responses across up to 32 languages, and respond naturally.
NVIDIA Nemotron
Family of large language models (LLMs) and small language models (SLMs) that provide digital humans with intelligence—capable of providing contextually aware responses for humanlike conversations.
NVIDIA Audio2Face™
Technologies that provide digital humans with dynamic facial animations and accurate lip sync. With just an audio input, a 2D or 3D avatar animates realistically.
NVIDIA RTX™
A collection of rendering technologies that enable real-time path-traced subsurface scattering to simulate how light penetrates the skin and hair, giving digital humans a more realistic appearance.
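To make the division of labor concrete, here is a minimal sketch of a single conversational turn, from the user's speech to a voiced, animated reply. Every function is a hypothetical placeholder standing in for the corresponding NVIDIA technology (Riva ASR and TTS, a Nemotron model, Audio2Face), not a real API; production integrations use each product's own SDK or microservice interface.

```python
# Conceptual sketch of one digital human turn: speech in, speech and animation out.
# Every function below is a hypothetical placeholder, not an actual NVIDIA API.

def transcribe(audio: bytes) -> str:
    """Placeholder for Riva automatic speech recognition."""
    return "What time do you open tomorrow?"

def generate_reply(transcript: str) -> str:
    """Placeholder for a Nemotron LLM call that produces a contextual response."""
    return "We open at 9 a.m. tomorrow. Looking forward to seeing you!"

def synthesize_speech(text: str) -> bytes:
    """Placeholder for Riva text-to-speech."""
    return b"<pcm audio>"

def animate_face(speech_audio: bytes) -> list[dict]:
    """Placeholder for Audio2Face, which maps audio to facial animation weights."""
    return [{"frame": 0, "jawOpen": 0.4, "mouthSmile": 0.2}]

def avatar_turn(user_audio: bytes) -> tuple[bytes, list[dict]]:
    """One conversational turn: understand, respond, voice, and animate."""
    transcript = transcribe(user_audio)
    reply_text = generate_reply(transcript)
    reply_audio = synthesize_speech(reply_text)
    animation = animate_face(reply_audio)
    return reply_audio, animation

if __name__ == "__main__":
    audio_out, frames = avatar_turn(b"<microphone capture>")
    print(f"Synthesized {len(audio_out)} bytes of speech and {len(frames)} animation frames")
```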
Developers can leverage NVIDIA digital human technologies to build their own solutions from the ground up. Or, they can use NVIDIA's suite of domain-specific AI workflows for next-generation interactive avatars in customer service, humanoid robots in virtual factories, digital experiences in virtual presence applications, or AI non-player characters in games. Generative AI models can be both compute and memory intensive, and running AI and graphics on the same local system requires a powerful GPU with dedicated AI hardware. ACE is flexible, allowing models to run in the cloud or on the PC depending on local GPU capabilities, so users get the best possible experience.
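As a rough illustration of that hybrid deployment, the sketch below routes inference to the local GPU when enough dedicated memory is detected and otherwise falls back to a hosted endpoint. The memory threshold, endpoint URLs, and routing rule are assumptions for illustration, not ACE's actual logic.

```python
# Illustrative cloud-versus-PC routing: run locally when the GPU has enough
# dedicated memory, otherwise fall back to a hosted endpoint. The threshold
# and endpoint URLs are assumptions for illustration only.
import subprocess

LOCAL_ENDPOINT = "http://localhost:8000/v1"              # hypothetical on-device NIM
CLOUD_ENDPOINT = "https://integrate.api.nvidia.com/v1"   # example hosted endpoint
MIN_LOCAL_VRAM_MB = 12 * 1024  # assumed minimum VRAM to run the model on the PC

def local_vram_mb() -> int:
    """Query total GPU memory via nvidia-smi; return 0 if no NVIDIA GPU is found."""
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
            text=True,
        )
        return max(int(line) for line in out.splitlines() if line.strip())
    except (OSError, subprocess.CalledProcessError, ValueError):
        return 0

def pick_endpoint() -> str:
    """Prefer local inference on a capable RTX GPU, otherwise use the cloud."""
    return LOCAL_ENDPOINT if local_vram_mb() >= MIN_LOCAL_VRAM_MB else CLOUD_ENDPOINT

if __name__ == "__main__":
    print(f"Routing inference to: {pick_endpoint()}")
```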
"The combination of NVIDIA ACE microservices and the Inworld AI Engine enables developers to create digital characters that can drive dynamic narratives, opening new possibilities for how gamers can decipher, deduce and play."
Kylan Gibbs, CEO, Inworld AI
“Generative AI-powered characters in virtual worlds unlock various use cases and experiences that were previously impossible … Convai is leveraging NVIDIA ACE technologies such as Riva automatic speech recognition and Audio2Face to enable lifelike non-playable characters with low latency response times and high fidelity natural animation.”
Purnendu Mukherjee, Founder and CEO, Convai
Top game and digital human developers are pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications. Learn more about NVIDIA ACE, future developments, and early access programs.