
Digital Humans

Bring digital avatars to life with generative AI.

Workloads

Generative AI

Industries

Gaming
Financial Services
Healthcare and Life Sciences
Media and Entertainment
Retail / Consumer Packaged Goods
Telecommunications
Robotics

Business Goal

Innovation

Products

NVIDIA ACE
NVIDIA Riva
NVIDIA Nemotron
NVIDIA Audio2Face

How Developers Create Lifelike Digital Humans Using AI

Developers are exploring how generative AI can accelerate their content pipelines and provide new experiences. One area they’re focusing on is the creation of digital humans for customer service, virtual factories, virtual presence, and gaming use cases.

Traditionally, creating digital humans has posed numerous challenges. Achieving realism requires capturing intricate human anatomy, facial expressions, and movements. Animating characters with natural movements and behaviors demands sophisticated motion capture systems. And optimizing performance to ensure everything runs in real time is computationally intensive. To address these challenges, teams are turning to generative AI to find innovative ways of creating interactive digital humans.

NVIDIA ACE is a suite of technologies that help developers bring digital humans to life. Several ACE microservices are available as NVIDIA NIM™ microservices, which are easy to deploy and highly performant. These NIM microservices are optimized to run on NVIDIA Graphics Delivery Network (GDN), a global network of GPUs that delivers low-latency digital human processing in 100 countries, or on NVIDIA RTX™ AI PCs. With individual NIM microservices, platform developers and system integrators can take portions of the ACE technology suite and incorporate them into their end-to-end platforms.
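As a rough illustration of how an individually deployed NIM microservice might be consumed, the sketch below checks a local NIM container’s readiness and lists the models it serves over REST. The port and endpoint paths follow the common NIM convention (/v1/health/ready, /v1/models) and are assumptions here; individual ACE microservices may expose different interfaces.

```python
# Minimal sketch, assuming a NIM microservice is deployed locally on port 8000
# and exposes the common NIM REST conventions (/v1/health/ready, /v1/models).
# Individual ACE microservices may use different endpoints.
import requests

BASE_URL = "http://localhost:8000"  # assumed local NIM deployment


def nim_is_ready(base_url: str = BASE_URL) -> bool:
    """Poll the readiness endpoint exposed by many NIM containers."""
    try:
        resp = requests.get(f"{base_url}/v1/health/ready", timeout=5)
        return resp.status_code == 200
    except requests.RequestException:
        return False


def list_models(base_url: str = BASE_URL) -> list[str]:
    """List model IDs served by an OpenAI-compatible NIM endpoint."""
    resp = requests.get(f"{base_url}/v1/models", timeout=10)
    resp.raise_for_status()
    return [m["id"] for m in resp.json().get("data", [])]


if __name__ == "__main__":
    if nim_is_ready():
        print("NIM is ready. Models:", list_models())
    else:
        print("NIM is not ready yet.")
```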

Users can interact with a 3D digital avatar, built on an ACE customer-service workflow, that can connect with people through emotion, humor, and more.

Built on top of NIM microservices, James is a virtual assistant that can provide contextually accurate responses.

Interact with James in real time at ai.nvidia.com.

Harnessing AI for Real-Time Language Understanding, Speech, Animation and Graphics

Built on NVIDIA AI, graphics, and simulation technologies, NVIDIA ACE encompasses technology for every part of the digital human, from speech, translation, vision, and intelligence to realistic animation and behavior to lifelike appearance.

NVIDIA® Riva

Technologies that enable digital humans to understand human language, translate responses into up to 32 languages, and reply with natural-sounding responses.
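As a rough sketch of the speech piece, the snippet below transcribes a WAV file with Riva’s Python client (the nvidia-riva-client package), assuming a Riva server is already reachable at localhost:50051. The file path, sample rate, and language code are placeholders to adapt to your deployment.

```python
# Minimal ASR sketch, assuming a Riva server is reachable at localhost:50051
# and the nvidia-riva-client package is installed. The file path, sample rate,
# and language code are placeholders.
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
asr = riva.client.ASRService(auth)

config = riva.client.RecognitionConfig(
    encoding=riva.client.AudioEncoding.LINEAR_PCM,
    sample_rate_hertz=16000,          # must match the audio file
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
)

with open("utterance.wav", "rb") as f:   # hypothetical 16 kHz mono WAV file
    audio_bytes = f.read()

response = asr.offline_recognize(audio_bytes, config)
for result in response.results:
    print(result.alternatives[0].transcript)
```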

NVIDIA Nemotron

Family of large language models (LLMs) and small language models (SLMs) that give digital humans intelligence, delivering contextually aware responses for humanlike conversations.
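For the language piece, a minimal sketch of prompting a Nemotron model through the OpenAI-compatible endpoint hosted on the NVIDIA API catalog (build.nvidia.com) might look like the following. The model name is only an example, and availability and naming may change; an NVIDIA API key is assumed to be set in the NVIDIA_API_KEY environment variable.

```python
# Minimal sketch using the OpenAI-compatible endpoint hosted on the NVIDIA
# API catalog (build.nvidia.com). The model name is an example; availability
# and naming may change.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key=os.environ["NVIDIA_API_KEY"],
)

completion = client.chat.completions.create(
    model="nvidia/nemotron-4-340b-instruct",  # example Nemotron model name
    messages=[
        {"role": "system", "content": "You are a helpful retail assistant avatar."},
        {"role": "user", "content": "What are your store hours this weekend?"},
    ],
    temperature=0.2,
    max_tokens=256,
)
print(completion.choices[0].message.content)
```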

NVIDIA Audio2Face™

Technologies that provide digital humans with dynamic facial animations and accurate lip sync. With just an audio input, a 2D or 3D avatar animates realistically.

NVIDIA RTX™

A collection of rendering technologies that enable real-time path-traced subsurface scattering to simulate how light penetrates the skin and hair, giving digital humans a more realistic appearance. 

Developers can leverage NVIDIA digital human technologies to build their own solutions from the ground up. Or, they can use NVIDIA’s suite of domain-specific AI workflows for next-generation interactive avatars for customer service, humanoid robots for virtual factories, digital experiences in virtual presence applications, or AI non-player characters in gaming. Generative AI models can be both compute and memory intensive, and running both AI and graphics on the local system requires a powerful GPU with dedicated AI hardware. ACE is flexible, allowing models to run in the cloud or on the PC depending on local GPU capabilities, so the user gets the best possible experience.
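To make the cloud-versus-PC tradeoff concrete, here is a hypothetical routing sketch that prefers a local endpoint when the PC has an NVIDIA GPU with sufficient free memory and otherwise falls back to a hosted endpoint. The VRAM threshold and endpoint URLs are illustrative assumptions, not values prescribed by ACE.

```python
# Hypothetical routing sketch: pick a local endpoint when the PC has a GPU
# with enough free memory, otherwise fall back to a cloud-hosted endpoint.
# The VRAM threshold and endpoint URLs are illustrative assumptions.
import pynvml

LOCAL_ENDPOINT = "http://localhost:8000/v1"             # assumed local NIM
CLOUD_ENDPOINT = "https://integrate.api.nvidia.com/v1"  # hosted API catalog
MIN_FREE_VRAM_GB = 8                                    # illustrative cutoff


def choose_endpoint() -> str:
    """Return the local endpoint if a capable GPU is present, else the cloud one."""
    try:
        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        free_gb = pynvml.nvmlDeviceGetMemoryInfo(handle).free / 1024**3
        pynvml.nvmlShutdown()
        return LOCAL_ENDPOINT if free_gb >= MIN_FREE_VRAM_GB else CLOUD_ENDPOINT
    except pynvml.NVMLError:
        return CLOUD_ENDPOINT  # no NVIDIA GPU or driver available


if __name__ == "__main__":
    print("Routing inference to:", choose_endpoint())
```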

"The combination of NVIDIA ACE microservices and the Inworld AI Engine enables developers to create digital characters that can drive dynamic narratives, opening new possibilities for how gamers can decipher, deduce and play."

Kylan Gibbs, CEO, Inworld AI

“Generative AI-powered characters in virtual worlds unlock various use cases and experiences that were previously impossible … Convai is leveraging NVIDIA ACE technologies such as Riva automatic speech recognition and Audio2Face to enable lifelike non-playable characters with low latency response times and high fidelity natural animation.”

Purnendu Mukherjee, Founder and CEO, Convai

Getting Started With Building Avatars Using Generative AI

Developers can start their journey with NVIDIA ACE by visiting NVIDIA Developer to learn more. Get started with NVIDIA ACE NIM microservices by interacting with a digital human directly through a web browser. Then, evaluate the NIM microservices with an NVIDIA AI Enterprise evaluation license. Developers with an NVIDIA AI Enterprise license can download NIM microservices and deploy them on DGX™ Cloud, any cloud service provider (CSP), or a private cloud.

Customers looking for an end-to-end digital human solution can reach out to NVIDIA partners that have integrated ACE microservices, including Convai, Inworld, Data Monsters, Quantiphi, UneeQ, and Top Health Tech.

What is NVIDIA ACE?

NVIDIA ACE is a suite of technologies that help developers bring digital avatars to life with generative AI. ACE AI models are designed to run in the cloud or locally on the PC.

What is a microservice?

A microservice architecture is an architectural style that structures an application as a collection of services. This architecture enables an organization to deliver large, complex applications rapidly, frequently, reliably, and sustainably.

Who is NVIDIA ACE for?

NVIDIA ACE is for independent software vendors (ISVs), global systems integrators (GSIs), software development partners (SDPs), and enterprises that are looking to drive more engaging and natural interactions with their customers through digital interfaces.

How is NVIDIA ACE different from other offerings?

NVIDIA ACE is a suite of technologies that bring digital humans to life with generative AI. Unlike other model providers, which specialize in speech and language modalities for the cloud, ACE is the only suite of models trained on commercially safe data and optimized for scalability, with flexible deployment options across cloud and PC.

News

Enhance Customer Interactions With Digital Human Technologies

New interactive experiences demonstrate how generative AI can impact customer service across industries.

Deploy the First On-Device Small Language Model for Improved Game Character Roleplay

Learn how NVIDIA ACE NIM provides a more dynamic and immersive gameplay experience in Amazing Seasun Games’ Mecha BREAK.

Digital Humans Transform Industries

NVIDIA ACE enables digital humans to see, understand, and interact with us in human-like ways.

Real-Time Generative AI Healthcare Agents

Meet this AI agent developed in partnership with Hippocratic AI, built with Hippocratic AI’s state-of-the-art technology and NVIDIA ACE microservices.

Get Started

Top game and digital human developers are pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications. Learn more about NVIDIA ACE, future developments, and early access programs.