
Digital Humans

Bring avatars to life with generative AI.


Digital Humans as AI Agents in Gaming, Enterprise, and Public Sector Settings

Developers, enterprises, and the public sector are exploring how generative AI can accelerate productivity, streamline content pipelines, and deliver new, more engaging experiences. Whether avatars are 2D or 3D, photorealistic or stylized, creating intelligent digital humans is challenging—especially when realism is required for human-like interaction and immersion. Capturing intricate human anatomy, facial expressions, and movement, and animating characters with natural motion and behavior, demands sophisticated motion-capture systems. Optimizing performance so everything runs in real time is also computationally intensive. To address these challenges, teams are turning to generative AI for new ways of creating interactive digital humans.

NVIDIA ACE is a suite of technologies designed to help developers bring digital humans to life. Several ACE microservices are NVIDIA NIM™ microservices—easy to deploy, highly performant, and optimized to run on NVIDIA RTX™ AI PCs or on NVIDIA Graphics Delivery Network (GDN), a global network of GPUs that delivers low-latency digital human processing in 100 countries. With individual NIM microservices, platform developers and system integrators can take portions of the ACE technology suite and incorporate them into their end-to-end platforms.

To streamline the workflow for developers and enterprises alike, NVIDIA's Digital Human Blueprint, powered by ACE technologies and NIM microservices, offers a comprehensive solution to the challenges of creating realistic, interactive digital humans. The blueprint also offers 2D avatars, which are less animation- and computation-intensive, so teams can choose the option that fits their use case. Users can interact with a 3D digital avatar from ACE or a 2D avatar from NVIDIA Maxine in a customer-service workflow that can connect with people using emotion, humor, and more.
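As one hedged illustration of what building on a NIM microservice looks like in practice: a deployed LLM NIM container typically exposes an OpenAI-compatible REST endpoint, so the "brain" turn of an avatar can be a plain HTTP call. The URL, port, and model name below are assumptions for a local deployment, not fixed values—substitute those of your own container.

```python
# Hypothetical sketch: one conversational turn against an LLM NIM
# microservice through its OpenAI-compatible chat endpoint.
# NIM_URL and MODEL are assumptions for a locally deployed container.
import json
import urllib.request

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local endpoint
MODEL = "nvidia/nemotron-mini-4b-instruct"             # assumed model id

def build_request(user_message: str) -> dict:
    """Build an OpenAI-style chat payload for the avatar's LLM turn."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system",
             "content": "You are James, a helpful customer-service avatar."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 128,
        "temperature": 0.7,
    }

def ask(user_message: str) -> str:
    """POST the payload and return the assistant's reply text."""
    data = json.dumps(build_request(user_message)).encode("utf-8")
    req = urllib.request.Request(
        NIM_URL, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a running NIM container at NIM_URL.
    print(ask("Where can I track my order?"))
```

Because the endpoint follows the OpenAI chat-completions convention, existing client libraries and tooling built around that API can usually be pointed at the microservice unchanged.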

Built on NIM microservices, James is a 3D virtual assistant and Aria is a 2D avatar; both deliver contextually accurate responses.

Interact with James or Aria in real time at ai.nvidia.com

 

Harnessing AI for Real-Time Language Understanding, Speech, Animation and Graphics

Built on NVIDIA AI graphics and simulation technologies, NVIDIA ACE encompasses technology for every part of the digital human—from speech and translation to vision and intelligence, realistic animation and behavior, and lifelike appearance.

NVIDIA® Riva

Technologies that enable digital humans to understand human language, translate responses into up to 32 languages, and reply with natural-sounding speech.

NVIDIA Nemotron™

A family of large language models (LLMs) and small language models (SLMs) that provide digital humans with intelligence—capable of providing contextually aware responses for humanlike conversations. 

NVIDIA Audio2Face™

Technologies that provide digital humans with dynamic facial animations and accurate lip sync. With just an audio input, a 2D or 3D avatar animates realistically.

NVIDIA RTX

A collection of rendering technologies that enable real-time path-traced subsurface scattering to simulate how light penetrates the skin and hair, giving digital humans a more realistic appearance. 
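Taken together, the components above form a loop: speech comes in, language understanding produces a response, speech synthesis voices it, and facial animation drives the avatar. The sketch below illustrates only that data flow—every stage is a placeholder stub, not a real Riva, Nemotron, or Audio2Face call.

```python
# Conceptual sketch of the digital human loop formed by the components
# above: ASR -> LLM -> TTS -> facial animation. All stages are stand-in
# stubs that only demonstrate what flows between the pieces.

def recognize_speech(audio: bytes) -> str:
    # Stand-in for Riva ASR: audio in, transcript out.
    return audio.decode("utf-8")  # pretend the audio *is* the transcript

def generate_reply(transcript: str) -> str:
    # Stand-in for a Nemotron LLM: transcript in, response text out.
    return f"You said: {transcript}"

def synthesize_speech(text: str) -> bytes:
    # Stand-in for Riva TTS: text in, audio out.
    return text.encode("utf-8")

def animate_face(audio: bytes) -> list:
    # Stand-in for Audio2Face: audio in, per-frame animation weights out.
    return [len(audio) % 10] * 3  # dummy animation frames

def avatar_turn(user_audio: bytes):
    """One full user->avatar turn through all four stages."""
    transcript = recognize_speech(user_audio)
    reply_text = generate_reply(transcript)
    reply_audio = synthesize_speech(reply_text)
    frames = animate_face(reply_audio)
    return reply_text, reply_audio, frames

text, audio, frames = avatar_turn(b"hello")
```

In a real deployment each stub would be replaced by a call to the corresponding microservice, with rendering (the RTX piece) consuming the animation frames.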

Developers can leverage NVIDIA digital human technologies to build their own solutions from the ground up. Or they can use NVIDIA’s suite of domain-specific AI workflows for next-generation interactive avatars for customer service, humanoid robots for virtual factories, digital experiences in virtual presence applications, or AI non-player characters (NPCs) in gaming. Generative AI models can be both compute- and memory-intensive, and running both AI and graphics on the local system requires a powerful GPU with dedicated AI hardware. ACE is flexible, allowing models to run in the cloud or on the PC depending on local GPU capabilities, so the user gets the best experience.
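A hybrid cloud/PC split like this ultimately comes down to a placement decision per model. The following sketch is illustrative only—not an ACE API—and its VRAM figures and greedy strategy are assumptions chosen to show the idea.

```python
# Illustrative sketch (not an ACE API): decide which avatar models run on
# the local GPU and which fall back to a cloud endpoint, based on available
# VRAM. Model sizes and the greedy smallest-first policy are assumptions.
from dataclasses import dataclass

@dataclass
class ModelSpec:
    name: str
    vram_gb: float  # approximate VRAM needed to run the model locally

def place_models(models, local_vram_gb: float) -> dict:
    """Greedily keep models local (smallest first) while VRAM lasts;
    everything that does not fit is routed to the cloud."""
    placement, remaining = {}, local_vram_gb
    for m in sorted(models, key=lambda m: m.vram_gb):
        if m.vram_gb <= remaining:
            placement[m.name] = "local"
            remaining -= m.vram_gb
        else:
            placement[m.name] = "cloud"
    return placement

# Hypothetical avatar pipeline on a PC with an 8 GB GPU.
pipeline = [
    ModelSpec("speech-recognition", 2.0),
    ModelSpec("llm", 10.0),
    ModelSpec("facial-animation", 1.5),
]
print(place_models(pipeline, local_vram_gb=8.0))
```

With these assumed sizes, the latency-sensitive speech and animation models stay local while the large LLM is served from the cloud—mirroring the hybrid deployment described above.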

"The combination of NVIDIA ACE microservices and the Inworld AI Engine enables developers to create digital characters that can drive dynamic narratives, opening new possibilities for how gamers can decipher, deduce and play."

Kylan Gibbs, CEO, Inworld AI

“Generative AI-powered characters in virtual worlds unlock various use cases and experiences that were previously impossible … Convai is leveraging NVIDIA ACE technologies such as Riva automatic speech recognition and Audio2Face to enable lifelike non-playable characters with low latency response times and high fidelity natural animation.”

Purnendu Mukherjee, Founder and CEO, Convai

Getting Started With Building Avatars Using Generative AI

Developers can start their journey with NVIDIA ACE by visiting NVIDIA Developer. Get started with NVIDIA ACE NIM microservices by interacting with a digital human directly in a web browser, then evaluate the microservices with an NVIDIA AI Enterprise evaluation license. Developers with an NVIDIA AI Enterprise license can download NIM microservices and deploy them on DGX™ Cloud, any cloud service provider (CSP), or a private cloud.

Customers looking for an end-to-end digital human solution can reach out to NVIDIA partners that have integrated ACE microservices, including Convai, Inworld, Data Monsters, Quantiphi, UneeQ, and Top Health Tech.

Getting Started With ACE On-Device Models for PC Deployment

The Unreal Engine 5 renderer microservice showcases NPCs interacting using natural language. Developers of middleware, tools, and games can use state-of-the-art, real-time language, speech, and animation generative AI models to bring role-playing capabilities to digital characters rendered in Unreal Engine 5.

For developers using custom engines, the NVIDIA In-Game Inferencing SDK streamlines AI model deployment and integration for PC applications. The SDK pre-configures the PC with the necessary AI models, engines, and dependencies; orchestrates AI inference seamlessly across PC and cloud through a unified inference API; and supports all major inference backends across different hardware accelerators (GPU, NPU, CPU). The In-Game Inferencing SDK is available in beta.

Learn more and download the beta.

What is NVIDIA ACE?

NVIDIA ACE is a suite of technologies that help developers bring digital avatars to life with generative AI. ACE AI models are designed to run in the cloud or locally on the PC.

What is a microservice?

A microservice is an architectural style that structures an application as a collection of services. The microservice architecture enables an organization to deliver large, complex applications rapidly, frequently, reliably, and sustainably.

Who is NVIDIA ACE for?

NVIDIA ACE is for independent software vendors (ISVs), global systems integrators (GSIs), service delivery partners (SDPs), and enterprises looking to drive more engaging and natural interactions with their customers through digital interfaces.

How is NVIDIA ACE different from other offerings?

Unlike other model providers, who specialize in speech and language modalities for the cloud, NVIDIA ACE is the only suite of digital human models trained on commercially safe data and optimized for scale, with flexible deployment options across cloud and PC.

News

Enhance Customer Interactions With Digital Human Technologies

New interactive experiences demonstrate how generative AI can impact customer service across industries.

Deploy the First On-Device Small Language Model for Improved Game Character Roleplay

Learn how NVIDIA ACE NIM provides a more dynamic and immersive gameplay experience in Amazing Seasun Games’ Mecha BREAK.

Digital Humans Transform Industries

NVIDIA ACE enables digital humans to see, understand, and interact with us in human-like ways.

Real-Time Generative AI Healthcare Agents

Meet this AI agent developed in partnership with Hippocratic AI, built with Hippocratic AI’s state-of-the-art technology and NVIDIA ACE microservices.

Get Started

Top game and digital human developers are pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications. Learn more about NVIDIA ACE, future developments, and early access programs.