NVIDIA Redefines Game AI With ACE Autonomous Game Characters

The term 'AI' has been used in games for decades to describe computer-controlled, non-playable characters (NPCs). These NPCs traditionally follow strict rules designed to mimic intelligence, adhere to a guided story, and provide a scripted interaction with the player. However, with the rise of large language models, game AI is primed for a truly intelligent overhaul.

At CES 2025, NVIDIA is redefining game AI with the introduction of NVIDIA ACE autonomous game characters.

First introduced in 2023, NVIDIA ACE is a suite of RTX-accelerated digital human technologies that bring game characters to life with generative AI.

NVIDIA is now expanding ACE from conversational NPCs to autonomous game characters that use AI to perceive, plan, and act like human players. Powered by generative AI, ACE will enable living, dynamic game worlds with companions that comprehend and support player goals, and enemies that adapt dynamically to player tactics.

Enabling these autonomous characters are new ACE small language models (SLMs), capable of planning at human-like frequencies required for realistic decision making, and multi-modal SLMs for vision and audio that allow AI characters to hear audio cues and perceive their environment.

NVIDIA is partnering with leading game developers to incorporate ACE autonomous game characters into their titles. Interact with human-like AI players and companions in PUBG: BATTLEGROUNDS, inZOI, and NARAKA: BLADEPOINT MOBILE PC VERSION. Fight against ever-learning AI-powered bosses that adapt to your playstyle in MIR5. And experience new gameplay mechanics made possible with AI-powered NPCs in AI People, Dead Meat, and ZooPunk.

Breaking Down Human Decision Making

Let’s begin with a simple model of how humans make decisions.

At the core of decision making is an internal conversation with oneself, repeatedly asking the same question: “What should I do next?”

To answer that question well, we need a few key pieces of data:

  1. Information from the world around us
  2. Our motivations and desires
  3. Memories of prior events and experiences

To illustrate, let’s say you hear your phone ring - an external perception generated by your senses. Should you answer it?

You remember you are expecting a call and don’t want to miss it. Your motivations and memories combine to provide everything you need to determine what to do next. This is cognition.

You choose to answer the phone - an action, the result of your cognition.

These perceptions, cognitions, and actions are often stored in our memories to recall for later decision making.

New NVIDIA ACE AI Models For Autonomous Game Characters

Mimicking these human traits with a traditional rule-based AI system is impossible for a developer to code – the number of potential scenarios is infinite. But with the help of generative AI, and large language models trained on trillions of sentences describing how humans react to the world, we can start to simulate human-like decision making.

NVIDIA ACE autonomous game characters are powered by a suite of generative AI models for perception, cognition, action, and rendering that enable developers to create AI agents in games that think, feel, and act more like a human.
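
To make the architecture concrete, here is a minimal, hypothetical sketch of the perceive-think-act-remember cycle in Python. The names (query_slm, build_prompt, the perception fields) are placeholders invented for illustration, not ACE APIs; a real integration would route these calls through locally running ACE models.

    # Hypothetical sketch of an autonomous character's decision cycle; all names are placeholders.
    def query_slm(prompt: str) -> str:
        """Stand-in for a call to a locally running small language model."""
        return "hold_position"  # canned answer for illustration only

    def build_prompt(perception: dict, recent_memory: list) -> str:
        return (
            "You are an AI squadmate. Recent memory:\n"
            + "\n".join(recent_memory)
            + "\nCurrent perception:\n"
            + "\n".join(f"{k}: {v}" for k, v in perception.items())
            + "\nWhat should you do next?"
        )

    memory: list = []  # running log of perceptions and decisions

    def tick(perception: dict) -> str:
        # Perceive -> think -> act, then remember this cycle for later decisions
        decision = query_slm(build_prompt(perception, memory[-10:]))
        memory.append(f"perceived {perception} -> chose {decision}")
        return decision

    print(tick({"audio": "gunfire to the east", "game_state": "health=80, ammo=25"}))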

Perception - Models To Sense The World

For our SLMs to make good decisions, a high-frequency stream of perception data must be provided to the autonomous game character. Several models and techniques are used to capture this sensory data:

  • Audio
    • NemoAudio-4B-Instruct - A new Audio+Text in and Text out SLM that is capable of describing a soundscape in a gaming environment
    • Parakeet-CTC-XXL-1.1B-Multilingual - Transcribes multilingual audio to text
  • Vision
    • NemoVision-4B-128k-Instruct - A new Image+Text in and Text out SLM capable of simple spatial understanding
  • Game State
    • One of the best sources of information in the game world is the game itself. Game state can be transcribed into text so that an SLM can reason about the game world (a minimal sketch follows this list)
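
The transcription format is up to each game. As a hypothetical illustration of the game-state-to-text idea, a structured snapshot can be flattened into a few lines of natural language for the cognition model to reason over; the field names below are invented for the example.

    # Hypothetical example: flatten structured game state into text an SLM can reason about.
    def game_state_to_text(state: dict) -> str:
        lines = [
            f"Location: {state['zone']}",
            f"Health: {state['health']}/100, Ammo: {state['ammo']}",
            f"Squad: {', '.join(state['squad'])}",
            f"Nearby threats: {', '.join(state['threats']) or 'none'}",
        ]
        return "\n".join(lines)

    snapshot = {
        "zone": "river crossing",
        "health": 62,
        "ammo": 18,
        "squad": ["Player", "Ally-1"],
        "threats": ["enemy squad 120m north"],
    }
    # This text, alongside audio and vision descriptions, forms the perception portion of the prompt.
    print(game_state_to_text(snapshot))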

Cognition - Models To Think About The World

Based on our esports research, most gamers make about 8-13 micro-decisions per second, called “sub-movements”. These could be simple tasks like correcting aim direction or deciding when to use a skill, or more complex tasks like deciding to start reassessing strategy.

In general, cognition happens at a very high frequency, which calls for a small language model that can meet both latency and throughput requirements. ACE SLMs for cognition include the following (a decision-loop sketch follows the list):

  • Mistral-Nemo-Minitron-8B-128k-Instruct - A state-of-the-art small language model that tops the charts in instruction-following capability, a key competency for Autonomous Game Characters
  • Mistral-Nemo-Minitron-4B-128k-Instruct - The same model, just smaller
  • Mistral-Nemo-Minitron-2B-128k-Instruct - And even smaller! Fits in as little as 1.5 GB of VRAM
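
Roughly 8-13 decisions per second leaves a budget in the neighborhood of 75-125 ms per decision. The hypothetical sketch below shows one way a game could run cognition on a fixed tick and fall back to the previous decision when inference overruns the budget; query_slm and the callbacks are placeholders, not ACE or In-Game Inference SDK APIs.

    # Hypothetical decision loop running at ~10 decisions per second.
    import time

    TICK_BUDGET_S = 0.1  # in line with the 8-13 decisions/second figure above

    def query_slm(prompt: str) -> str:
        return "adjust_aim"  # stand-in for a local SLM call

    def cognition_loop(get_perception, act, ticks: int = 5) -> None:
        last_decision = "idle"
        for _ in range(ticks):
            start = time.monotonic()
            decision = query_slm(get_perception())
            # If inference overran the tick budget, keep the previous decision instead of stalling.
            if time.monotonic() - start > TICK_BUDGET_S:
                decision = last_decision
            act(decision)
            last_decision = decision
            # Sleep off whatever is left of this tick.
            time.sleep(max(0.0, TICK_BUDGET_S - (time.monotonic() - start)))

    cognition_loop(lambda: "enemy ahead, low ammo", print)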

Action - Models To Act In The World

Taking action comes in many forms - from speech, to game actions, to longer-term planning. To perform actions effectively, developers can use a combination of models and strategies:

  • Action Selection - Given the finite set of actions that can be taken in the game, the SLM can choose the most appropriate one (as in inZOI below; see the sketch after this list)
  • Text-to-Speech - High-quality text-to-speech models like Elevenlabs.io or Cartesia can be used to convert a text response into an aural response
  • Strategic Planning - When processing and reasoning about a large corpus of data, these agents can reach out to larger models that provide a higher-level, lower-frequency strategy. Often this is a cloud LLM API or a Chain-of-Thought (CoT) series of prompts to the SLM
  • Reflection - One of the most important actions is to reflect on the results of prior actions: “Did I choose the right thing?” Reflection can produce better future actions over time and allows the character to self-correct
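
As a hypothetical illustration of action selection, the sketch below constrains the model to a finite action set and falls back to a safe default when the reply is not a valid action. The action names and the query_slm stub are invented for the example; a shipping game would use its own action vocabulary.

    # Hypothetical action selection: the SLM must pick from a finite set of game actions.
    VALID_ACTIONS = ["follow_player", "loot_area", "take_cover", "revive_teammate", "open_fire"]

    def query_slm(prompt: str) -> str:
        return "take_cover"  # stand-in for a local SLM call

    def select_action(situation: str) -> str:
        prompt = (
            f"Situation: {situation}\n"
            f"Choose exactly one action from: {', '.join(VALID_ACTIONS)}\n"
            "Answer with the action name only."
        )
        answer = query_slm(prompt).strip().lower()
        # Guard against free-form replies: fall back to a safe default if the answer is invalid.
        return answer if answer in VALID_ACTIONS else "follow_player"

    print(select_action("Teammate downed behind a rock, enemies suppressing from the ridge"))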

Memory - Models To Remember The World

Memory is crucial for Autonomous Game Characters to be able to recall their prior perceptions, actions, and cognitions. It’s also useful for tracking long-term goals and motivations that may be less relevant in the immediate context. Using a technique called Retrieval Augmented Generation (RAG), developers can use similarity searches to “remember” information relevant to the current prompt (a minimal sketch follows the list below):

  • E5-Large-Unsupervised - Using the NVIDIA In-Game Inference SDK, developers can run our optimized embedding model to compute embeddings within the game process
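
As a hypothetical illustration of the retrieval step, the sketch below indexes a character's memories and returns the ones most similar to the current query. The embed() function here is a toy bag-of-words stand-in so the example runs on its own; a real integration would swap in an embedding model such as E5-Large-Unsupervised via the In-Game Inference SDK.

    # Hypothetical RAG memory: retrieve the stored memories most similar to the current prompt.
    import math
    from collections import Counter

    def embed(text: str) -> Counter:
        # Toy bag-of-words "embedding"; a real game would call an embedding model here.
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[t] * b[t] for t in a)
        norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    memories = [
        "Player asked me to hold the bridge",
        "We found a sniper rifle in the warehouse",
        "Enemy squad ambushed us near the river",
    ]
    memory_index = [(m, embed(m)) for m in memories]

    def recall(query: str, k: int = 2) -> list:
        q = embed(query)
        ranked = sorted(memory_index, key=lambda mv: cosine(q, mv[1]), reverse=True)
        return [m for m, _ in ranked[:k]]

    # The retrieved memories are prepended to the cognition prompt.
    print(recall("where did we last see the enemy squad?"))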

Using a combination of the models and techniques above, our partners have crafted the first autonomous game character experiences. Let’s take a glimpse into the future.

Autonomous Characters Come To Games - From Smart AI Teammates To Constantly Evolving Enemies

NVIDIA ACE characters act as autonomous squad members, following orders, collecting and sharing loot, and engaging enemies. They offer strategic suggestions by independently perceiving and understanding dynamic events occurring around them, and take the necessary steps to complete an action or order, without additional prompting or assistance from the player.

Leveraging the AI models and techniques described above, their autonomy and capabilities create dynamic gameplay encounters typically only experienced with human teammates.

PUBG IP Franchise Introduces Co-Playable Character

PUBG: BATTLEGROUNDS is the genre-defining battle royale where players compete to be the last one standing. Featuring intense tactical gameplay, realistic combat scenarios, and vast maps, PUBG: BATTLEGROUNDS has become a cultural phenomenon since its launch in 2017, and it’s still one of the top five most played games on Steam each and every day.

In 2025, PUBG IP Franchise is introducing Co-Playable Character (CPC) with PUBG Ally. Built with NVIDIA ACE, Ally utilizes the Mistral-Nemo-Minitron-8B-128k-Instruct small language model that enables AI teammates to communicate using game-specific lingo, provide real-time strategic recommendations, find and share loot, drive vehicles, and fight other human players using the game’s extensive arsenal of weapons.

 

“PUBG IP Franchise's PUBG Ally and inZOI's Smart Zoi, rising as the world's first CPC (Co-Playable Character) built with NVIDIA ACE, are unlocking new and unique experiences. At KRAFTON, we’re excited by the possibilities of ACE autonomous game characters, and how AI will enhance the way we create games.” - Kangwook Lee, Head of Deep Learning Division, KRAFTON

NARAKA: BLADEPOINT PC AI Companions

In March 2025, NetEase will release a local inference AI Teammate feature built with NVIDIA ACE for NARAKA: BLADEPOINT MOBILE PC VERSION, with NARAKA: BLADEPOINT on PC also adding the feature later in 2025. NARAKA: BLADEPOINT is one of the top 10 most played games on Steam each week, and NARAKA: BLADEPOINT MOBILE boasts millions of weekly players on phones, tablets, and PCs.

Up to 40 players battle in NARAKA: BLADEPOINT MOBILE PC VERSION’s melee-focused Battle Royale, fighting across large environments with a unique set of traversal abilities, combat skills, and Far Eastern-inspired weapons, which players use to combo, parry, and counter enemies in their quest for victory.

AI Teammates powered by NVIDIA ACE can join your party, battling alongside you, finding you specific items that you need, swapping gear, offering suggestions on skills to unlock, and making plays that’ll help you achieve victory.

 

“NVIDIA enables game developers to push past expected boundaries with AI technology. NVIDIA ACE in NARAKA: BLADEPOINT MOBILE PC VERSION allows us to create autonomous AI teammates, running locally on the device, that naturally assist the player in their epic battles.” – Zhipeng Hu, Head of Thunder Fire BU, SVP of NetEase Corp.

inZOI Unveils Smart Zoi, Built with NVIDIA ACE

KRAFTON's inZOI is one of the top 10 most wishlisted games on Steam. The upcoming life simulation game enables players to alter any aspect of their game world to create unique stories and experiences. Developed using Unreal Engine 5, inZOI features realistic graphics that let players freely visualize their imagination with the game’s easy-to-use creation tools. Customize your character's appearance and outfit, and build your dream home using a wide selection of freely-movable furniture and structures.

Built with NVIDIA ACE, inZOI introduces Smart Zoi, Co-Playable Characters (CPCs) that are more reactive and realistic than anything you’ve seen before. Players will experience a comprehensive community simulation, where every Zoi in the city acts autonomously, driven by their life goals, and reacting to their environments and the events happening around them, leading to deeper levels of immersion and complex social situations.

Through a compelling comparison of pre-LM (Language Model) and post-LM scenarios, the video below demonstrates the profound impact of cutting-edge on-device LM technology on character interactions and behaviors in inZOI.

 

NVIDIA ACE Powers First AI Boss In MIR5

NVIDIA ACE characters can also act as antagonists in single-player, co-op, and multiplayer games.

Unlike conventional scripted bosses, ACE-powered enemies can continuously learn from player behavior and actions, countering your most-used strategies, abilities, and spells, and forcing you to adapt. Fight bosses with dynamically adjusting attacks that require you to level up your IRL skills instead of simply memorizing attack patterns. And enter massively multiplayer worlds with persistent enemy factions who adapt to player tactics. The possibilities are endless, and with dynamic AI no two fights should be the same, making gameplay more exciting and engaging.

Wemade Next’s MIR5, the latest installment in the immensely popular Legend of Mir franchise, will use NVIDIA ACE to power boss encounters. With ACE technologies, bosses will learn from previous encounters against players, adapting to tactics, skills, and gear used by players in the MMORPG.

MIR5’s AI will evaluate the loadout and setup of the humans it’s currently facing, compare those to past encounters, and decide on the best course of action to attain victory. Using ACE technologies, boss fights are unique, and going back to farm an already-beaten boss results in the fight playing out differently, keeping players on their toes and making gameplay more engaging.

 

“Together with NVIDIA, we have begun a new era of gaming with NVIDIA ACE autonomous game characters. MIR5’s AI bosses are a milestone moment in gaming, enabling unique boss encounters with every play session. We’re excited to see how this technology transforms games.” - Jung Soo Park, CEO, Wemade Next

NVIDIA ACE Powers New Game Genres With Conversational AI

Using NVIDIA ACE, several developers are crafting entirely new types of gameplay, where conversations, confrontations, and interactions are driven exclusively by player prompts and AI-generated responses. These characters are not only conversational, but also take action in games, giving players new, immersive ways of interacting with the game.

Dead Meat

Meaning Machine’s Dead Meat is an upcoming murder mystery game where players can ask the suspects absolutely anything, using their voice or keyboard. Want to discuss the suspect’s alibi? Probe them on the meaning of life? Confess your love? Your words hold power, and anything goes. Persuade them, manipulate them, threaten them, or even charm them into spilling their secrets. The interrogation room is your sandbox, and your imagination is the only limit. What will you ask to get to the truth?

 

At CES 2025, Dead Meat was shown running in real-time on a GeForce RTX 50 Series GPU, generating dialogue locally for the very first time. Previously, the game utilized LLMs in the cloud, but by partnering with NVIDIA, Meaning Machine created a version of Dead Meat that runs locally “on device”.

Using Meaning Machine’s Game Conscious AI, built with NVIDIA ACE and the NVIDIA Mistral-NeMo-Minitron-8B-128K small language model (SLM), ElevenLabs’ text-to-speech (TTS), and OpenAI’s Whisper automatic speech recognition (ASR), the new version of Dead Meat delivers the same character depth that was being achieved in the cloud, but now the dialogue text is generated locally, on a GPU.

To achieve this dialogue quality, Meaning Machine fine-tuned the NVIDIA small language model using their own dialogue data, then integrated that model with their Game Conscious™ system. Get ready to unleash your inner detective later this year, and stay tuned to Dead Meat’s Steam page for the upcoming demo.

AI People

AI People is a unique sandbox game from GoodAI where players create and play out scenarios with AI-driven NPCs that autonomously interact with each other, their environment, and the player, creating a dynamic, AI-generated narrative.

Built with large language and speech recognition models, running locally on a GeForce RTX GPU or in the cloud, each NPC autonomously interacts with the environment and the objects within it. These NPCs learn, experience emotions, pursue their goals, dream, talk, and create a dynamic story for you to live through.

Take a first look at this brand new gameplay experience in the AI People video below:

 

ZooPunk Tech Demo

F.I.S.T.: Forged In Shadow Torch was a well-received Metroidvania-esque game, featuring DLSS, Reflex, and Ray Tracing. Its developer, TiGames, has now unveiled ZooPunk, a tech demo for a future project.

Built with NVIDIA ACE, Audio2Face, and the first use of on-device in-game Stable Diffusion image generation, TiGames’ AI tech demo leads players through conversations and interactions on a floating kitchen above a sea of clouds.

Speak with allies about intel gathered on a mission, and head to the pier to design a new warship that’ll aid Rayton in his war against the Mechanoid Empire. Using locally-hosted Stable Diffusion image generation, request new ship artwork and adjust its paintjob via voice inputs, rather than by adjusting dozens of sliders.

 

Stay tuned to the TiGames website to learn more about their upcoming project.

NVIDIA Audio2Face Accelerates Game Creation

In addition to driving autonomous game characters, developers are also using ACE technologies to drive AI-powered facial animation and lip sync in games with NVIDIA Audio2Face. Using Audio2Face, developers can easily animate characters who previously featured only basic lip movement. Voice actors record their lines, which are then fed into Audio2Face, where characters’ faces, eyes, tongues, and lips are animated accordingly, saving developers considerable time and money.

Alien: Rogue Incursion

Survios’ Alien: Rogue Incursion is available now on PCVR and PSVR2 with NVIDIA Audio2Face technology. An action-horror Virtual Reality game, Alien: Rogue Incursion features an original story that fully immerses players within the terrors of the Alien universe. This labor of love was designed by Alien fans for Alien fans.

During players’ journey for survival, they will encounter a few humanoid survivors, all of whom are animated with Audio2Face, which generates realistic facial animation based on the voice actors’ recorded dialogue.

 

World of Jade Dynasty

Perfect World Games’ World of Jade Dynasty is an Unreal Engine 5-based massively multiplayer online role-playing game, and sequel to the developer’s incredibly popular Jade Dynasty. In the game, you can experience the thrill of massive 100-player battles, the excitement of dungeon exploration, and the freedom of soaring through the clouds on your sword. Whether you're into PVP, PVE, or PVX, there's a unique path to immortality waiting for you.

A new video demonstrates the developer’s progress in adding NVIDIA Audio2Face to a large cast of characters in both Chinese and English, and goes behind the scenes to show how animations are dynamically adjusted based on different variables detected in the recorded voice acting. And should adjustments be needed, developers can of course dive in and make tweaks as they see fit, tailoring dialogue for specific scenes.

NVIDIA ACE: Autonomous Game Characters

With generative AI, players will experience new forms of gameplay that take gaming to the next level. Top game developers are already starting to showcase what’s possible with marquee titles like PUBG: BATTLEGROUNDS, inZOI, MIR5, and NARAKA: BLADEPOINT MOBILE PC VERSION. These examples are just the tip of the iceberg - stay tuned for an exciting 2025.

Learn more about NVIDIA ACE, future developments, and early access programs. Also, be sure to check out all of our other GeForce RTX 50 Series announcements to see how we’re further improving your experiences in games and apps, and delivering new innovations that advance the PC industry.