NPCs, or “Non-Player Characters”, are video game characters that are not controlled by a real player and typically serve as story elements, allies, or enemies across all kinds of games. They usually have scripted dialogue and rigidly defined behaviors that leave little room for variation or realism.
However, the rise and evolution of Artificial Intelligence can give them far more realistic behaviors and enrich the gaming experience. That is precisely what NVIDIA proposes with its technology NVIDIA ACE, or NVIDIA Avatar Cloud Engine.
It is a suite of Artificial Intelligence models that will allow developers to give greater depth to the NPCs the player interacts with. Imagine the possibilities of an NPC that could respond the way ChatGPT and other language-based AIs do.
NVIDIA ACE will consist of several technologies, ranging from conversational models to speech recognition, text-to-speech, and systems that convert speech into three-dimensional facial expressions.
For example, NVIDIA Riva will allow NPCs to recognize spoken audio and convert text back into speech, enabling fluent conversations between the player and NPCs. NVIDIA Omniverse Audio2Face, which we have already seen before, animates a character’s facial expressions based on speech.
Meanwhile, NVIDIA NeMo allows developers to customize the language models by feeding them data about the game’s story or the characters’ backgrounds.
All of these technologies are integrated into NVIDIA ACE, but developers will be able to use all of them or only some. ACE will run both in the cloud and on local servers, depending on each developer’s implementation.
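To make the architecture concrete, here is a minimal sketch of how such an NPC conversation loop could be wired together. This is purely illustrative: the function names are hypothetical stand-ins, not NVIDIA APIs, and each stage is a stub marking where the real component (Riva for speech, NeMo for the language model, Audio2Face for animation) would plug in.

```python
# Hypothetical sketch of an ACE-style NPC dialogue pipeline.
# None of these names come from NVIDIA's SDKs; each stub marks
# where the corresponding real component would be called.

def speech_to_text(audio: bytes) -> str:
    """Stub for speech recognition (the role NVIDIA Riva plays)."""
    # Pretend the audio buffer is already a transcript.
    return audio.decode("utf-8")

def generate_reply(transcript: str, lore: str) -> str:
    """Stub for a language model customized with game lore (NeMo's role)."""
    return f"[{lore}] You said: {transcript}"

def text_to_speech(reply: str) -> bytes:
    """Stub for speech synthesis (also Riva's role)."""
    return reply.encode("utf-8")

def animate_face(speech_audio: bytes) -> list[str]:
    """Stub for audio-driven facial animation (Audio2Face's role)."""
    # One placeholder viseme per chunk of audio.
    return ["viseme"] * max(1, len(speech_audio) // 8)

def npc_turn(player_audio: bytes, lore: str) -> tuple[bytes, list[str]]:
    """One conversational turn: hear, think, speak, animate."""
    transcript = speech_to_text(player_audio)
    reply = generate_reply(transcript, lore)
    audio = text_to_speech(reply)
    frames = animate_face(audio)
    return audio, frames

if __name__ == "__main__":
    audio, frames = npc_turn(b"Where is the blacksmith?", lore="Tavern keeper")
    print(audio.decode("utf-8"), f"({len(frames)} animation frames)")
```

Whether each stage runs in the cloud or on a local server is exactly the implementation choice the article describes: the pipeline shape stays the same, only the backends behind each stub change.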