With Nvidia ACE, video games will be more immersive than ever
- Steve Johan
- Mar 19, 2024
- 3 min read

The world of video games is constantly evolving, pushing the boundaries of graphics, gameplay mechanics, and storytelling. One area ripe for innovation is the realm of non-playable characters (NPCs). Often relegated to repetitive lines and scripted responses, NPCs can feel like one-dimensional placeholders rather than living, breathing inhabitants of the game world. Nvidia, a leader in graphics processing technology, aims to change this with its groundbreaking Avatar Cloud Engine (ACE).
Nvidia has unveiled a new demonstration of its ACE technology, which uses AI to make NPCs in video games far more interactive. Several studios are already working to integrate it into their games.
Nvidia unveiled the new demonstration of ACE, its digital avatar technology aimed at video games as well as many other markets and industries, at its GPU Technology Conference (GTC).
Announced in May 2023, Nvidia ACE was officially presented at CES 2024 in Las Vegas, where attendees could try out this new way of interacting with NPCs in a video game. This time, the demo places us in a real game sequence and, within a few minutes, seems to deliver on its promises.
A convincing new demo
While the previous demonstration was built in collaboration with Convai, Nvidia's start-up partner for AI-driven character creation, this one goes beyond a proof of concept.
In Covert Protocol, you play as an agent investigating a hotel lobby to obtain a target's room number. You can interact with three characters, each with their own personality, manner of speaking, and way of answering questions.
Developed this time with the studio Inworld AI, this tech demo shows clear progress over the first: Nvidia Riva, which transcribes your speech into text so you can talk to NPCs, seemed far more reliable to us, as did the characters' voice synthesis, which sounds less and less robotic.
Now that the player has an objective, the interactive potential of the technology becomes clear: you can stray more or less widely from the main narrative thread, and every exchange plays out differently. Characters organically adapt their tone to your questions, to whether you repeat them, or to whether you strike a sensitive chord. They stay faithful to their persona and remember your previous interactions, so they never react quite the same way twice.
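To picture how such an NPC loop might work, here is a minimal, purely illustrative sketch in Python. It is not Nvidia's or Inworld's actual code: the `NPC` class, the `toy_llm` stand-in, and the character "Diego" are all assumptions for illustration, and plain text stands in for the speech recognition and voice synthesis that Riva handles in ACE. The point is only to show how a fixed persona and a running memory can make a repeated question draw a different, in-character reaction.

```python
from dataclasses import dataclass, field


@dataclass
class NPC:
    name: str
    persona: str  # fixed character description the NPC must stay faithful to
    memory: list[tuple[str, str]] = field(default_factory=list)  # prior exchanges

    def build_prompt(self, player_line: str) -> str:
        # The persona is prepended to every request so the model stays in
        # character; the memory lets it react differently to repeated questions.
        history = "\n".join(f"Player: {p}\n{self.name}: {r}" for p, r in self.memory)
        return (
            f"You are {self.name}. {self.persona}\n"
            f"Conversation so far:\n{history}\n"
            f"Player: {player_line}\n{self.name}:"
        )

    def respond(self, player_line: str, llm) -> str:
        reply = llm(self.build_prompt(player_line))  # llm: any text-generation callable
        self.memory.append((player_line, reply))     # remember this exchange
        return reply


def toy_llm(prompt: str) -> str:
    # Toy stand-in for a real language model, so the sketch runs on its own.
    # A repeated question (more than one "Player:" turn) changes the NPC's tone.
    if prompt.count("Player:") > 1:
        return "I already told you: I don't give out guests' room numbers."
    return "Good evening. I'm afraid I can't share guests' room numbers."


concierge = NPC(name="Diego", persona="A guarded hotel concierge wary of nosy guests.")
print(concierge.respond("What room is Mr. Kovacs staying in?", toy_llm))  # polite refusal
print(concierge.respond("What room is Mr. Kovacs staying in?", toy_llm))  # irritated
```

In a real pipeline, the `llm` callable would be backed by a large language model, and each exchange would be bracketed by speech-to-text and text-to-speech stages such as those Riva provides.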
Such an advance lends itself well to RPGs such as Skyrim or even Baldur's Gate 3, but also, more broadly, to games with emergent gameplay such as immersive sims. We think in particular of the Hitman series or Deus Ex (RIP), which offer many ways to reach your objective, notably by interacting with NPCs.
A technology ready for use
The Nvidia ACE suite can be easily integrated by game developers into their engines via a plugin for Unreal Engine 5 and Unity, with, we imagine, technical support for studios using their own in-house engines. We have already mentioned the few studios experimenting with the technology for their upcoming games, but no commercial project has been confirmed yet.
Other industries will also be able to use these virtual characters: customer service, to replace heavily scripted chatbots, but also healthcare, where patients could interact with virtual assistants for the most routine requests.
Note that companies and developers wanting to use ACE will go through Nvidia's various partners and specialized studios, such as Convai, Inworld AI, and UneeQ, each of which has its own expertise.