Nvidia's new generative AI will revolutionize game development

The Omniverse Avatar Cloud Engine, or ACE, can be used at every level of game development, which opens up a whole world of possibilities.


AI has been all the rage since OpenAI launched ChatGPT. Companies like Ubisoft are also looking to incorporate AI technology into game development.

Jin was created using Nvidia's ACE generative AI technology.

Nvidia recently unveiled a new AI technology that promises to revolutionize the way character models are made. During Nvidia's keynote at Computex 2023, CEO Jensen Huang announced the "Omniverse Avatar Cloud Engine," or ACE, which can create real-time, interactive AI NPCs that feature voice dialogue and facial animations.

ACE is a new "suite of real-time solutions" from Nvidia that game developers can use to create characters that respond to a player's actions. These NPCs have voices, facial expressions, and lip-sync to match.

In the demo, a player speaks to the NPC using his own voice. The NPC does its best to respond to what the player is saying and points him toward a new side quest he can complete.

Nvidia partnered with Convai (Conversational AI for Virtual Worlds) in this new tech demo "to help optimize and integrate ACE for Games modules into an immersive and dynamic interaction with a non-playable character named Jin. The demo is also enhanced with ray tracing and performance multiplying NVIDIA DLSS 3."

Nvidia ACE can run either locally or from the cloud. The ACE suite includes Nvidia's NeMo tools for deploying large language models (LLMs), Riva for speech-to-text and text-to-speech, and other generative AI technologies.
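To make the flow concrete, here is a minimal, purely illustrative sketch of how such a conversational NPC loop fits together: the player's speech is transcribed, a persona-conditioned language model produces a reply, and the reply is synthesized into audio that drives lip-sync. None of the functions below are real NVIDIA APIs; they are hypothetical stand-ins for components like Riva (speech recognition and synthesis) and a NeMo-deployed LLM.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an ACE-style NPC dialogue turn.
# Every function here is a stub standing in for a real service
# (ASR, LLM, TTS, audio-driven facial animation) — not NVIDIA's API.

@dataclass
class NpcResponse:
    text: str                       # what the NPC says
    audio: bytes                    # synthesized speech (stubbed)
    visemes: list = field(default_factory=list)  # lip-sync cues

def speech_to_text(player_audio: bytes) -> str:
    """Stand-in for speech recognition: transcribe the player's voice."""
    return player_audio.decode("utf-8")  # toy "transcription"

def generate_reply(transcript: str, persona: str) -> str:
    """Stand-in for an LLM conditioned on the NPC's persona."""
    if "quest" in transcript.lower():
        return f"{persona}: I might have a job for you, actually."
    return f"{persona}: Welcome to the ramen shop."

def text_to_speech(text: str) -> bytes:
    """Stand-in for speech synthesis."""
    return text.encode("utf-8")

def audio_to_visemes(audio: bytes) -> list:
    """Stand-in for audio-driven lip-sync: one cue per word here."""
    return audio.split(b" ")

def npc_turn(player_audio: bytes, persona: str = "Jin") -> NpcResponse:
    """One full dialogue turn: hear -> think -> speak -> animate."""
    transcript = speech_to_text(player_audio)
    reply = generate_reply(transcript, persona)
    audio = text_to_speech(reply)
    return NpcResponse(text=reply, audio=audio,
                       visemes=audio_to_visemes(audio))

response = npc_turn(b"Got any quests for me?")
print(response.text)
```

In a real deployment, each stage would be a network call to a hosted model (or a locally run one), which is why latency and where the models run matter so much for keeping the conversation feeling live.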

The demo was built in Unreal Engine 5 and highlights Nvidia's ray-tracing technologies. The futuristic ramen shop is visually stunning and fully immersive, especially when viewed in 4K.

While the demo's visuals look stunning, the conversation between the player and the NPC leaves much to be desired. There's more work to be done on the AI character's facial expressions and tone. Honestly, even the player's voice sounds a bit robotic and unnatural.

However, the technology does have some interesting potential. The Verge reports that during the Computex pre-briefing, Nvidia’s VP of GeForce Platform Jason Paul confirmed that the technology can be used for multiple characters. In theory, ACE can allow AI NPCs to talk to each other.

Paul was the first to admit that he hadn't seen this in practice yet.

As the technology grows and advances, we will hopefully see widespread use of Nvidia ACE. It's easy to imagine Cyberpunk 2077’s Night City or GTA 5’s Los Santos filled with NPCs that players can talk to and interact with. The technology will open up a lot of possibilities in terms of immersive gaming and world-building. It would also make for some pretty interesting stories and unique experiences.  

Hopefully, Ubisoft and Blizzard Entertainment can take advantage of this technology as well.



Darryl Lara

Darryl has been gaming since the early 90s, loves to read books and watch TV. He spends his free time outside of gaming and books by riding his motorcycle and taking photographs. You can find Darryl on Instagram. Check him out on Steam and Xbox too.