NVIDIA Omniverse Avatar Cloud Engine Unveiled

At SIGGRAPH 2022, the company announced its new suite of cloud-native AI models and services that give businesses of any size the power to build and deploy sophisticated avatars and virtual assistants.

Today at SIGGRAPH 2022, NVIDIA announced the NVIDIA Omniverse Avatar Cloud Engine (ACE), a suite of cloud-native AI models and services that make it easier to build and customize lifelike virtual assistants and digital humans. By bringing these models and services to the cloud, businesses of any size can instantly access the computing power needed to create and deploy assistants and avatars that understand multiple languages, respond to speech prompts, interact with the environment, and make intelligent recommendations.

“Our industry has been on a decades-long journey teaching computers to communicate and carry out complex tasks with ease that humans take for granted,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA. “NVIDIA ACE brings this within reach. ACE combines many sophisticated AI technologies, allowing developers to create digital assistants that are on a path to pass the Turing test.”

ACE is built on NVIDIA’s Unified Compute Framework, which provides access to the software tools and APIs needed to harness the wide range of skills required for realistic, fully interactive avatars. These include NVIDIA Riva for developing speech AI applications, NVIDIA Metropolis for computer vision and intelligent video analytics, NVIDIA Merlin for high-performing recommender systems, NVIDIA NeMo Megatron for large language models with natural language understanding, and NVIDIA Omniverse for AI-enabled animation.
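The announcement describes ACE as a collection of cloud microservices rather than a single API. As a rough, purely illustrative sketch, assuming a turn-based pipeline that chains speech recognition, a language model, speech synthesis, and audio-driven facial animation, the Python below shows how such services might be composed per conversational turn. Every class and method name here is a hypothetical stand-in, not the actual Riva, NeMo Megatron, or Audio2Face interface.

# Hypothetical sketch of one avatar "turn" built from independent AI
# microservices, in the spirit of the stack ACE describes. The stub classes
# below are placeholders, not real NVIDIA APIs.

from dataclasses import dataclass


@dataclass
class AvatarResponse:
    text: str        # what the avatar says
    audio: bytes     # synthesized speech
    animation: dict  # facial animation data driven by the audio


class StubASR:
    """Stands in for a speech-recognition service (e.g. Riva ASR)."""
    def transcribe(self, audio: bytes) -> str:
        return "where is the nearest charging station?"


class StubLLM:
    """Stands in for a large language model (e.g. NeMo Megatron)."""
    def respond(self, user_text: str) -> str:
        return f"Here is what I found for: {user_text}"


class StubTTS:
    """Stands in for a speech-synthesis service (e.g. Riva TTS)."""
    def synthesize(self, text: str) -> bytes:
        return text.encode("utf-8")  # pretend this is audio


class StubFaceAnimator:
    """Stands in for audio-driven facial animation (e.g. Audio2Face)."""
    def animate(self, speech_audio: bytes) -> dict:
        return {"blendshape_curves": [], "duration_s": len(speech_audio) / 16000}


def avatar_turn(user_audio: bytes, asr, llm, tts, face) -> AvatarResponse:
    """One conversational turn: hear, think, speak, animate."""
    user_text = asr.transcribe(user_audio)    # speech -> text
    reply_text = llm.respond(user_text)       # text -> reply text
    reply_audio = tts.synthesize(reply_text)  # reply text -> audio
    animation = face.animate(reply_audio)     # audio -> facial animation
    return AvatarResponse(reply_text, reply_audio, animation)


if __name__ == "__main__":
    turn = avatar_turn(b"...", StubASR(), StubLLM(), StubTTS(), StubFaceAnimator())
    print(turn.text)

The point of the sketch is only that each capability lives behind its own service boundary, so any one piece (the recognizer, the language model, the animator) could be scaled or swapped in the cloud without rebuilding the avatar.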

Check out the demo showing Audio2Face in NVIDIA Omniverse Avatar Cloud Engine (ACE) powering a MetaHuman with natural language processing, a custom voice model, and animation behaviors in real time. MetaHuman in Unreal Engine image courtesy of Epic Games.

NVIDIA expects the assistants and avatars ACE enables to transform interactions in gaming, entertainment, banking, transportation, and hospitality.

Two applications built on ACE are NVIDIA’s Project Maxine and Project Tokkio. Project Maxine brings state-of-the-art video and audio features to virtual collaboration and content creation applications. Project Tokkio enables interactive avatars that see, perceive, converse intelligently, and provide recommendations to enhance customer service in places like restaurants.

Omniverse ACE Support

Developers of virtual assistants and digital humans plan to use ACE to accelerate their avatar development efforts.

“Reallusion developers can build convincing characters quickly and easily, and now we step forward with the latest advancements in artificial intelligence and real-time rendering,” said Elvis Huang, head of innovation, Innovation Development Department, Reallusion, Inc. “Pairing our Character Creator and iClone tools with NVIDIA’s Omniverse Avatar Cloud Engine will be a great way to create life-like avatars that interact with end users realistically.”

“Demand for digital humans and virtual assistants continues to grow exponentially across industries, but creating and scaling them is getting increasingly complex,” noted Kevin Krewell, principal analyst, TIRIAS Research. “NVIDIA's Omniverse Avatar Cloud Engine brings together all of the AI cloud-based microservices needed to more easily create and deliver lifelike, interactive avatars at scale.”

Learn more about Omniverse Avatar Cloud Engine here.

Source: NVIDIA

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.