NVIDIA Launches ‘Omniverse Avatar Cloud Engine’ Early Access

A new suite of cloud-native AI microservices helps developers build and deploy intelligent virtual assistants and digital humans at scale, adding animation to any avatar built on virtually any platform, such as Epic Games’ MetaHuman technology or cross-platform game avatar tools, and deployed on any cloud.

NVIDIA has launched the early-access program for NVIDIA Omniverse Avatar Cloud Engine (ACE), a suite of cloud-native AI microservices that make it easier to build and deploy intelligent virtual assistants and digital humans at scale. ACE eases avatar development, delivering the AI building blocks needed to add intelligence and animation to any avatar, built on virtually any engine and deployed on any cloud.

The AI assistants can be designed for organizations across industries, helping creators accelerate development of 3D worlds and the metaverse while enabling organizations to enhance existing workflows and unlock new business opportunities.

Bring Interactive AI Avatars to Life with Omniverse ACE

Omniverse ACE enables seamless integration of NVIDIA’s AI technologies — including pre-built models, toolsets, and domain-specific reference applications — into avatar applications built on most engines and deployed on public or private clouds.

Unveiled in September and shared with select partners to capture early feedback, ACE is now seeking partners to provide feedback on the microservices, collaborate on improving the product, and push the limits of what’s possible with lifelike, interactive digital humans.

The early access program provides access to the following:

  • 3D animation AI microservice for third-party avatars, which uses Omniverse Audio2Face generative AI to bring characters in Unreal Engine and other rendering tools to life by creating realistic facial animation from just an audio file.
  • 2D animation AI microservice Live Portrait, which enables animation of 2D portraits or stylized human faces using live video feeds.
  • Text-to-speech AI microservice, which uses NVIDIA Riva TTS to synthesize natural-sounding speech from raw transcripts, without requiring additional cues such as speech patterns or rhythms (see the sketch after this list).
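
For context, here is a minimal sketch of what a text-to-speech request might look like from Python using the nvidia-riva-client package. The server address, voice name, and audio settings below are assumptions for illustration; they are not taken from the ACE early-access materials, and the ACE-hosted microservice may expose a different interface.

```python
import wave

import riva.client  # pip install nvidia-riva-client

# Assumption: a Riva server is reachable at this address.
auth = riva.client.Auth(uri="localhost:50051")
tts = riva.client.SpeechSynthesisService(auth)

# Synthesize speech from a raw transcript; the voice name is a placeholder.
response = tts.synthesize(
    text="Hi, I'm your virtual assistant. How can I help you today?",
    voice_name="English-US.Female-1",
    language_code="en-US",
    sample_rate_hz=44100,
)

# response.audio holds raw 16-bit mono PCM; wrap it in a WAV container for playback.
with wave.open("reply.wav", "wb") as out:
    out.setnchannels(1)
    out.setsampwidth(2)  # 16-bit samples
    out.setframerate(44100)
    out.writeframes(response.audio)
```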

Watch: “Bringing Avatars to Life with NVIDIA Omniverse Avatar Cloud Engine”

Program members will also get access to tooling, sample reference applications, and supporting resources to help get started.

Avatars Make Their Mark Across Industries

Omniverse ACE can help teams build interactive digital humans by providing the following:

  • Easy animation of characters, even for users with minimal expertise.
  • The ability to deploy on the cloud, making avatars usable virtually anywhere, such as a quick-service restaurant kiosk, a tablet, or a virtual-reality headset (a hypothetical client-side sketch follows this list).
  • Interoperability between NVIDIA AI and other solutions, enabled by a plug-and-play suite built on the NVIDIA Unified Compute Framework (UCF).
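
To make the cloud-deployment point concrete, the following is a purely hypothetical sketch of how a thin client on a kiosk or tablet might hand audio to a cloud-hosted animation microservice and receive animation data back. The endpoint, payload, and response fields are all invented for illustration; the actual ACE microservice interfaces are only documented within the early-access program.

```python
import requests

# Hypothetical endpoint; not a real ACE URL.
ANIMATION_URL = "https://ace.example.com/v1/animate"

# Send a short audio clip and an avatar identifier; both names are illustrative.
with open("reply.wav", "rb") as f:
    response = requests.post(
        ANIMATION_URL,
        files={"audio": ("reply.wav", f, "audio/wav")},
        data={"avatar_id": "kiosk-host-01"},
        timeout=30,
    )
response.raise_for_status()

# The client would pass returned animation data (for example, per-frame facial
# blendshape weights) to whatever engine renders the avatar locally or in the cloud.
frames = response.json().get("frames", [])
print(f"Received {len(frames)} animation frames")
```

Because the heavy AI inference runs in the cloud, the device in front of the user only needs enough capability to capture input and display the rendered avatar.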

Partners, including Ready Player Me and Epic Games, have experienced how Omniverse ACE can enhance workflows for AI avatars.

The Omniverse ACE animation AI microservice supports 3D characters from Ready Player Me, a platform for building cross-game avatars.

“Digital avatars are becoming a significant part of our daily lives,” said Timmu Tõke, CEO and co-founder of Ready Player Me. “People are using avatars in games, virtual events, and social apps, and even as a way to enter the metaverse. We spent seven years building the perfect avatar system, making it easy for developers to integrate in their apps and games and for users to create one Avatar to explore various worlds — with NVIDIA Omniverse ACE, teams can now more easily bring these characters to life.”

Epic Games’ advanced MetaHuman technology transformed the creation of realistic, high-fidelity digital humans. Omniverse ACE, combined with the framework, will make it even easier for users to design and deploy engaging 3D avatars.

In VTubing, a new form of live streaming, users embody a 2D avatar and interact with viewers. With Omniverse ACE, creators can move their avatars from 2D animation, including photos and stylized faces, into 3D. In addition, users can render the avatars from the cloud and animate the characters from anywhere.

Early access to Omniverse ACE is available to developers and teams building avatars and virtual assistants. More information is available from NVIDIA.

Source: NVIDIA

Debbie Diamond Sarto is news editor at Animation World Network.