Hardware maker NVIDIA is launching a series of developer tools under NVIDIA Omniverse focused on making life in the Metaverse more realistic.
Multinational technology firm and chipmaker NVIDIA is ramping up its efforts to cement its presence in the Metaverse. Early this week, the company unveiled a new set of developer tools and frameworks focused on Metaverse environments.
The new tools aim to make life in the Metaverse more realistic: think artificial intelligence (AI) capabilities, simulations, and other creative assets. They are released under NVIDIA Omniverse, a platform created to expand and connect the digital world.
Creators who currently use the NVIDIA Omniverse Kit, together with apps like Nucleus, Audio2Face, and Machinima, will be able to access the new developer tools. According to NVIDIA’s announcement, one primary function of the tools is to enhance the creation of “accurate digital twins and realistic avatars.”
The Audio2Face application has also been updated, this time with a focus on digital identity. According to NVIDIA’s official statement on the new toolkit, users will now be able to direct the emotion of digital avatars over time, including full-face animation.
NVIDIA’s latest toolkit arrives at an opportune moment: the quality of Metaverse interaction remains a contentious topic in the industry, and after some underwhelming past events, both developers and users are prioritizing the quality of experiences over their quantity.
For instance, Decentraland held the first-ever Metaverse fashion week last spring. Those who “attended” were nearly unanimous in criticizing the lack of quality in the digital environments and garments, and the avatars users interacted with drew particularly underwhelming reviews.
A Closer Look at NVIDIA Omniverse ACE
The new NVIDIA toolkit is looking to provide the necessary technology for the continuous development of the Metaverse’s nascent infrastructure. It comes in the form of the NVIDIA Omniverse Avatar Cloud Engine (ACE), a new tool for creating 3D-based human models.
Even at this early stage, developers say NVIDIA Omniverse ACE will do wonders for building “virtual assistants and digital humans.” With Omniverse ACE, developers can build, configure, and deploy their avatar applications across practically any engine, in either a public or private cloud.
NVIDIA’s ACE tool will use a combination of AI and computer graphics to transform how computing systems understand and articulate language with real people, welcoming AI into the Metaverse.
“[Three-dimensional] content is especially critical for the metaverse as we need to put stuff in the virtual world,” Sanja Fidler, vice president of AI research at NVIDIA, said in a press briefing. “We believe that AI is existential for 3D content creation, especially for the metaverse,” Fidler added.
Imagine Amazon’s smart home assistant, Alexa, with a face. That is the kind of product NVIDIA Omniverse ACE would enable, letting developers produce and customize avatars that are interactive and able to communicate visually.
The hardware maker is adamant that the ACE technology would be critical in fostering the future growth of computer-based interactions within the Metaverse.
“The Metaverse is a multitrillion-dollar opportunity that organizations know they can’t ignore, but many struggle to see a clear path forward for how to engage with it,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA.
Beyond the Metaverse space, NVIDIA Omniverse ACE might also appeal to developers and creators of projects that aspire to serve their audiences in a more digital setting. The ACE technology’s language model can identify “human intent” and offer recommendations accordingly, allowing for the seamless development of AI customer-service agents.
With the list of utilities in virtual worlds growing by the minute—real estate and land plot purchases, fashion and shopping experiences, legal services—NVIDIA’s ACE technology might make it easier for users to converse efficiently with digital bots. The result would be a more effortless user experience, in which distinguishing automation from an actual person becomes genuinely difficult.
This latest iteration from NVIDIA also includes NVIDIA PhysX, an “advanced real-time engine for simulating realistic physics.” This means developers can now build interactions inside the Metaverse that react realistically and observe the laws of physics.
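To give a rough sense of what a real-time physics engine such as PhysX computes every frame, here is a minimal conceptual sketch in Python: a ball dropped in a virtual world falls under gravity and bounces with some energy loss. This is only an illustration of the general idea of per-frame physics stepping, not the PhysX API; the constants and function are hypothetical.

```python
# Conceptual sketch of a per-frame physics step (semi-implicit Euler).
# A ball dropped from 10 m accelerates under gravity and bounces,
# losing some energy on each impact. Illustrative only; NOT PhysX.

GRAVITY = -9.81      # gravitational acceleration, m/s^2
RESTITUTION = 0.6    # fraction of speed kept after a bounce
DT = 1.0 / 60.0      # one step per frame at 60 frames per second

def step(height, velocity):
    """Advance the ball by one frame and handle ground collision."""
    velocity += GRAVITY * DT       # integrate acceleration into velocity
    height += velocity * DT        # integrate velocity into position
    if height <= 0.0:              # collision with the ground plane
        height = 0.0
        velocity = -velocity * RESTITUTION
    return height, velocity

# Simulate two seconds of a ball dropped from 10 m.
h, v = 10.0, 0.0
for _ in range(120):
    h, v = step(h, v)
```

A real engine like PhysX performs the same basic loop at far larger scale, resolving thousands of rigid bodies, joints, and contacts per frame.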
NVIDIA’s AI technology has been integral in creating interactive spaces in the digital world thus far. With the NVIDIA Omniverse ACE, the hardware maker is not just focused on improving animations but on evolving the Web3 space into a true-to-life setting that future Metaverses may want to emulate.
Get more news updates
Get more NFT news updates at Omnimint News. For more information on Omnimint and details on how to join our community, follow our Twitter or subscribe to our Telegram channel, and feel free to submit your article.