Interactive AI Material Generation and Editing in NVIDIA Omniverse

We present an AI-based tool for interactive material generation within the NVIDIA Omniverse environment. Our approach leverages a state-of-the-art latent diffusion model, modified to suit the task of material generation. Specifically, we replace the standard convolution layers with circular-padded convolution layers. This adaptation produces seamlessly tiling textures, as the circular padding blends opposite image edges into one another. Moreover, we extend the model by training additional decoders that generate material properties such as surface normals, roughness, and ambient occlusion. Each decoder consumes the same latent tensor produced by the denoising UNet and outputs a specific material channel. Finally, to enable real-time performance and user interactivity, we optimize the model with NVIDIA TensorRT, improving inference speed for an efficient and responsive tool.
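The tiling property of circular padding can be illustrated with a minimal NumPy sketch (this is an illustration of the general technique, not the authors' implementation; the helper `circular_conv2d` is hypothetical): because opposite edges wrap around to meet each other, filtering a texture and then tiling it gives the same result as filtering the tiled texture, so no seam appears.

```python
import numpy as np

def circular_conv2d(image, kernel):
    """2-D convolution with circular ('wrap') padding.
    Illustrative helper, not the paper's actual code: opposite
    edges wrap around, so the filtered texture tiles seamlessly."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    # Circular padding: border pixels are taken from the opposite edge.
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="wrap")
    h, w = image.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

texture = np.random.default_rng(0).random((8, 8))
blur = np.ones((3, 3)) / 9.0  # simple box filter

# Filtering then tiling equals filtering the tiled texture:
# the convolution is seamless across the tile boundary.
filtered = circular_conv2d(texture, blur)
filtered_tiled = circular_conv2d(np.tile(texture, (2, 2)), blur)
assert np.allclose(np.tile(filtered, (2, 2)), filtered_tiled)
```

In a deep-learning framework the same effect is typically obtained per layer, e.g. PyTorch's `torch.nn.Conv2d(..., padding_mode="circular")`, which is a drop-in replacement for a zero-padded convolution.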

Authors

Hassan Abu Alhaija (NVIDIA)
Michael Babcock (NVIDIA)
James Lucas (NVIDIA)
David Tyner (NVIDIA)
Rajeev Rao (NVIDIA)
Maria Shugrina (NVIDIA)


Award

Real-Time Live! Best in Show