Repo of the Week: Instant Neural Graphics Primitives

ODSC - Open Data Science
3 min read · Mar 28, 2022

A team of NVIDIA researchers, including Thomas Müller, Alex Evans, Christoph Schied, and Alexander Keller, has demonstrated a new method that enables the efficient use of artificial neural networks for rendering computer graphics. This is a significant development: rendering is a notoriously slow process, and prior neural approaches suffered from relatively long training times.

Also shown are Neural Reflectance Field Textures (NeRF-Tex), which are intended to simplify the modeling of complex materials such as fur or fabric. NeRF-Tex represents these materials so that they can be laid over a classic mesh as a texture.

As the team describes it: “We demonstrate near-instant training of neural graphics primitives on a single GPU for multiple tasks. In a gigapixel image, we represent an image by a neural network. SDF learns a signed distance function in 3D space whose zero level-set represents a 2D surface. NeRF [Mildenhall et al. 2020] uses 2D images and their camera poses to reconstruct a volumetric radiance-and-density field that is visualized using ray marching. Lastly, neural volume learns a denoised radiance and density field directly from a volumetric path tracer. In all tasks, our encoding and its efficient implementation provide clear benefits: instant training, high quality, and simplicity. Our encoding is task-agnostic: we use the same implementation and hyperparameters across all tasks and only vary the hash table size” which trades off quality and performance.
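To get a rough feel for that knob, the encoding keeps a fixed-size table of trainable feature vectors at every resolution level, so its parameter count (and memory footprint) scales directly with the table size. The values below are illustrative placeholders for a quick back-of-the-envelope calculation, not settings pulled from the repository:

```python
# Back-of-the-envelope parameter count for the hash encoding
# (illustrative hyperparameter values, not the repo's defaults).
levels = 16              # number of resolution levels
features_per_entry = 2   # trainable feature values per hash-table entry

for log2_table_size in (14, 19):  # small vs. large hash table
    params = levels * features_per_entry * 2 ** log2_table_size
    print(f"T = 2^{log2_table_size}: ~{params / 1e6:.1f}M trainable encoding parameters")
```

Growing the table reduces hash collisions and improves quality, at the cost of more memory and more memory traffic per lookup.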

You can view the full details here.

The resulting GitHub project, which requires a C++14-capable compiler, allows near-instant training of neural graphics primitives on a single GPU. This is impressive, as such primitives are normally expensive to train. The key trick is to cut the training cost, along with the number of floating-point and memory-access operations, with an input encoding that permits a much smaller network without sacrificing quality. This smaller neural network is augmented by a multi-resolution hash table of trainable feature vectors whose values are optimized through stochastic gradient descent. The multi-resolution structure lets the network disambiguate hash collisions, making for a simple architecture that is easy to parallelize on modern GPUs.
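To make that concrete, here is a minimal NumPy sketch of the multi-resolution hash-encoding idea: each level hashes a point's integer grid coordinates into a table of trainable features, and the concatenated per-level features are what the small MLP consumes. Everything here (level count, table size, nearest-corner lookup instead of interpolation) is a simplifying assumption for illustration, not the repository's CUDA implementation:

```python
import numpy as np

NUM_LEVELS = 8           # resolution levels (illustrative; the paper uses more)
FEATURES_PER_LEVEL = 2   # trainable features stored per table entry
TABLE_SIZE = 2 ** 14     # hash-table entries per level (the quality/performance knob)
BASE_RES, MAX_RES = 16, 512

# One trainable feature table per level; in training these are optimized
# jointly with the small MLP via stochastic gradient descent.
tables = [np.random.normal(0.0, 1e-4, (TABLE_SIZE, FEATURES_PER_LEVEL))
          for _ in range(NUM_LEVELS)]

PRIMES = np.array([1, 2654435761, 805459861], dtype=np.uint64)

def spatial_hash(grid_coords):
    """XOR the per-axis grid indices scaled by large primes, then wrap into the table."""
    h = np.zeros(grid_coords.shape[0], dtype=np.uint64)
    for d in range(grid_coords.shape[1]):
        h ^= grid_coords[:, d].astype(np.uint64) * PRIMES[d]
    return (h % TABLE_SIZE).astype(np.int64)

def encode(points):
    """Map points in [0, 1]^3 to concatenated per-level features.

    For brevity this looks up only the nearest grid corner; the actual method
    interpolates the features of the surrounding corners.
    """
    feats = []
    growth = (MAX_RES / BASE_RES) ** (1.0 / (NUM_LEVELS - 1))
    for level in range(NUM_LEVELS):
        res = int(BASE_RES * growth ** level)
        grid = np.floor(points * res).astype(np.int64)   # integer voxel coordinates
        idx = spatial_hash(grid)                         # collisions are simply allowed
        feats.append(tables[level][idx])                 # gather trainable features
    return np.concatenate(feats, axis=1)                 # input to the small MLP

x = np.random.rand(4, 3)                                 # four points in 3D
print(encode(x).shape)                                   # (4, NUM_LEVELS * FEATURES_PER_LEVEL)
```

Because each sample touches only a handful of table entries per level, gradient updates are sparse, which is part of why the scheme maps so well onto GPUs.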

With “Instant Neural Graphics Primitives” (Instant-NGP), the research group presents a framework in which a neural network can learn representations of gigapixel images, 3D objects, and NeRFs within seconds.

Original post here.

Read more data science articles on OpenDataScience.com, including tutorials and guides from beginner to advanced levels! Subscribe to our weekly newsletter here and receive the latest news every Thursday. You can also get data science training on-demand wherever you are with our Ai+ Training platform. Subscribe to our fast-growing Medium Publication too, the ODSC Journal, and inquire about becoming a writer.
