FastNeRF: High-Fidelity Neural Rendering at 200FPS
by
Stephan J. Garbin, Marek Kowalski, Matthew Johnson, Jamie Shotton, Julien Valentin
2021
Abstract
Recent work on Neural Radiance Fields (NeRF) showed how neural networks can
be used to encode complex 3D environments that can be rendered
photorealistically from novel viewpoints. Rendering these images is very
computationally demanding and recent improvements are still a long way from
enabling interactive rates, even on high-end hardware. Motivated by scenarios
on mobile and mixed reality devices, we propose FastNeRF, the first NeRF-based
system capable of rendering high fidelity photorealistic images at 200Hz on a
high-end consumer GPU. The core of our method is a graphics-inspired
factorization that allows for (i) compactly caching a deep radiance map at each
position in space, and (ii) efficiently querying that map using ray directions to
estimate the pixel values in the rendered image. Extensive experiments show
that the proposed method is 3000 times faster than the original NeRF algorithm
and at least an order of magnitude faster than existing work on accelerating
NeRF, while maintaining visual quality and extensibility.
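The factorization described above splits the radiance function into a position-dependent branch, whose output (the "deep radiance map") can be cached densely over space, and a direction-dependent branch, whose output can be cached over view directions; combining the two cached outputs with an inner product gives the view-dependent colour for a sample. The sketch below illustrates that split in NumPy. The number of components D, the random linear maps standing in for the trained MLPs, and the function names are illustrative assumptions, not the paper's implementation.

    import numpy as np

    D = 8                      # number of factorized components (assumed)
    rng = np.random.default_rng(0)

    # Stand-ins for the two network branches of the factorization; random
    # linear maps keep the example self-contained and runnable.
    W_pos = rng.standard_normal((3, D * 3 + 1))   # position -> density + deep radiance map
    W_dir = rng.standard_normal((3, D))           # direction -> D view-dependent weights

    def position_branch(p):
        """Map a 3D position to a density and a (D, 3) 'deep radiance map'.
        This is the output that can be cached densely over space."""
        out = np.asarray(p) @ W_pos
        sigma = out[0]
        uvw = out[1:].reshape(D, 3)
        return sigma, uvw

    def direction_branch(d):
        """Map a view direction to D scalar weights.
        This output can likewise be cached over a grid of directions."""
        return np.asarray(d) @ W_dir

    def radiance(p, d):
        """Factorized colour: inner product of the cached map and the view weights."""
        sigma, uvw = position_branch(p)
        beta = direction_branch(d)
        rgb = beta @ uvw          # (D,) @ (D, 3) -> (3,) colour
        return sigma, rgb

    sigma, rgb = radiance([0.1, 0.2, 0.3], [0.0, 0.0, 1.0])
    print(sigma, rgb)

Because each branch depends on only one of the two inputs, both can be evaluated once and stored, so rendering reduces to cache lookups and an inner product rather than full network evaluations per sample.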
Archived Files and Locations
application/pdf 1.0 MB
arxiv.org (repository), web.archive.org (webarchive)
arXiv: 2103.10380v1