NVIDIA presents work that improves real-time path tracing and content-creation tools for creators, artists, and gamers.

(Image credit: Nvidia)

If you’re a gaming enthusiast or PC hardware lover, you’ve probably heard the phrase “ray tracing” thrown around quite often over the past couple of years; you may well have heard of it before that. With the introduction of Microsoft’s DirectX Raytracing and NVIDIA’s RTX branding, however, ray tracing is now part of the mainstream.

The concept of “ray tracing” an image dates back to the 15th century; the first description of the technique for computer-generated graphics, however, wasn’t published until 1969.

That first algorithm was quite basic and merely an approximation. Ray tracing evolved over the years until 1986, when Jim Kajiya presented a paper entitled “The Rendering Equation.” In just seven pages, Kajiya laid out a way to mathematically model how light physically behaves, and that discovery changed how 3D scenes are rendered.
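For reference, the equation at the heart of Kajiya’s paper is usually written in the hemispherical form below; this is the standard textbook notation rather than a quote from the paper itself.

```latex
% Kajiya's rendering equation: outgoing radiance at point x in direction w_o equals
% emitted radiance plus every bit of incoming radiance reflected toward w_o.
L_o(x, \omega_o) = L_e(x, \omega_o)
    + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (n \cdot \omega_i)\, \mathrm{d}\omega_i
```

Path tracing is essentially a Monte Carlo method for estimating that integral: shoot random rays, average the results, and every lighting effect falls out of the same formula.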


(Image credit: Nvidia)

Kajiya described the method as a “path tracer,” and it is an improvement over simple ray tracing. Producing high-quality images with a path tracer involves casting millions of rays. Until recently it wasn’t considered viable for real-time rendering, but it has been used in film and other offline-rendered computer graphics for more than 15 years.

The advantage of a path tracer is that it is a “unified” rendering method: you don’t need separate calculations for individual lighting effects such as reflections, ambient occlusion, and soft shadows.

The entire lighting process is handled by the path-tracing algorithm. It does have drawbacks, chiefly the cost of computing path tracing on modern GPUs, which are primarily optimized for traditional rasterization; even so, it’s an enormous net benefit for image quality and for artists.
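To make the “unified” idea concrete, here is a deliberately tiny Monte Carlo path tracer. It is an illustrative sketch only, nothing like NVIDIA’s research renderer: the scene (one diffuse sphere under a uniformly emissive sky), the constants, and the helper names are all invented for this example.

```python
# Minimal Monte Carlo path tracer sketch: one diffuse sphere lit by an emissive sky.
# Soft shadows, ambient occlusion, and indirect bounces all fall out of the same
# recursive estimate; nothing is computed as a separate "effect."
import math, random

SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
ALBEDO = 0.7          # diffuse reflectance of the sphere
SKY_RADIANCE = 1.0    # the sky is the only light source
MAX_BOUNCES = 4

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):  return mul(a, 1.0 / math.sqrt(dot(a, a)))

def intersect_sphere(origin, direction):
    """Return the hit distance along a unit-length ray, or None on a miss."""
    oc = sub(origin, SPHERE_CENTER)
    b = dot(oc, direction)
    disc = b * b - (dot(oc, oc) - SPHERE_RADIUS * SPHERE_RADIUS)
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def sample_hemisphere(normal):
    """Uniformly sample a direction on the hemisphere around `normal`."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 1e-8 < dot(d, d) <= 1.0:
            d = norm(d)
            return d if dot(d, normal) > 0.0 else mul(d, -1.0)

def radiance(origin, direction, depth=0):
    """One-sample Monte Carlo estimate of the rendering equation along a ray."""
    if depth >= MAX_BOUNCES:
        return 0.0
    t = intersect_sphere(origin, direction)
    if t is None:
        return SKY_RADIANCE                      # ray escaped and hit the emissive sky
    hit = add(origin, mul(direction, t))
    n = norm(sub(hit, SPHERE_CENTER))
    wi = sample_hemisphere(n)                    # pick one random bounce direction
    brdf = ALBEDO / math.pi                      # Lambertian BRDF
    pdf = 1.0 / (2.0 * math.pi)                  # uniform-hemisphere probability density
    return brdf * radiance(hit, wi, depth + 1) * dot(wi, n) / pdf

# Average many random paths through one pixel; more samples mean less noise.
samples = 256
pixel = sum(radiance((0.0, 0.0, 0.0), norm((0.0, 0.2, -1.0))) for _ in range(samples)) / samples
print("estimated pixel radiance:", round(pixel, 3))
```

Everything the article describes (reflections, occlusion, soft lighting) comes from repeating that single estimate with better sampling, more geometry, and real materials, which is exactly what makes path tracing so expensive.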

Full-featured real-time path tracing isn’t a new concept; it’s how Quake II RTX actually works, and before that the Otoy Brigade team was doing path tracing in software.

Both approaches involve compromises, though. Brigade is limited in the light interactions it can simulate, while Quake II RTX sticks to relatively simple scenes with basic geometry, performing the equivalent of only two bounces with just a few lights per scene.

Scene courtesy of GoldSmooth @ TurboSquid: https://www.turbosquid.com/3d-models/3d-interior-scene-opera-garnier-1194044

What NVIDIA would like, then, is full-quality cinematic path tracing in real time, and at GTC 2022 it showed off something close to it. By combining its RTXGI and RTXDI SDKs into a custom “research renderer,” NVIDIA’s graphics engineers have produced results that can fairly be described as incredible.

These are scenes with as many as 3 billion triangles, fully path-traced with no rasterization. They render at “interactive” frame rates despite using as many as thirty bounces per ray, and some examples contain as many as five hundred thousand “mesh light sources”: individual polygons that act as lights.
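To get a feel for why half a million emissive polygons is hard, here is a hedged sketch of the generic “many lights” trick such renderers lean on: instead of evaluating every light at every shading point, sample one light per point and reweight so the estimate stays unbiased. The light list and numbers below are purely hypothetical, and NVIDIA’s RTXDI relies on far more sophisticated resampling than this.

```python
# Illustrative many-light estimator (hypothetical data, not RTXDI's algorithm).
# Summing hundreds of thousands of emissive triangles at every shading point is
# hopeless, so one light is sampled per point and the result is reweighted.
import itertools
import random

def make_light():
    """A stand-in mesh light: (emitted power, contribution at one shading point).
    In a real renderer the contribution depends on distance, angle, and visibility."""
    power = random.uniform(0.1, 2.0)
    return power, power * random.uniform(0.0, 0.02)

mesh_lights = [make_light() for _ in range(100_000)]
powers = [p for p, _ in mesh_lights]
total_power = sum(powers)
cum_powers = list(itertools.accumulate(powers))    # cumulative weights for fast picking

def sample_uniform():
    """Pick 1 of N lights uniformly (pdf = 1/N) and divide by that pdf."""
    _, contribution = random.choice(mesh_lights)
    return contribution * len(mesh_lights)

def sample_by_power():
    """Pick a light with probability proportional to its power (pdf = power / total).
    Brighter lights are sampled more often, which reduces noise for the same budget."""
    (power, contribution), = random.choices(mesh_lights, cum_weights=cum_powers, k=1)
    return contribution * total_power / power

exact = sum(c for _, c in mesh_lights)             # the answer being estimated
n = 20_000
uniform_est = sum(sample_uniform() for _ in range(n)) / n
power_est = sum(sample_by_power() for _ in range(n)) / n
print(f"exact {exact:.1f} | uniform sampling {uniform_est:.1f} | power sampling {power_est:.1f}")
```

Both estimators converge to the true sum of all light contributions; the importance-sampled one simply gets there with fewer, better-spent samples, which is the whole game when a scene has 500,000 lights.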

While these scenes might not look fully photorealistic, they’re a whole class above anything you’ll find in a video game in 2022.

Real-Time Path Tracing In Action – Credit: NVIDIA

There aren’t many limitations to this method, either. It isn’t like Euclideon’s “unlimited detail” renderer from about a decade ago, which was based on voxels and was effectively useless for games. This renderer uses regular 3D models and supports animated meshes, physics effects, and post-processing just like any other standard 3D renderer.

One way to get something done faster is simply to do the work more quickly, but the more efficient approach is to reduce how much work there is to do. Much like DLSS, NVIDIA’s approach to speeding up path tracing relies on approximation, and naturally the process leans heavily on denoising. The image below illustrates the effect of NVIDIA’s denoising filter with an exaggerated before-and-after comparison.

(Image credit: TurboSquid)
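To illustrate the basic idea behind denoising (with the heavy caveat that NVIDIA’s production denoisers are temporal, guided by scene data, and far more sophisticated), here is a toy spatial filter over a simulated one-sample-per-pixel buffer. The image size, noise level, and function names are made up for the example.

```python
# Toy spatial denoiser sketch: replace every pixel of a noisy, 1-sample-per-pixel
# image with an average of its neighborhood, trading a little blur for far less noise.
import random

W, H = 64, 64
TRUE_VALUE = 0.5    # pretend the fully converged image is a flat gray

# Simulate a noisy path-traced buffer: correct on average, very noisy per pixel.
noisy = [[max(0.0, random.gauss(TRUE_VALUE, 0.3)) for _ in range(W)] for _ in range(H)]

def denoise(img, radius=2):
    """Box-filter the image; a wider radius is smoother but blurrier."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def rmse(img):
    """Root-mean-square error against the known converged value."""
    n = sum(len(row) for row in img)
    return (sum((p - TRUE_VALUE) ** 2 for row in img for p in row) / n) ** 0.5

print(f"noise before: {rmse(noisy):.3f}   after denoising: {rmse(denoise(noisy)):.3f}")
```

The hard part in a real renderer is smoothing away noise without smearing edges, texture detail, or motion, which is where the engineering effort in filters like NVIDIA’s actually goes.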

You’re unlikely to see scenes this complex in games anytime soon; most of these demos run at only 30 FPS on a GeForce RTX 3090. But the rendering techniques are available to developers through the RTX Direct Illumination (RTXDI) and RTX Global Illumination (RTXGI) SDKs, both available with full source, and developers working in Unreal Engine can also grab them as binary plug-ins.

NVIDIA’s researchers are also frank that more work needs to be done before this path tracer can replace rasterized rendering engines. Because much of the optimization relies on culling the most demanding work, the most difficult scenes remain slow and unappealing, and specific effects like volumetrics still trip the algorithm up: in the image below, the fog appears stippled and noisy.

If you have PowerPoint (and you’ll need it; Google Docs chokes on the file), head to the NVIDIA website and download the complete 660 MB presentation to view the videos yourself. If you don’t have PowerPoint, you can browse smaller, compressed versions on Green Team researcher Aaron Lefohn’s Twitter. Also check out this NVIDIA blog to learn more about path tracing in general.