
Top Nine Developer Questions About the Ray Tracing Essentials Series

During our recent webinar on ray tracing, available on-demand now, we received over 800 questions from developers who attended the session. Thanks to everyone who participated! After combing through the stack, NVIDIA researchers Eric Haines and Adam Marrs have selected the nine most compelling questions and provided in-depth answers. 

What about ray tracing in anti-aliasing algorithms? Can ray tracing improve temporal anti-aliasing (TAA)?

The advantage of ray tracing is that you can send more rays to pixels you have determined are likely to need more samples. It’s possible, likely even, that only a small percentage of the pixels will need additional samples. Compare this to traditional rasterization techniques for spatial anti-aliasing, where you must sample all pixels in lockstep with the same pattern. To learn more, check out the GDC talk on Improving Temporal Anti Aliasing with Adaptive Ray Tracing.
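
If it helps to see the shape of this, here is a minimal C++ sketch of adaptive sampling: a heuristic flags the pixels that need help (for example, pixels whose temporal history was rejected), and only those pixels receive extra rays. The needsMoreSamples and traceRay callbacks are hypothetical stand-ins for whatever your renderer provides; this is a sketch of the general idea, not the algorithm from the talk.

```cpp
// Minimal sketch of adaptive anti-aliasing: only pixels flagged by a heuristic
// receive extra rays. The callbacks are placeholders supplied by your renderer.
#include <cstdint>
#include <functional>
#include <vector>

struct Color { float r, g, b; };

void adaptiveAA(std::vector<Color>& image, uint32_t width, uint32_t height,
                uint32_t extraSamples,
                const std::function<bool(uint32_t, uint32_t)>& needsMoreSamples,
                const std::function<Color(uint32_t, uint32_t, uint32_t)>& traceRay)
{
    for (uint32_t y = 0; y < height; ++y) {
        for (uint32_t x = 0; x < width; ++x) {
            if (!needsMoreSamples(x, y))
                continue;                        // most pixels keep their single sample
            Color sum = image[y * width + x];    // existing rasterized/temporal result
            for (uint32_t s = 0; s < extraSamples; ++s) {
                Color c = traceRay(x, y, s + 1); // extra ray-traced samples
                sum.r += c.r; sum.g += c.g; sum.b += c.b;
            }
            const float inv = 1.0f / float(extraSamples + 1);
            image[y * width + x] = { sum.r * inv, sum.g * inv, sum.b * inv };
        }
    }
}
```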

Are BVHs recomputed each frame? How long does it take compared to actual raycasts?

If objects have changed shape or moved within the BVH, then some work needs to be done to make the BVH match reality. There are different options, such as expanding (refitting) the bounds around each moved object, or rebuilding some (or all) of the BVH from scratch.
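
As a rough illustration of the “expand the bounds” (refit) option, here is a minimal sketch that recomputes bounds bottom-up without changing the tree topology. The node layout is hypothetical; production BVH builders use more compact schemes.

```cpp
// Minimal sketch of a BVH "refit": leaf bounds are recomputed from moved
// geometry and propagated to parents, keeping the tree topology unchanged.
#include <algorithm>
#include <vector>

struct AABB {
    float min[3], max[3];
    void expand(const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            min[i] = std::min(min[i], b.min[i]);
            max[i] = std::max(max[i], b.max[i]);
        }
    }
};

struct BVHNode {
    AABB bounds;
    int  left   = -1;   // child indices, -1 if this is a leaf
    int  right  = -1;
    int  object = -1;   // leaf: index of the object whose bounds it holds
};

// Recompute bounds bottom-up after objects have moved. This is O(n) and much
// cheaper than a full rebuild, but tree quality degrades if objects move far.
AABB refit(std::vector<BVHNode>& nodes, const std::vector<AABB>& objectBounds, int nodeIndex)
{
    BVHNode& node = nodes[nodeIndex];
    if (node.object >= 0) {                                   // leaf node
        node.bounds = objectBounds[node.object];
    } else {                                                  // interior node
        node.bounds = refit(nodes, objectBounds, node.left);
        node.bounds.expand(refit(nodes, objectBounds, node.right));
    }
    return node.bounds;
}
```

A common pattern is to refit on most frames and fall back to a rebuild once the refitted tree’s quality (and thus trace performance) degrades.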

As for times, your mileage will vary; it all depends. In practice, you can have multiple levels of BVHs: your teapot model might have an unchanging BVH around it, while the scene with a hundred teapots bouncing around has a top-level BVH that needs to be updated each frame.

For expert-level advice on this subject, see Tips and Tricks: Ray Tracing Best Practices and the RTX Real Time Ray Tracing Best Practices video.

Does path tracing shoot rays randomly, or does it follow a statistical model to shoot rays?

Rays can be cast entirely randomly, but we usually try to figure out some ways to favor the more useful directions so that we get a solution faster. This topic is called “importance sampling” and a lot of research has been done in this area. If you want to delve deeply into the theory, a good place to start is Physically Based Rendering, now free online.
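
As a small, concrete example of importance sampling, here is the standard cosine-weighted hemisphere sampler used for diffuse surfaces: directions near the surface normal, which contribute more, are drawn more often, and the matching PDF is used to weight each sample so the estimate stays correct. This is a generic textbook routine, not code from any particular renderer.

```cpp
// Cosine-weighted hemisphere sampling: a classic importance sampling strategy
// for Lambertian (diffuse) surfaces. A minimal sketch; a real path tracer would
// rotate the result into the surface's tangent frame.
#include <algorithm>
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

// Returns a direction in the local frame where +z is the surface normal.
Vec3 sampleCosineHemisphere(std::mt19937& rng)
{
    std::uniform_real_distribution<float> uniform(0.0f, 1.0f);
    const float u1 = uniform(rng);
    const float u2 = uniform(rng);
    const float r   = std::sqrt(u1);                  // radius on the unit disk
    const float phi = 2.0f * 3.14159265f * u2;
    const float x = r * std::cos(phi);
    const float y = r * std::sin(phi);
    const float z = std::sqrt(std::max(0.0f, 1.0f - u1)); // project disk up onto hemisphere
    return { x, y, z };
}

// The matching probability density, needed to weight each sample: pdf = cos(theta) / pi.
float cosineHemispherePdf(float cosTheta)
{
    return cosTheta / 3.14159265f;
}
```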

Are there any guidelines on how to combine ray tracing with rasterization to get the best of both worlds?

Rasterization performs best when casting from a single point (such as an eye or light location) to a set of evenly spaced locations (like the pixels of a screen). This particular “coherent ray cast” scenario is what rasterization hardware is designed for and does extremely efficiently. Ray tracing provides the ability to cast rays from any starting point in any direction. This “incoherent ray cast” scenario isn’t possible with straight rasterization.
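
One common hybrid structure, sketched below with purely hypothetical type and function names: rasterize the coherent eye-to-screen pass into a G-buffer, then launch the incoherent secondary rays (reflections, shadows) from that G-buffer and composite the results. Treat this as the shape of a frame, not any particular engine’s pipeline.

```cpp
// Skeleton of a hybrid frame. Every type and function here is a placeholder
// for whatever your engine provides; only the division of labor is the point.
struct GBuffer {};      // per-pixel positions, normals, materials from rasterization
struct Texture {};      // an image/buffer produced by a pass

GBuffer rasterizeGBuffer()                  { return {}; } // coherent: the rasterizer's strength
Texture traceReflections(const GBuffer&)    { return {}; } // incoherent rays bouncing off surfaces
Texture traceShadows(const GBuffer&)        { return {}; } // rays cast toward the lights
Texture shadeAndComposite(const GBuffer&, const Texture&, const Texture&) { return {}; }

Texture renderHybridFrame()
{
    GBuffer gbuffer     = rasterizeGBuffer();
    Texture reflections = traceReflections(gbuffer);
    Texture shadows     = traceShadows(gbuffer);
    return shadeAndComposite(gbuffer, reflections, shadows);
}
```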

There are case studies of combining the two in Ray Tracing Gems. There’s more about Metro Exodus’s engine in this talk and in Tech Interview: Metro Exodus, ray tracing and the 4A Engine’s open world upgrades, and this advanced video gives some newer tips.

Voxels are getting popular these days. Could you imagine that hardware will support octree traversal in the future?

Octrees can be a bit more efficient in some circumstances because their cells don’t overlap and can be visited in front-to-back order along the ray, so as soon as you have a hit in an octree node, you don’t have to traverse the octree further. Researchers have explored such octree traversal hardware in the past.

That said, I have my doubts that special-purpose hardware is needed. Bounding volume hierarchies are pretty flexible and can emulate octree traversal to a good extent using nested boxes. They seem more flexible to me overall. Put a big asterisk on this one, since I’m neither a hardware designer nor a market analyst.
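
To illustrate the “nested boxes” point, the sketch below splits an axis-aligned cell into its eight octree children; those eight non-overlapping boxes could be stored directly as children in a BVH. This is only an illustration of the geometry, not a recommendation for how to build such a tree.

```cpp
// An octree cell is just an axis-aligned box split into eight equal child
// boxes, so the same hierarchy can be expressed with BVH nodes whose children
// happen not to overlap. A minimal sketch.
#include <vector>

struct AABB { float min[3], max[3]; };

std::vector<AABB> octreeChildren(const AABB& cell)
{
    float mid[3];
    for (int i = 0; i < 3; ++i)
        mid[i] = 0.5f * (cell.min[i] + cell.max[i]);

    std::vector<AABB> children;
    children.reserve(8);
    for (int octant = 0; octant < 8; ++octant) {
        AABB child;
        for (int axis = 0; axis < 3; ++axis) {
            const bool upper = (octant >> axis) & 1;  // lower or upper half on this axis
            child.min[axis] = upper ? mid[axis] : cell.min[axis];
            child.max[axis] = upper ? cell.max[axis] : mid[axis];
        }
        children.push_back(child);
    }
    return children;
}
```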

Can hardware ray tracing be applied to sound processing?

Yes. We don’t know that much about the area ourselves, but we do know that the VRWorks Audio SDK, for example, uses RT Cores to accelerate the ray tracing it performs for sound propagation.

Can you talk about how well RTX primitives might work for non-triangle geometry like an implicit surface?

Triangle intersection is a part of the RT Core hardware, as is bounding box intersection. These cover most needs, just as rasterizers want almost everything turned into triangles to render. For ray tracing other primitive types, you can either turn them into triangles, or you can write shaders that perform the ray/object intersection for the primitive. In DirectX Ray Tracing, these shaders are called intersection shaders.
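
As an example of the kind of math such a shader performs, here is a standalone ray/sphere test, the classic implicit surface. In DirectX Ray Tracing this logic would live inside an intersection shader and report hits with ReportHit(); the C++ version below simply returns the hit distance, and the names are for illustration only.

```cpp
// Ray vs. implicit sphere (x^2 + y^2 + z^2 = r^2, translated to 'center'):
// solve the quadratic |origin + t*dir - center|^2 = radius^2 for t.
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns the nearest hit distance t >= tMin, or a negative value on a miss.
float intersectSphere(const Vec3& origin, const Vec3& dir,
                      const Vec3& center, float radius, float tMin)
{
    const Vec3  oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
    const float a  = dot(dir, dir);
    const float b  = 2.0f * dot(oc, dir);
    const float c  = dot(oc, oc) - radius * radius;
    const float disc = b * b - 4.0f * a * c;
    if (disc < 0.0f)
        return -1.0f;                          // the ray misses the sphere entirely
    const float sqrtDisc = std::sqrt(disc);
    float t = (-b - sqrtDisc) / (2.0f * a);    // try the nearer root first
    if (t < tMin)
        t = (-b + sqrtDisc) / (2.0f * a);      // ray starts inside or past the near hit
    return (t >= tMin) ? t : -1.0f;
}
```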

What does the denoising process look like?

The topic of denoising was a popular one for questions. I know just the basics on this topic myself. I’ve collected much of what I know on the Ray Tracing Resources page, along with pointers to many other sources. The Quake II RTX demo has an implementation of A-SVGF denoising you can examine. In Chapter 19 of Ray Tracing Gems, you can learn how Epic Games approached denoising in Unreal Engine 4. The end of this newer video gives some tips about effective denoising.

When will the hardware be powerful enough to just render with brute force in real time?

There are two brute force techniques I can imagine. On a physical simulation level, my back-of-the-envelope calculation for an incandescent 40-watt lightbulb is that it emits about 10 to the twentieth photons per second. Say there are ten billion computer cores on earth, and each can trace the paths of a million photons per second. We’d need about ten thousand earths’ worth of computers to track them all, for just one dim light bulb. 
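
For reference, here is the arithmetic behind that estimate, with every number a rough assumption:

```cpp
// Back-of-the-envelope check: ~1e20 photons/second from a 40 W incandescent
// bulb, ~1e10 cores available, ~1e6 photon paths traced per core per second.
#include <cstdio>

int main()
{
    const double photonsPerSecond = 1e20;  // rough photon output of a 40 W bulb
    const double coresOnEarth     = 1e10;  // rough guess at the cores available
    const double photonsPerCore   = 1e6;   // paths one core might trace per second
    const double earthsNeeded = photonsPerSecond / (coresOnEarth * photonsPerCore);
    std::printf("Earths of compute needed: %.0f\n", earthsNeeded);  // prints 10000
    return 0;
}
```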

On a practical level, if by “brute force” we mean “I’d like to gather, say, 1000 samples per pixel per frame,” that’s easier, but out of reach today for all but the most trivial scenes. Plus, artists and developers are always able to (and do!) increase scene complexity when hardware improves, whether it’s through more geometry, more textures, richer shading models, or other effects. This dynamic keeps the field interesting, as we try to figure out the best ways to use the tools we have at hand to make the most convincing imagery.

Next steps

To learn more about the history and application of ray tracing techniques, check out the on-demand recording of the Ray Tracing Essentials webinar, watch the Ray Tracing Essentials YouTube series, or download the free 2019 book Ray Tracing Gems.  
