One reason I love interactive graphics is that every now and then something happens in the field – programmable shaders, powerful mobile devices, DX12/Vulkan/Metal, VR, AR, and now this – that changes what’s possible and how we think about interactive rendering. New algorithms arise to exploit new and different functionality. It’s a fun world!
Microsoft has added ray tracing support to its DirectX API. And this time it’s not an April Fool’s Day spoof, as it was a decade ago. Called DirectX Raytracing, DXR for short, it adds the ability to cast rays, with the work expressed as new kinds of shader invocations (ray generation, hit, and miss shaders). There are already a bunch of articles and blog posts about it.
Here are the resources I’ve noticed so far (updated as I see new ones – let me know):
- Gamasutra’s article on DXR. Useful as a quick overview and for its links to the Remedy talk and other resources.
- The Northlight video, Remedy’s presentation from Monday at GDC. The PowerPoint is a large download, but it has some good comparison images and slide notes.
- Jaw-dropping reflections in the real-time Star Wars demo. Unreal has information about their use of DXR and links.
- SEED’s Pica Pica demo video, their presentation slides, and post-GDC report.
- Microsoft’s blog entry on DXR, along with word that PIX supports it, and tips on setting it up.
- Anandtech covers the announcement and speculates a bit about hardware vendor support.
- NVIDIA’s developer blog, GameWorks overview, and general press release each have useful bits.
- NVIDIA also announced RTX, their DXR backend implementation. Video Q&A.
- The Call for Participation for the book Ray Tracing Gems, due out at GDC 2019. Deadline for submissions is October 15th.
- Aras Pranckevičius from Unity has some thoughts on his blog about DXR, with some tasty links tossed in.
- NVIDIA has some DXR ray tracing tutorials up on Github.
- Long video from NVIDIA showing DXR used in a wide variety of effects. Worth your time.
- I await slides or a video of the session for this, this, and this.
- and no doubt a dozen more resources by the end of GDC, e.g. this talk, so search Twitter for more (I did like this one).
It will be interesting to see whether there’s a spike of interest in ray tracing on Google Analytics. While I doubt DXR functionality will change everything – it still has to be performant compared to more specialized techniques – it’s great to see another tool in the toolbox, especially one this general. Even if a renderer in development does no ray tracing at all, it will now be much easier to generate a ground-truth image for comparison when testing other techniques, since shader evaluations and all the rest now fit within a ray tracing framework. Ray and path tracing, run long enough (or smart enough), converge to the correct answer, unlike screen-space techniques.
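To see why brute-force ray sampling makes a good ground truth, here’s a toy sketch – plain Python, not DXR or HLSL, with hypothetical function names – estimating the irradiance from a constant-radiance sky. The integral has a known analytic answer, and the Monte Carlo estimate converges to it as the sample count grows, exactly the property that makes such renders useful as reference images:

```python
import math
import random

def uniform_hemisphere():
    # Uniformly sample a direction on the unit hemisphere around +Z.
    u1, u2 = random.random(), random.random()
    z = u1                                   # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(num_samples):
    # Monte Carlo estimate of E = integral of L * cos(theta) over the
    # hemisphere, with constant incoming radiance L = 1/pi.
    # The exact answer is 1.0.
    pdf = 1.0 / (2.0 * math.pi)              # pdf of uniform hemisphere sampling
    total = 0.0
    for _ in range(num_samples):
        d = uniform_hemisphere()
        cos_theta = d[2]
        total += (1.0 / math.pi) * cos_theta / pdf
    return total / num_samples
```

With a few hundred thousand samples the estimate lands within a percent or two of the analytic value of 1.0; a real path tracer does the same thing per pixel, recursively, with scene intersection in place of the constant sky.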
Doing these fast enough is the challenge, and denoisers and other filtering techniques (just as with today’s rasterized-buffer-based algorithms) will see a lot of use in the coming months and years. I’m going to go out on a limb here, but I’m guessing GPUs will also get faster. Now if we can just get people to stop upping screen resolutions and stop adding more content to scenes, it’ll all work out.
Even within the Remedy talk, we see ray tracing blended with other techniques better suited to diffuse global illumination effects. Ambient occlusion is of course a hack, but a lovely one, and ray-traced AO can stand in for screen-space methods and so avoid some of their artifacts. I think getting away from screen-space techniques is potentially a big win, as game artists and engineers won’t have to rework models or lighting around the major artifacts those methods show in some situations, saving time and money.
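The ray-traced version of AO is conceptually simple. As a toy sketch (again plain Python with made-up names, and a single analytic sphere standing in for real scene geometry): cast rays over the hemisphere above a shading point and count the fraction that escape to the sky, rather than guessing occlusion from a depth buffer:

```python
import math
import random

def hits_sphere(origin, direction, center, radius):
    # Ray-sphere intersection test for a unit-length direction;
    # returns True only for hits in front of the ray origin.
    oc = [origin[i] - center[i] for i in range(3)]
    b = sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False
    t = -b - math.sqrt(disc)
    return t > 1e-4

def ambient_occlusion(point, occluders, num_rays):
    # Fraction of uniformly sampled hemisphere rays (around +Z)
    # that reach the sky: 1.0 = fully open, 0.0 = fully blocked.
    unoccluded = 0
    for _ in range(num_rays):
        u1, u2 = random.random(), random.random()
        z = u1
        r = math.sqrt(max(0.0, 1.0 - z * z))
        phi = 2.0 * math.pi * u2
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if not any(hits_sphere(point, d, c, rad) for c, rad in occluders):
            unoccluded += 1
    return unoccluded / num_rays
```

A sphere of radius 1 hovering two units above the point blocks a 30-degree cone of the sky, so the estimate comes out near 0.87 – no screen-space guesswork, and off-screen occluders are handled for free. A production version would cosine-weight the samples and clamp the ray length, but the structure is the same.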
I’m also interested to see if this functionality gets used in other applications, as there are plenty of areas – all sorts of audio design applications, various other types of engineering analyses – that could benefit from faster turnaround on computations.
Enjoy exploring! I look forward to what we all find.
Some of the eye-candy videos:
Tags: ray tracing