With Ray Tracing becoming ever more popular, NVIDIA have written up a technical post on bringing DirectX Ray Tracing to Vulkan, to encourage more developers to port their ray-traced content over.
The blog post, titled "Bringing HLSL Ray Tracing to Vulkan", mentions that porting content requires translating both the API calls (DirectX to Vulkan) and the shaders (HLSL to SPIR-V). The shader side is not so difficult now, thanks to the SPIR-V backend in Microsoft's open source DirectX Shader Compiler (DXC).
Last year, NVIDIA added ray tracing support to DXC's SPIR-V backend using their SPV_NV_ray_tracing extension, and there are already titles shipping with it, like Quake II RTX and Wolfenstein: Youngblood. While this is all NVIDIA-only for now, The Khronos Group is discussing a cross-vendor version of the Vulkan ray tracing extension, and NVIDIA expect the work already done can carry over to it, which does sound good.
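To give a rough idea of what "the same HLSL for both APIs" means in practice, here is a minimal, illustrative DXR-style ray generation shader. All the names here (scene, output, Payload, RayGen) are made up for the example; it is a sketch of the kind of source DXC can compile either to DXIL for DirectX or, via its SPIR-V backend, to SPIR-V for Vulkan:

```hlsl
// Hypothetical minimal ray generation shader; resource names and
// bindings are placeholders, not from NVIDIA's post.
RaytracingAccelerationStructure scene : register(t0);
RWTexture2D<float4> output : register(u0);

struct Payload
{
    float3 color;
};

[shader("raygeneration")]
void RayGen()
{
    uint2 pixel = DispatchRaysIndex().xy;
    uint2 dims  = DispatchRaysDimensions().xy;

    // Map the pixel to a direction on a simple view plane.
    float2 uv = ((float2)pixel + 0.5) / (float2)dims;

    RayDesc ray;
    ray.Origin    = float3(0, 0, -1);
    ray.Direction = normalize(float3(uv * 2 - 1, 1));
    ray.TMin      = 0.001;
    ray.TMax      = 1000.0;

    Payload payload = { float3(0, 0, 0) };
    TraceRay(scene, RAY_FLAG_NONE, 0xFF, 0, 1, 0, ray, payload);

    output[pixel] = float4(payload.color, 1.0);
}
```

With a suitable DXC build, a command along the lines of `dxc -T lib_6_3 -spirv -fspv-extension=SPV_NV_ray_tracing RayGen.hlsl -Fo RayGen.spv` would target the Vulkan side; the exact flags depend on your DXC version.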
NVIDIA go on to give an example and sum it all up with this:
The NVIDIA VKRay extension, with the DXC compiler and SPIR-V backend, provides the same level of ray tracing functionality in Vulkan through HLSL as is currently available in DXR. You can now develop ray-tracing applications using DXR or NVIDIA VKRay with minimized shader re-writing to deploy to either the DirectX or Vulkan APIs.
See the full post here.
Eventually, with efforts like this and once Vulkan has proper cross-vendor ray tracing support wired up, developers will have an easier job getting Vulkan ports looking as good as their DirectX counterparts. It makes the future of the Vulkan API sound ever more exciting.
Quoting: appetrosyan
Don't quite get what they gain from this. Still, this means that we could (in theory) have RTX accelerated Quake 2 on Linux.

We already do. That's the point. Quake II RTX is out and supports Linux.
I.e. for Nvidia, the more space on the die they use for ray tracing, the less is left for regular compute units, which requires them to go out of their way to convince everyone how useful ray tracing ASICs are.
It's not clear at all that this trade-off is worth it. For some minor improvement in ray traced lighting (a big improvement can't be achieved even with such ASICs), you pay with reduced general GPU performance.
Last edited by Shmerl on 23 February 2020 at 7:31 am UTC
Quoting: Shmerl
I wouldn't say ray tracing became more popular.
It's just a matter of time.
That said, I avoided buying an RTX 2000, because at the moment, it feels more like an expensive gimmick.
Quoting: Eike
It's just a matter of time.
That said, I avoided buying an RTX 2000, because at the moment, it feels more like an expensive gimmick.
It is a gimmick. More of a marketing tool than a really useful feature. To achieve good quality real time ray tracing, you need really powerful hardware. And what can fit in a single GPU gives at best a minor enhancement to the lighting, and, as I said above, it naturally comes at the cost of everything else.
Last edited by Shmerl on 23 February 2020 at 7:51 am UTC
Quoting: Shmerl
It is a gimmick.
No, it's a solution to the rendering problems that rasterisers can't solve. It's just the first generation of hardware that attempts to use it in real time graphics. It's as much a gimmick as 3D rendering was with the first 3Dfx card.
Quoting: Ehvis
No, it's a solution to the rendering problems that rasterisers can't solve. It's just the first generation of hardware that attempts to use it in real time graphics. It's as much a gimmick as 3D rendering was with the first 3Dfx card.
It can't solve them adequately this way; I explained why above. Unless you are proposing a whole dedicated device alongside your GPU, cramming more and more compute units into ray tracing ASICs to make them actually useful will cost more and more general GPU performance.
It's the same reason the GPU was separated from the CPU in the first place.
Last edited by Shmerl on 23 February 2020 at 9:28 am UTC
Quoting: Ehvis
No, it's a solution to the rendering problems that rasterisers can't solve. It's just the first generation of hardware that attempts to use it in real time graphics. It's as much a gimmick as 3D rendering was with the first 3Dfx card.
I'd say it's as much a gimmick as 3D rendering was before the first 3Dfx card.
Man, Descent on a 3Dfx was so amazing...
Quoting: Shmerl
It can't solve them adequately this way; I explained why above. Unless you are proposing a whole dedicated device alongside your GPU, cramming more and more compute units into ray tracing ASICs to make them actually useful will cost more and more general GPU performance.
To turn it around: Do you know another promising way to make graphics rendered in realtime "photorealistic"(*)?
(*) A term abused for decades...