With ray tracing becoming ever more popular, NVIDIA have written up a technical post on bringing DirectX Ray Tracing to Vulkan, to encourage more developers to make the jump.
The blog post, titled "Bringing HLSL Ray Tracing to Vulkan", mentions that porting content requires translating both the API calls (DirectX to Vulkan) and the shaders (HLSL to SPIR-V). That's not so difficult now, thanks to the SPIR-V backend in Microsoft's open source DirectX Shader Compiler (DXC).
Last year, NVIDIA added ray tracing support to DXC's SPIR-V backend using their SPV_NV_ray_tracing extension, and titles like Quake II RTX and Wolfenstein: Youngblood are already shipping with it. While this is all NVIDIA-only for now, the Khronos Group is discussing a cross-vendor version of the Vulkan ray tracing extension, and NVIDIA expect the work already done to carry over to it, which does sound good.
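To give a rough idea of the workflow (the file names here are hypothetical; the flags are from DXC's SPIR-V backend), compiling an HLSL ray tracing shader library to SPIR-V with the NVIDIA extension enabled might look something like this:

    dxc -T lib_6_3 -spirv -fspv-target-env=vulkan1.1 -fspv-extension=SPV_NV_ray_tracing raytracing.hlsl -Fo raytracing.spv

Dropping -spirv gives you DXIL for DirectX from the same source, which is what makes the single-source approach possible.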
NVIDIA go on to give an example and sum it all up with this:
The NVIDIA VKRay extension, with the DXC compiler and SPIR-V backend, provides the same level of ray tracing functionality in Vulkan through HLSL as is currently available in DXR. You can now develop ray-tracing applications using DXR or NVIDIA VKRay with minimized shader re-writing to deploy to either the DirectX or Vulkan APIs.
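To make that concrete, here is a minimal sketch of the kind of shared HLSL they mean: a ray generation shader written against the DXR intrinsics (DispatchRaysIndex, TraceRay and so on) that DXC can compile to DXIL for DirectX or, via the SPIR-V backend, for Vulkan. The resource bindings and camera setup below are made up purely for illustration:

    // Minimal ray generation shader; resource names and camera are hypothetical.
    RaytracingAccelerationStructure Scene : register(t0);
    RWTexture2D<float4> Output : register(u0);

    struct Payload
    {
        float3 color;
    };

    [shader("raygeneration")]
    void RayGen()
    {
        // Pixel coordinate of this ray, normalised to [0, 1].
        uint2 pixel = DispatchRaysIndex().xy;
        float2 uv = (pixel + 0.5) / float2(DispatchRaysDimensions().xy);

        // Simple pinhole camera; a real renderer would use a view matrix.
        RayDesc ray;
        ray.Origin    = float3(0.0, 0.0, -2.0);
        ray.Direction = normalize(float3(uv * 2.0 - 1.0, 1.0));
        ray.TMin      = 0.001;
        ray.TMax      = 10000.0;

        Payload payload;
        payload.color = float3(0.0, 0.0, 0.0);

        // Closest-hit and miss shaders in the same library fill in payload.color.
        TraceRay(Scene, RAY_FLAG_NONE, 0xFF, 0, 1, 0, ray, payload);

        Output[pixel] = float4(payload.color, 1.0);
    }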
See the full post here.
Eventually, with efforts like this and once Vulkan has proper cross-vendor ray tracing support wired up, developers will have an easier job getting Vulkan ports looking as good as their DirectX counterparts. It makes the future of the Vulkan API sound ever more exciting.
Quoting: Shmerl
Quoting: elmapul
wtf?
Ray tracing is the holy grail of computer graphics.
Maybe RTX, their dedicated cores, may be a gimmick, but ray tracing?
We aren't talking about ray tracing, we're talking about Nvidia's implementation. See my post above. What they did is not a miracle, it's a gimmick. Once someone makes serious real-time ray tracing on commodity hardware, you can call it a miracle. Nvidia did nothing of the sort.
If it's more affordable than a supercomputer, or can do it in under 16 ms, it is a miracle; it doesn't matter whether it needs an ASIC to do it or not.
Quoting: Guest
Quoting: elmapul
Quoting: Guest
Quoting: elmapul
Quoting: Shmerl
Quoting: Eike
It's just a matter of time.
That said, I avoided buying a GTX 2000, because at the moment, it feels more like an expensive gimmick.
It is a gimmick, more of a marketing tool than a really useful feature. To achieve good-quality real-time ray tracing you need really powerful hardware, and what can fit in a single GPU gives at best some minor enhancement to the lighting, and as I said above, it naturally comes at the cost of everything else.
wtf?
Ray tracing is the holy grail of computer graphics.
Maybe RTX, their dedicated cores, may be a gimmick, but ray tracing?
That is exactly why the computer graphics industry had to use countless other gimmicks: because they didn't have real-time ray tracing. What Nvidia did was a miracle that was later followed by others. Sure, it's not as good as ray tracing the entire frame, the same way that Eevee (in Blender) is not as good as Cycles, but it's close enough.
Rendering in 16 ms what usually takes hours on a much better machine is no small deal. Sure, it's not as good, but it's impressive nonetheless.
One thing I hate about gamers in general is how clueless they are. I don't give a fuck about 4K; ray tracing is a serious technology, 4K is just a gimmick. But when they realized they would have to give up 4K to play with ray tracing, what did they do? They trash-talked the technology, and that's the reason it didn't sell as it should have.
Sure, there are other factors too, like games that aren't really optimized for it, but seeing the reception this technology got just disgusts me.
I always love it when clueless people call others clueless... It is funny.
No, Nvidia performed no such "miracle". Nvidia just caught up with AMD's hardware architecture after many years. Nvidia's GPUs for the better part of this decade were lagging behind in technology: they lacked async compute (VERY important; if games actually utilized it we would be seeing superior games), they lacked shading power and relied on geometry and brute force, they over-tessellated everything just to win benchmarks, and so on.
RTX is just some CUDA shaders that perform ray-tracing effects. That is why Nvidia, after some months, enabled the 1000 series to have RTX too... It was just software. And guess what: architecturally, Vega and Navi from AMD could run RTX just as efficiently, if Nvidia allowed their shaders to be translated to AMD hardware legally... Oops, I guess now they did.
4K is not a "gimmick". Alongside HDR, it can enhance graphical fidelity considerably. They do cost a lot of resources, but if I had the choice between 4K/HDR and RTX at 1080p, I would pick 4K every single time. Why? Because most effects RTX performs can be done with traditional techniques and be quite good looking, while 4K and HDR literally upgrade the detail level of the whole screen. So yeah, RTX is a gimmick.
First off, I don't know about the hardware details; Nvidia may lag behind on that front, but from a software point of view, ray tracing in real time is impressive (even more so if they did it on hardware as bad as you described).
It doesn't matter whether it was a hardware innovation or a software innovation, it's impressive either way!
Saying "it was just software" or "it's just a shader" doesn't explain how they could make such a shader. If it was so easy, people would have figured it out before.
"if Nvidia allowed their shaders to be translated to AMD hardware legally.... Oops, i guess now they did."
the fact that nvidia hold the patent for that proves that they did it first.
why didnt amd did it before? or any game developer? because it isnt easy to do.
And second, 4K is no guarantee of quality.
If I want to make a 4K game, I can just make a game with Atari graphics and it will run on a PS3, a PS2, maybe even an N64 (if you ignore that those platforms don't have the proper output capabilities).
The issue with 4K is that you are giving up something else to increase the resolution.
Nowadays you can run old games at high resolution in some emulators. Pick an N64 game at 4K versus a movie in full HD: which has better graphics?
Until games look as good as movies (with fur and so on), I don't see the reason to increase the resolution. Full HD is good enough; 4K is bullshit. I'd rather have double the polygon count than double the resolution (quadruple, if you count x and y).
"Because most effects RTX performs can be done with traditional techniques and be quite good looking,"
That is why ray tracing is such a revolution: now developers don't have to use those techniques (those gimmicks), they can just focus on making the game.
Instead of baking the lights and then making objects immovable so you can't break the illusion that the shadows are real, you just simulate light.
Baking shadows is a gimmick.
In current-gen games it's hard to show the true advantage, because games were designed with that limitation in mind, but now developers can get rid of it.
This video explains the issue better than I can:
https://www.youtube.com/watch?v=lyfTPG-dwOE
First of all, it would be nice if you didn't comment on things you have no knowledge about. Sorry to be harsh, but I am tired of wasting time responding to people who clearly haven't done their homework... mostly kids and teenage gamers...
"Saying "it was just software" or "it's just a shader" doesn't explain how they could make such a shader. If it was so easy, people would have figured it out before."
You realize ray tracing is a very old technique, right? Ray tracing is not new, it is decades old, and you can google it and educate yourself. People had already figured it out; the issue has always been not having enough transistors in consumer hardware to make it feasible in real-time video games... The reason modern GPUs began implementing it is that we are at 7 nm these days, there is no rush to reach higher resolutions (4K is more than enough for the next decade), and there are no new raster techniques that need the extra juice. So they can use any extra transistors to begin transitioning to ray tracing this decade.
Nvidia didn't invent anything, really. They just use GPU shaders to perform some effects. Even the dedicated hardware they use on some models is really just specialized shaders. It is not some newfound "magic" Nvidia invented, despite their marketing claiming it is. That is what I am combating, this ridiculous Nvidia propaganda. I hate Nvidia with a passion, because they are a detriment to the hardware industry: they enjoy a dominant position in the GPU market while selling pure air and marketing hype.
As for 4K, sorry to burst your bubble, but any 1080p game will have significantly worse visuals than 4K on a proper screen. You clearly haven't gamed at 4K... It is a great improvement all around. On the other hand, right now video games are not going to be fully ray-traced; ray tracing is used mostly for some effects, it REPLACES some effects. The difference is there, ray tracing is definitely more realistic, but for the most part the difference is barely noticeable during gameplay. So if someone has to choose between slightly more realism at 1080p that they won't really notice while gaming, and a little less realism at 4K, I think the vast majority of people are going to pick 4K. It just adds more visual quality.
Obviously, N64 games at 4K won't look significantly better than at 1080p, because their textures are tiny; even by 1995 standards N64 games had crappy textures due to RAM restrictions. You can't notice the difference in detail on flat 1995 textures... But we are going to play games like Cyberpunk 2077 at 4K, with 8 GB+ of VRAM... If you think you can display all the detail of games made in 2020 and beyond at 1080p, you are delusional. We are not talking Super Mario 64 here...
I am sorry, but you are clearly an Nvidia fanboy here. I can smell it from a mile away; you just want to support Nvidia propaganda. When next-gen AAA games arrive, people will be disabling ray tracing in droves to gain FPS and the ability to play at higher resolutions, unless developers deliberately stop implementing the same effects with traditional techniques in order to push for more ray-tracing adoption...
"You realize ray tracing is a very old technique, right? Ray tracing is not new, it is decades old, and you can google it and educate yourself. People had already figured it out; the issue has always been not having enough transistors in consumer hardware to make it feasible in real-time video games..."
W.R.O.N.G.
They didn't brute-force it; they took a few samples and extrapolated from them.
We still don't have enough power to do it in real time without some tricks.
And the fact that it can work on older hardware proves that it's not brute force in hardware but code, a clever algorithm.
Even if it was purely hardware-based, they would deserve some credit for making 7 nm hardware; that isn't easy either.
" N64 games had crappy textures due to ram restrictions. "
wrong again, the issue was to make the textures fit into the catridge.
"you are delusional. We are not talking Super Mario 64 here...."
that was just an example on how resolution means nothing in pratice.
people are criticizing doom eternal for not being 4k (on xbox or stadia) but they would have to sacrifice something else more important to render it in 4k.
like disable some shaders, or something else, they did the right choice, the 1800 resolution wont kill anyone, but consumers are stupid.
"I am sorry, but you are clearly an Nvidia fanboy here."
I'm not an Nvidia fanboy; they did a lot of shit in the past:
https://www.youtube.com/watch?v=ZcF36_qMd8M
But that doesn't mean everything they do is shit. I recognize when they do something impressive, and saying it's just brute force is quite dumb.
Ray tracing in real time is impressive. If it weren't, CryEngine, Microsoft and AMD would not be racing to implement it in their technology (especially after Nvidia proved it was possible).
RTX may be bullshit, since it was proven that the specific dedicated hardware was not necessary, but ray tracing is not.