While DLSS has been technically available in the NVIDIA drivers for Linux for some time now, the missing piece was Proton support, which will be landing tomorrow, June 22.
In one of their GeForce blog posts, they made it very clear:
Today we’re announcing DLSS is coming to Facepunch Studios’ massively popular multiplayer survival game, Rust, on July 1st, and is available now in Necromunda: Hired Gun and Chernobylite. Tomorrow, with a Linux graphics driver update, we’ll also be adding support for Vulkan API DLSS games on Proton.
This was originally revealed on June 1 along with the GeForce RTX 3080 Ti and GeForce RTX 3070 Ti announcements. Now we at least have a date for part of this extra Linux DLSS support. As stated, it will be limited to games that natively use Vulkan as their graphics API, which is a short list including DOOM Eternal, No Man’s Sky, and Wolfenstein: Youngblood. Support for running Windows games that use DirectX with DLSS in Proton will arrive "this Fall".
With that in mind, it's likely we'll see the 470 driver land tomorrow, unless NVIDIA have a smaller driver coming first with this added in. We're excited for the 470 driver as a whole, since it will include support for async reprojection to help VR on Linux, plus hardware accelerated GL and Vulkan rendering with Xwayland.
Quoting: Guest
i would be more excited if nvidia open sourced it. along with their drivers. rather than keeping everything behind proprietary, closed up source. especially considering they are not bothering adding support to older gpu's. which many of those older gpu's still offer amazing performance. like the 1080 ti., two generations old.

Supporting pre-RTX GPUs is not possible. DLSS heavily uses tensor cores, which are only present on RTX 2000+ GPUs. The reason FidelityFX can work on older GPUs is that it is a good old upscale filter. It is NOT an equivalent of DLSS, even though they market it as such... (We will see hands-on results, but I feel disappointment coming. Look at the videos they showed during the presentation: on the moving ones you will see the blur and jaggies coming from "dumb" upscales.) Upscaling works well for static scenes, but as soon as you have movement there is only so much you can do, and it will look bad... Comparatively, Nvidia's solution uses a neural network to infer lost information/pixels, thus reconstructing the image and movement much more precisely, with little to no blur and jaggies.
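For a concrete picture of the difference, here is a minimal Python/NumPy sketch of a plain spatial upscale, the kind of filter that works from a single frame. It is purely illustrative and not how either vendor actually implements their scaler; the point is that interpolation only blends pixels that already exist, whereas a DLSS-style approach also feeds previous frames and the game's motion vectors into a trained network so it can reconstruct detail no single frame contains:

```python
import numpy as np

def bilinear_upscale(frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """'Dumb' spatial upscale of a grayscale frame: interpolate between
    existing pixels. No new information is created, which is why fine
    detail blurs and edges alias, especially once the scene moves."""
    h, w = frame.shape
    ys = np.linspace(0, h - 1, h * scale)   # target row coordinates
    xs = np.linspace(0, w - 1, w * scale)   # target column coordinates
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                 # vertical blend weights
    wx = (xs - x0)[None, :]                 # horizontal blend weights
    top = frame[np.ix_(y0, x0)] * (1 - wx) + frame[np.ix_(y0, x1)] * wx
    bot = frame[np.ix_(y1, x0)] * (1 - wx) + frame[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

low = np.random.rand(540, 960)           # a 960x540 internal render
high = bilinear_upscale(low, scale=2)    # 1920x1080 output: smoother, not sharper
# A DLSS-style reconstructor would additionally take the previous
# high-res output, warped by motion vectors, as network input.
```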
Quoting: Guest
its going to be interesting to see when AMD's alternative lands on linux. at least on windows their version will be cross compatible. their own demo was done on a 1060. software lockin's are extremely unethical.

Even if Nvidia wanted to port it over, they can not. AMD lacks the HW support for the feature. It is not a sw lock-in. It is just that they have an exclusive HW feature.
CUDA is a sw lock-in on the other hand, since it theoretically could run on other GPUs (albeit it would likely then lose the advantage of being slimmer than OpenCL).
For DLSS, they could emulate it on older/AMD GPUs, but it would most likely reduce performance instead of enhancing it (convolution and other inference methods are very heavy without dedicated hw or a customized ISA, and it would occupy normal cores for naught), which would make no sense.
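A quick back-of-the-envelope calculation shows why. The Python sketch below uses assumed layer sizes, since the actual DLSS network architecture is not public, and estimates what a single convolution layer would cost at 4K if it ran on ordinary shader cores:

```python
# Rough FLOP count for ONE 3x3 convolution layer at 4K output.
# Channel counts are assumptions for illustration only; the real
# DLSS network architecture is not public.
width, height = 3840, 2160   # 4K output resolution
c_in, c_out = 32, 32         # assumed feature-map channel counts
taps = 3 * 3                 # 3x3 kernel
macs = width * height * c_out * c_in * taps   # multiply-accumulates
flops = 2 * macs                              # 1 MAC = 2 FLOPs
print(f"{flops / 1e9:.0f} GFLOPs per frame for this one layer")  # ~153
print(f"{flops * 60 / 1e12:.1f} TFLOPS sustained at 60 fps")     # ~9.2
# ~9 TFLOPS for a single layer is most of a GTX 1080 Ti's entire
# FP32 throughput (~11 TFLOPS) -- hence tensor cores, not shader cores.
```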
Quoting: 3zekiel
Even if Nvidia wanted to port it over, they can not. AMD lacks the HW support for the feature. It is not a sw lock-in. It is just that they have an exclusive HW feature. [...]
You can still create an open standard in order to implement it; it's not about what your competitors can do with their current hw, but about how you let the industry evolve with your technology. Nvidia's strategy is simply anti-competitive: they don't want to be the best, they just want to keep you tied to their brand.
The saddest part is they have been repeating the same stupid proprietary strategy time after time, and it always ends up in failure. Let's hope that once again they fail (and looking at how they have been pushing more titles and this Proton support, they are definitely afraid).
Quoting: Guest
i would be more excited if nvidia open sourced it. along with their drivers. rather than keeping everything behind proprietary, closed up source. [...]

My understanding of how DLSS works is that it offloads the work to server farms. Not sure how open sourcing something like this would benefit anyone outside of another large company that wants to do the same thing. You would still need a render farm to analyze the output and upscale, right?
Quoting: Guest
its going to be interesting to see when amd's alternative lands on linux. at least on windows their version will be cross compatible. their own demo was done on a 1060. software lockin's are extremely unethical.
I know, it'd be nice if everything was open source, but sadly it isn't. I mean, I think it'd be nice if Outlook were open sourced so people could go in there and either A) try to make it less of a steaming nugget of shit, or B) figure out how it connects to o365 and make a better client (or just improve Evolution / KMail). So tired of using Outlook, can you tell?
Regardless, a lot of times Linux gets left behind when GPU makers come up with new features, but finally it seems they find it more beneficial to support every operating system equally. Granted, nvidia doesn't support Mac at all, which I still find amusing.
Quoting: slaapliedje
Granted, nvidia doesn't support Mac at all, which I still find amusing.

They can't. On Windows and Linux, the GPU vendor provides the API implementation. On Macs, Apple do. Apple and Nvidia had a falling out, so no more Nvidia hardware in Macs, so no support from Apple for Nvidia hardware in Macs.