
While DLSS has technically been available in the NVIDIA drivers for Linux for some time now, the missing piece was support for Proton, which will be landing tomorrow, June 22.

In one of their GeForce blog posts, NVIDIA made it very clear:

Today we’re announcing DLSS is coming to Facepunch Studios’ massively popular multiplayer survival game, Rust, on July 1st, and is available now in Necromunda: Hired Gun and Chernobylite. Tomorrow, with a Linux graphics driver update, we’ll also be adding support for Vulkan API DLSS games on Proton.

This was originally revealed on June 1 along with the GeForce RTX 3080 Ti and GeForce RTX 3070 Ti announcements. At least now we have a date for part of this extra DLSS support on Linux. As stated, it will be limited to games that natively use Vulkan as their graphics API, which is a short list including DOOM Eternal, No Man's Sky, and Wolfenstein: Youngblood. Support for running Windows games that use DirectX with DLSS in Proton will arrive "this Fall".
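
As a side note on what "Vulkan API DLSS games" means in practice: DLSS in these titles is provided by the driver through NVIDIA's NGX runtime, which is generally understood to sit on top of the VK_NVX_binary_import and VK_NVX_image_view_handle device extensions. Below is a minimal sketch of how you could probe for them with plain Vulkan calls; the extension names are real Vulkan identifiers, but tying them to DLSS is our reading of NVIDIA's public NGX material, not something stated in the blog post.

// Hypothetical capability probe, not from the article: checks whether a
// physical device exposes the NVX extensions commonly associated with the
// NGX/DLSS runtime. Assumes a VkPhysicalDevice was already selected.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

static bool hasDeviceExtension(VkPhysicalDevice gpu, const char* wanted)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const VkExtensionProperties& e : exts)
        if (std::strcmp(e.extensionName, wanted) == 0)
            return true;
    return false;
}

bool looksDlssCapable(VkPhysicalDevice gpu)
{
    // Assumption: the NGX runtime relies on these two extensions.
    return hasDeviceExtension(gpu, "VK_NVX_binary_import")
        && hasDeviceExtension(gpu, "VK_NVX_image_view_handle");
}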

With that in mind, it's likely we'll see the 470 driver land tomorrow, unless NVIDIA have a smaller driver coming first with this added in. We're excited for the 470 driver as a whole, since it will include support for async reprojection to help VR on Linux, plus hardware-accelerated GL and Vulkan rendering with Xwayland.


x_wing Jun 22, 2021
Quoting: 3zekiel
And Nvidia is anti-competitive, yes ... I mean, I would do the same as them in their position, and frankly most sane people would, so I have a hard time criticizing them. At the same time, they are not a charity but a company; they are supposed to be making money, not give kisses and hugs to everyone.
My only wish is that they open source the core driver, which it makes absolutely no sense, from a business perspective, to keep closed source. It would keep everyone happy too, as those who do not want proprietary features could ignore them.

Nobody is expecting them to behave as a charity. But creating standards is not about charity; it's about creating a sustainable market environment and allowing it to evolve for the better.

Quoting: 3zekiel
As for PhysX, it has been embedded directly in engines for a long time already, and the point for them is not so much whether many games use it; at some point it was the cool thing that made you buy an Nvidia GPU. That's really all that matters to them, and it was a clear win on that point. It was also still used in Metro: Last Light at least, not sure about Redux. I'd say it has mostly been phased out by new techs - I remember it was used for some lighting, which for example would be replaced by RT now. Once a feature like that is used up, you just do the next one. It also profits everyone eventually, since the competitors will implement an alternative, potentially cross-vendor and cross-platform. Or it will just become a de facto standard, depends.

As I answered slaapliedje, the innovation was to move physics calculations onto the GPU, but they completely failed, mostly because of their crappy proprietary API strategy.

Quoting: 3zekiel
The win I present for us is the subject of the news: that is, Nvidia cares enough about us to support its features here. And I did throw some salt at AMD for their (lack of) RT support, but also OC that came after a long time, etc ... (yeah, I do not forgive easily, I know).

Which can also be seen as a marketing move. I mean, we didn't get DLSS Proton support until AMD came up with FSR, and what a coincidence that we get the Nvidia "new Linux feature" right on top of the AMD FSR article.
3zekiel Jun 22, 2021
Quoting: x_wing
Quoting: slaapliedje
Quoting: x_wing
https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

40 games in ten years... I call that far from a success.
This is kind of a false pretense. The PhysX engine has been built into GPUs for years now, so special support for it is no longer a thing. So 40 sounds about right. New games for the most part just use the hardware if they need/want to.

40 games in ten years, and most of them (if not all) sponsored by Nvidia. And as far as I know, most game physics nowadays still runs on the CPU. So, the idea was to accelerate physics execution using the GPU, but their reluctance to make a standard made them fail, and 15 years after the first release of PhysX we are still using the CPU. IMO, that's a failure.

40 big games for such a feature is okay. Also, it is in fact open: https://github.com/NVIDIAGameWorks/PhysX. It does indeed seem to be used a lot on the CPU, but that is likely not a question of openness; it just does not make much difference nowadays. From what I read, some games do win from using the GPU (PUBG - https://www.reddit.com/r/PUBATTLEGROUNDS/comments/c17kol/is_it_safe_to_put_the_physx_settings_from_auto/) while others see no difference (Rocket League). No idea why, but well. It seems every Unreal Engine game can potentially use it, and you have an option on whether you put it on auto-GPU or CPU (I am mostly browsing Reddit and co, so don't quote me too much either).
So yeah, far from a loss, I would say? Once again, it did turn out to be a win in terms of image. So I doubt Nvidia sees it as a loss.
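
For illustration, this is roughly what that auto-GPU/CPU switch amounts to at the SDK level. The sketch below targets the open-sourced PhysX 4.x API from the GitHub link above; the useGpu flag and the silent CPU fallback are our own framing of how an engine might wire it up, not code from any particular game.

// Minimal sketch: one PhysX scene description that runs rigid-body work on
// the CPU by default, and opts into GPU dynamics when a CUDA context exists.
#include <PxPhysicsAPI.h>

using namespace physx;

PxScene* createScene(PxFoundation& foundation, PxPhysics& physics, bool useGpu)
{
    PxSceneDesc desc(physics.getTolerancesScale());
    desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4); // 4 worker threads
    desc.filterShader  = PxDefaultSimulationFilterShader;

    if (useGpu) {
        // GPU rigid bodies need a CUDA context; on non-NVIDIA hardware this
        // fails and we simply keep the CPU-only configuration above.
        PxCudaContextManagerDesc cudaDesc;
        PxCudaContextManager* cuda = PxCreateCudaContextManager(foundation, cudaDesc);
        if (cuda && cuda->contextIsValid()) {
            desc.cudaContextManager = cuda;
            desc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
            desc.broadPhaseType = PxBroadPhaseType::eGPU;
        }
    }
    return physics.createScene(desc);
}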

Overall, it is not a loss for us, since the CPU implementation seems to be pretty fast now. So win-win?
It is the same story, in a way, as G-Sync vs FreeSync. Nvidia spearheaded the effort, did the R&D and marketing, then locked it in. That pushed competitors, who already had a reference so it was easier for them, to propose an alternative. And bam, we got FreeSync. Better yet, Nvidia gracefully allowed the use of FreeSync on their GPUs (which means not so locked-in and evil; they could really just have dropped the price on G-Sync and "sponsored" it to death), and turned G-Sync into a quality indicator: not supported == we did not test it, and likely the panel quality is meh, expect flicker; FreeSync validated (or whatever the name) == we tested it and it's cool; G-Sync == massively tested, with very high quality standards. Now, whether you want to pay for the difference between "validated" and G-Sync is your own affair. There is some gain, but for me it is clearly not worth it. Other people might think it is a vital difference. But in the end, everyone wins: we have a feature which probably would never have been implemented if not spearheaded by them, and a "seal of quality" now that they adopted the new standard. That is also why I do not particularly hate them; they do spearhead a lot of stuff we got as standard today.
They are also fixing the stupid stuff they did before (the virt io lock). So I am much more "kind" to them than a year or a year and a half ago.

I do appreciate that AMD is trying to catch up, and that they open the result, thus participating in getting everyone together after the front runner opened the path. I appreciate their code-drop approach less, but many companies do that ... so I can't completely blame them either.
So far, if you want real open source support (as in, working upstream and ahead of time), only Intel does that. Will they still do it with DG2? Time will tell; if so, and if XeSS is good and supported, then count me in.
3zekiel Jun 22, 2021
Quoting: x_wing
Quoting: 3zekiel
The win I present for us is the subject of the news: that is, Nvidia cares enough about us to support its features here. And I did throw some salt at AMD for their (lack of) RT support, but also OC that came after a long time, etc ... (yeah, I do not forgive easily, I know).

Which can also be seen as a marketing move. I mean, we didn't get DLSS Proton support until AMD came up with FSR, and what a coincidence that we get the Nvidia "new Linux feature" right on top of the AMD FSR article.

Of course it is a marketing move, but it means we matter, which is by far the most important thing. If we did not, they would just not implement it. They also supported virt io properly, and are coming along on multiple other features (Wayland, NvFBC when using G-Sync or G-Sync Compatible, etc).
Also, they dropped the headers quite a while ago, so I don't think it was originally done with FSR in mind. Most likely, the date is because of FSR, not the feature. At that time there was no reaction from the community, though; no implementation or anything that I could see, at least, so it seems they lost patience and pushed it themselves. Maybe they just did not open it up enough, too.

What you are saying here is that competition is good and makes companies more consumer-friendly. Which I 100% agree on. I am also not wishing death on AMD or anything; I just want them to push forward more. For now, I am still disappointed by their Linux support, and hope for them to do better. I also wish they would be cleaner in their marketing for FSR. And I want Intel to enter the market in full force too, and set another standard of good support. Hell, if a fourth one could enter, it would be even better. Strong competition will force differentiation, but at the same time it will accelerate the standardization of the most important differentiators (since every competitor will cooperate to take the crown and make a standard to share it).
CatKiller Jun 22, 2021
Quoting: x_wing
So, the idea was to accelerate physics execution using the GPU, but their reluctance to make a standard made them fail, and 15 years after the first release of PhysX we are still using the CPU.
No, the idea was that you'd buy a separate card just for accelerating physics calculations. But that was silly: no one was going to buy a card just for that, and no one was going to put support into their game for something that no one had. So Nvidia bought the company and made it so that you could run those calculations on the GPU that you already had. Then they open sourced it some time later.
x_wing Jun 22, 2021
Quoting: CatKiller
Quoting: x_wing
So, the idea was to accelerate physics execution using the GPU, but their reluctance to make a standard made them fail, and 15 years after the first release of PhysX we are still using the CPU.
No, the idea was that you'd buy a separate card just for accelerating physics calculations. But that was silly: no one was going to buy a card just for that, and no one was going to put support into their game for something that no one had. So Nvidia bought the company and made it so that you could run those calculations on the GPU that you already had. Then they open sourced it some time later.

IIRC, the first sample of PhysX I saw was in 2005, and it was from the company that created the tech, using dedicated hardware that was at a very early stage (I'm almost sure their dedicated solution never got to market). The moment Nvidia bought that company, their strategy was to implement that solution in the GPU. So, Nvidia wanted to move physics calculation onto the GPU as a use case of GPGPU. But they fucked up with that proprietary API, which only became open source long after the hype was gone. That's my point.

Time will tell what will win. But I'm confident in saying that Nvidia will fuck up once again.
CatKiller Jun 22, 2021
Quoting: x_wing
IIRC, the first sample of PhysX I saw was in 2005, and it was from the company that created the tech, using dedicated hardware that was at a very early stage (I'm almost sure their dedicated solution never got to market).

The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

Quote
The moment Nvidia bought that company, their strategy was to implement that solution in the GPU. So, Nvidia wanted to move physics calculation onto the GPU as a use case of GPGPU.

Of course they did. Buying an extra PPU was silly, but GPGPU is great. And of course they wanted it to be a market differentiator to make back the purchase price, particularly as Intel had just bought Havok at the time.
slaapliedje Jun 22, 2021
Quoting: CatKiller
Quoting: x_wing
IIRC, the first sample of PhysX I saw was in 2005, and it was from the company that created the tech, using dedicated hardware that was at a very early stage (I'm almost sure their dedicated solution never got to market).

The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

Quote
The moment Nvidia bought that company, their strategy was to implement that solution in the GPU. So, Nvidia wanted to move physics calculation onto the GPU as a use case of GPGPU.

Of course they did. Buying an extra PPU was silly, but GPGPU is great. And of course they wanted it to be a market differentiator to make back the purchase price, particularly as Intel had just bought Havok at the time.
Yeah, the idea of offloading physics calculations to an extra GPU was awesome. It's also not only a gaming feature, by the way. The first game I remember utilizing it was Ghost Recon: Advanced Warfighter. Excellent game, but I think it's still one where PhysX won't work under Wine to this day :( That's one of those games that, for the longest time, you couldn't play at max detail unless you had specific hardware, or it was SLOW...

Nvidia has done pretty well for themselves, considering they bought 3Dfx and many other technologies as they went along. Sure, their 'OMG, Ray Tracing!' was a little stupid for those of us who have known ray tracing has been a thing for decades, but it is still 'OMG, realtime Ray Tracing!', which is actually rather phenomenal for consumer-grade cards to have such a feature.

People in the Linux community are interesting: there are some who are like 'Awesome, they support us with a driver that actually covers all of the features!' and then there are those who are 'OPEN SOURCE or GTFO!' I understand both, but for now I can't find an AMD graphics card, and I did finally find an Nvidia one, and outside of Optimus shenanigans, I've never had any issues with Nvidia's hardware/drivers.
Eike Jun 23, 2021
Quoting: CatKiller
The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

I googled and found some... on Geocities. (So that's the age we're talking about. :D )

http://www.geocities.ws/nagaty_h/hardware/asus_physx_p1.htm


Last edited by Eike on 23 June 2021 at 9:02 am UTC
CatKiller Jun 23, 2021
Quoting: Eike
Quoting: CatKiller
The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

I googled and found some... on Geocities. (So that's the age we're talking about. :D )

http://www.geocities.ws/nagaty_h/hardware/asus_physx_p1.htm
Well, not quite Geocities' heyday, but it was a while ago. It was the PS3 era, and MySpace was the world's biggest social network. AMD had a really terrible open source driver and a really terrible proprietary driver (I won't say the name in case it triggers flashbacks), and were selling off their fabs because they'd run out of money. Intel was switching to the Core architecture after the failures of Netburst and Itanium. YouTube was full of videos showing off Compiz, and Ubuntu had released a "Long Term Support" version called Dapper Drake.
slaapliedje Jun 23, 2021
Quoting: CatKiller
Quoting: Eike
Quoting: CatKiller
The PPUs definitely existed. I doubt that many got sold, because the business case for them was rubbish, but you could get pre-built gaming machines with them in. The technology was also in a bunch of console games before Nvidia bought Ageia.

I googled and found some... on Geocities. (So that's the age we're talking about. :D )

http://www.geocities.ws/nagaty_h/hardware/asus_physx_p1.htm
Well, not quite Geocities' heyday, but it was a while ago. It was the PS3 era, and MySpace was the world's biggest social network. AMD had a really terrible open source driver and a really terrible proprietary driver (I won't say the name in case it triggers flashbacks), and were selling off their fabs because they'd run out of money. Intel was switching to the Core architecture after the failures of Netburst and Itanium. YouTube was full of videos showing off Compiz, and Ubuntu had released a "Long Term Support" version called Dapper Drake.
There are some dedicated PhysX cards on eBay. Weirdly, they're PCIe; my memory through the years would have insisted that they were PCI!
https://www.ebay.com/itm/362488709628?hash=item546602c1fc:g:io4AAOSw8R9b7tcM