
It seems to be a busy weekend! NVIDIA have put out a new version of their Vulkan beta driver and it's an interesting one.

Today, NVIDIA 415.22.05 became available and, as expected of this driver series, it adds new Vulkan extensions. Specifically, it adds support for VK_KHR_depth_stencil_resolve, VK_EXT_buffer_device_address, VK_EXT_memory_budget, VK_EXT_memory_priority (currently Windows-only) and VK_EXT_pci_bus_info.

The extra interesting bit is the improvement listed for this driver version: "Better pipeline creation performance when there is a cache hit". That alone makes it an interesting driver to test out. Good to see NVIDIA continue working on performance!

Find the driver info here.
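If you do install the beta, you can confirm which driver version is actually loaded. The snippet below reads /proc/driver/nvidia/version, the file the NVIDIA kernel module exposes on Linux; on a machine without the module loaded it just prints a fallback message:

```shell
# Show the loaded NVIDIA driver version. /proc/driver/nvidia/version only
# exists while the NVIDIA kernel module is loaded, hence the fallback.
cat /proc/driver/nvidia/version 2>/dev/null || echo "NVIDIA kernel module not loaded"
```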

For those on Ubuntu wishing to test the beta driver, there is this PPA, which sadly hasn't been updated since October last year. Hopefully it will get moving again sometime soon. I'm unsure how other distributions handle beta drivers like this; hopefully they make it easy.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
The comments on this article are closed.

YoRHa-2B Jan 6, 2019
Quoting: jens
Back on topic: "Better pipeline creation performance when there is a cache hit" indeed sounds interesting; I would be curious to know how much of a performance win it yields in real-life scenarios.
The way they describe it, it's going to yield exactly 0% in scenarios where performance would actually matter.

Still a nice-to-have improvement since it might slightly reduce loading times in case you already have a populated shader cache for a given game, but that's basically it.


Last edited by YoRHa-2B on 6 January 2019 at 11:45 pm UTC
CFWhitman Jan 7, 2019
Quoting: Avehicle7887
I see many people saying Nvidia drivers are bad etc. without mentioning a few key points. Sure, they are closed source and not as friendly as others in the Linux world, but:

- They've offered Linux support for a long time, I still remember NV drivers back when Ubuntu was all the rage (Ubuntu 10.04 etc).

I used Nvidia with their closed source drivers at least as early as 2001 on Slackware and Linux Mandrake when I was playing Loki Entertainment releases for Linux. It was a bit more manual at the time than it is today.

Still, I switched to AMD video cards for the desktops I put together and ran Linux on at home about seven years ago, when I decided the open source drivers were more future proof. Now I have a Vega 56 card in this machine.

The only issue with my current approach is that, though the open source approach is more future proof, you find yourself on the bleeding edge when you first get a new card, at least some of the time. I have Ubuntu Studio 18.10 on this machine, but I still have to use the Padoka PPA and a Ukuu-loaded kernel to get Rise of the Tomb Raider to run correctly. Of course, so far it runs very well and looks very good even with the settings quite high (that is, the highest standard setting).


Last edited by CFWhitman on 7 January 2019 at 3:46 am UTC
Shmerl Jan 7, 2019
Quoting: CFWhitman
The only issue with my current approach is that, though the open source approach is more future proof, you find yourself on the bleeding edge when you first get a new card, at least some of the time. I have Ubuntu Studio 18.10 on this machine, but I still have to use the Padoka PPA and a Ukuu-loaded kernel to get Rise of the Tomb Raider to run correctly. Of course, so far it runs very well and looks very good even with the settings quite high (that is, the highest standard setting).

That's expected if you are relying on the upstream kernel for the graphics driver. Support for new hardware is commonly not backported to older kernels, so you need to run the most recent one if you have very new hardware.
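As a quick way to check the situation described above, these standard commands show the running kernel and, where available, the amdgpu driver version built for it (the amdgpu part will simply report nothing useful on non-AMD systems):

```shell
# Which kernel is running? Very new GPUs typically need a recent one.
uname -r

# If the kmod tools are installed, report the amdgpu module version for
# this kernel (prints a fallback message on systems without the module).
if command -v modinfo >/dev/null 2>&1; then
    modinfo amdgpu 2>/dev/null | grep '^version' || echo "no amdgpu module found for this kernel"
fi
```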


Last edited by Shmerl on 7 January 2019 at 4:58 am UTC
Purple Library Guy Jan 7, 2019
Quoting: jens
Quoting: Shmerl
Quoting: jens
So you had an interview with somebody from the Nvidia management?

It's not developers' decision, or you expected otherwise for some reason?

You called their management people jerks, so I guess you have talked to one of them and can justify your statement? See above: if you don't like their decisions, that's fine and you are right to feel that way. But calling someone a jerk you have never talked to (my assumption) is just cheap and beneath any standard.
I've never talked to Mussolini, and yet I feel fairly confident in calling him a jerk and reject the idea that it is cheap to do so.

If people's actions and statements are on the public record one can often be quite justified in forming judgements about their character without actually meeting them. I don't personally know whether the information about NVidia management is such as to justify such a judgement, but it certainly could be; insisting personal contact is needed is just mistaken.
And actually contrariwise, it's also possible to have met and talked to someone and not have the information you would need to determine whether they're a jerk. So personal contact is neither necessary nor sufficient for knowledge of jerk-hood.
mahagr Jan 7, 2019
Quoting: dubigrasu
At the same time, I don't remember the drivers getting stuck on performance mode while in desktop mode. Sure, the modes alternated depending on desktop activity, but stuck on max power? No.

For me my GTX 1080 Ti is always in the p0 state (max clocks) until I switch to the console. There's a known issue for this in nVidia bug tracker as well as a public thread in nVidia forums. I am running my computer with two 4K G-Sync monitors, which seems to make the issue worse. Windows shares the same power management code, but the difference between Windows and Linux is that in Windows they have information if more draw calls are coming to the pipeline or not, which allows the graphics card to go to a lower power state earlier.

I agree that keeping the clocks high is great when you're gaming (but only if your game needs 100% of your GPU), but it's not great if your card runs hot 24/7 and never stops the fans because power saving doesn't work for a few users.

Do not get me wrong: I am and have been an nVidia user for a long time (I just threw away a broken GT 8800 card along with some other old hardware) and I will likely be using nVidia graphics cards in the future, too.

PS. Regarding my first comment on nVidia Linux driver quality: that came from an nVidia employee who I know. I have also worked in a few companies where the main reason not to release source code was bad code quality (not because of bad workers, but because there was no time to polish the code).
dubigrasu Jan 7, 2019
Quoting: mahagr
Quoting: dubigrasu
At the same time, I don't remember the drivers getting stuck on performance mode while in desktop mode. Sure, the modes alternated depending on desktop activity, but stuck on max power? No.

For me my GTX 1080 Ti is always in the p0 state (max clocks) until I switch to the console. There's a known issue for this in nVidia bug tracker as well as a public thread in nVidia forums. I am running my computer with two 4K G-Sync monitors, which seems to make the issue worse. Windows shares the same power management code, but the difference between Windows and Linux is that in Windows they have information if more draw calls are coming to the pipeline or not, which allows the graphics card to go to a lower power state earlier.

I agree that keeping the clocks high is great when you're gaming (but only if your game needs 100% of your GPU), but it's not great if your card runs hot 24/7 and never stops the fans because power saving doesn't work for a few users.

Do not get me wrong: I am and have been an nVidia user for a long time (I just threw away a broken GT 8800 card along with some other old hardware) and I will likely be using nVidia graphics cards in the future, too.

PS. Regarding my first comment on nVidia Linux driver quality: that came from an nVidia employee who I know. I have also worked in a few companies where the main reason not to release source code was bad code quality (not because of bad workers, but because there was no time to polish the code).

Found the topic and it was an interesting read. I'm also interested in this since I'm bound to get myself a new Nvidia card (currently using AMD).
Looking at the initial report, it doesn't sound that bad to me though: 28 seconds to ramp down. I can live with that as long as the max power mode isn't triggered by any light desktop activity.
But to be exact, you're saying that in your case the card always stays in max power mode and never ever ramps down, even on idle?
I can imagine that running two 4K monitors does indeed put a strain on the card; is that with vsync enabled or disabled?
In any case, if I'm confronted with that (constant max power on), I'll probably look into vsync/frame limiting, or setting the card to power-save mode on the desktop and performance mode only for gaming by means of a script/gamemode/etc.
OTOH, I'm not likely to run more than 1080p any time soon, so I guess I'll be "safe". I also have a two-monitor setup and with my previous Nvidia card I didn't run into this problem.
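The script idea mentioned above can be sketched roughly like this. It assumes the proprietary driver's nvidia-settings tool and its GpuPowerMizerMode attribute (0 = adaptive, 1 = prefer maximum performance); the helper name and GPU index are made up for the example, so check your own setup before using it:

```shell
#!/bin/sh
# Hypothetical helper: switch the first GPU's PowerMizer mode via nvidia-settings.
# 0 = adaptive (power saving), 1 = prefer maximum performance.
gpu_mode() {
    nvidia-settings -a "[gpu:0]/GpuPowerMizerMode=$1"
}

# Intended usage (commented out so the sketch has no side effects):
# gpu_mode 1      # force max performance before launching a game
# some-game       # run the game itself
# gpu_mode 0      # drop back to adaptive afterwards
```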
jens Jan 7, 2019
  • Supporter
Quoting: mahagr
PS. Regarding my first comment on nVidia Linux driver quality: that came from an nVidia employee who I know. I have also worked in a few companies where the main reason not to release source code was bad code quality (not because of bad workers, but because there was no time to polish the code).

Thanks for the additional information. I read your posting as the "usual" NVidia bashing, though with this extra information your statement sounds much more objective. Technical debt may indeed be one of many reasons.

I apologize if my initial response was somewhat offensive.
jens Jan 7, 2019
  • Supporter
Quoting: Purple Library Guy
I've never talked to Mussolini, and yet I feel fairly confident in calling him a jerk and reject the idea that it is cheap to do so.

If people's actions and statements are on the public record one can often be quite justified in forming judgements about their character without actually meeting them. I don't personally know whether the information about NVidia management is such as to justify such a judgement, but it certainly could be; insisting personal contact is needed is just mistaken.
And actually contrariwise, it's also possible to have met and talked to someone and not have the information you would need to determine whether they're a jerk. So personal contact is neither necessary nor sufficient for knowledge of jerk-hood.

Well, yes and no ;)
Your statement is technically completely correct and you have effectively proven that my statement is wrong. :)

That said, my opinion is still that I give people the benefit of the doubt if I don't know them. Let's say "innocent until proven guilty". Furthermore, I draw quite a clear distinction between judging someone's actions ("his actions are jerky") and judging the person themselves ("he is a jerk"). Actually, I'm quite sensitive about this; maybe that's why my reactions here were quite strong. I do agree that there are actual jerks out there who deserve to be called jerks. Though I'm pretty convinced that the typical manager of a company is doing the best within his capabilities to help his company and his team prosper. There are most likely jerks among them, but as stated, I prefer to give the benefit of the doubt and not to judge the person based on actions that seem jerky to me or anybody else. I'm fine with calling some action jerky, but I prefer to keep respect for the actual person until there is really no way to misjudge, as with your example.

Related to that, a question: how offensive is the word "jerk"? As a non-native English speaker, I would give it, let's say, a 6 on a scale from 1 (like you would talk to kids when they behave somewhat clumsily) to 10 (very offensive). Is that about right?


On a side note: I understand that your example was meant to effectively highlight your point and not to compare a random NVidia manager with Mussolini. With the last US election, I decided for myself to skip pseudo-comparisons with the actual criminals of mankind. During that time you could read quite a few columns comparing the new US administration with the German Nazi regime. In response, I read in another column that no matter how "evil" one thinks the new POTUS is, he is not and will most likely never be a mass murderer of millions of people. That is a completely different magnitude. Any comparison like this will not paint a better picture of what to expect in the future, but will only soften/weaken the crimes of the Nazi regime and hurt the victims of that time.
I decided for myself to keep that in mind. As stated, I did not read that comparison in your statement, but I thought I would share this.


Last edited by jens on 10 January 2019 at 7:17 am UTC
mahagr Jan 7, 2019
@dubigrasu

Yes, my card stays on max power as long as the X server is displaying on the screens. It will idle after about 45 seconds if I switch to the console (Ctrl-Alt-F5). I've not tried to SSH into the computer while the screens are off, so I am not sure if the graphics card idles when I'm not around, though my computer does go to sleep after some inactivity. Maybe I should check it out, as it would be interesting to know.

I'm using a GTX 1080 Ti; I'm not sure if I had the same issue when I owned a GTX 1080, at least I didn't notice it with the older card... I know that the slow idle ramp-down of ~45 seconds is common to everyone using recent drivers, but there's another issue where the card never sleeps, which seems to be somewhat related to multiple monitors and running a compositing window manager with OpenGL support.

@jens

No offence taken. If you do some googling, you will see that nVidia has a lot of open source projects, from research to deep learning; they even opened some of their gaming-related libraries. I don't know what their reasons are for keeping the drivers closed source, but some likely reasons are legal issues, code quality or hardware secrets. Even AMD and Intel aren't releasing their closed drivers; they are replacing them with new ones which are open source. I think that nVidia, as the market leader, just hasn't had a reason to do that yet.
Shmerl Jan 7, 2019
Quoting: jens
Though I'm pretty convinced that the typical manager of a company is doing the best within his capabilities to help his company and his team prosper. There are most likely jerks among them, but as stated, I prefer to give the benefit of the doubt and not to judge the person based on actions that seem jerky to me or anybody else. I'm fine with calling an action jerky, but I prefer to keep respect for the actual person until there is really no way to misjudge, as with your example.

So who makes decisions like this? Their management. Saying that the decisions are jerky but the company is not doesn't sound convincing in the least. If such decisions are characteristic of the company (i.e. they happen as a rule), the company can deservedly be called a jerk.


Last edited by Shmerl on 7 January 2019 at 10:36 pm UTC