NVIDIA have released their 381.22 driver, which comes with plenty of fixes, newer Vulkan support and more.
It now turns off OpenGL threaded optimizations by default, as they had enough reports showing that it was causing instability.
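If you still want threaded optimizations for a particular game, you can re-enable them per process with the `__GL_THREADED_OPTIMIZATIONS` environment variable rather than system-wide. A minimal Python sketch; the launch command is a placeholder:

```python
import os
import subprocess

def env_with_threaded_optimizations(base_env=None):
    """Return a copy of the environment with NVIDIA's OpenGL threaded
    optimizations explicitly re-enabled (off by default as of 381.22)."""
    env = dict(base_env if base_env is not None else os.environ)
    env["__GL_THREADED_OPTIMIZATIONS"] = "1"
    return env

def launch(command):
    """Launch a game or benchmark with the tweaked environment.
    `command` is a placeholder argv list, e.g. ["./mygame"]."""
    return subprocess.Popen(command, env=env_with_threaded_optimizations())
```

The original environment is copied rather than mutated, so the tweak only affects the one child process.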
It adds support for these Vulkan extensions:
- VK_EXT_acquire_xlib_display
- VK_EXT_display_control
- VK_EXT_display_surface_counter
- VK_EXT_direct_mode_display
- VK_KHX_external_memory
- VK_KHX_external_memory_fd
- VK_KHX_external_semaphore
- VK_KHX_external_semaphore_fd
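If you want to check which of these your own system actually reports, one quick-and-dirty approach is to scan a `vulkaninfo` text dump (the `vulkaninfo` utility ships with the Vulkan SDK/tools) for extension names. A rough Python sketch; the parsing is deliberately naive:

```python
import re

# Extensions added in the 381.22 driver, per the article above.
NEW_IN_381_22 = {
    "VK_EXT_acquire_xlib_display",
    "VK_EXT_display_control",
    "VK_EXT_display_surface_counter",
    "VK_EXT_direct_mode_display",
    "VK_KHX_external_memory",
    "VK_KHX_external_memory_fd",
    "VK_KHX_external_semaphore",
    "VK_KHX_external_semaphore_fd",
}

def extensions_in(vulkaninfo_output):
    """Pull every VK_* extension name out of a vulkaninfo text dump."""
    return set(re.findall(r"\bVK_[A-Za-z0-9_]+\b", vulkaninfo_output))

def missing_extensions(vulkaninfo_output):
    """Which of the driver's new extensions this system does not report."""
    return NEW_IN_381_22 - extensions_in(vulkaninfo_output)
```

Feed it the captured output, e.g. `missing_extensions(subprocess.run(["vulkaninfo"], capture_output=True, text=True).stdout)`.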
They also removed the NVIDIA logo splash screen which could show up when booting into a distribution. I'm glad it's gone, though I practically never saw it: the only time was on my TV PC, which weirdly showed it for a second before the login screen, across all the computers I've owned with NVIDIA hardware. It likely depends on how the driver is installed.
It also has "Improved compatibility with recent kernels", so you should have little to no trouble with this driver on the latest and greatest kernel.
Quoting: Guest
Quoting: liamdawe
Very true, I know that.
Quoting: Guest
What?
It should not be up to driver developers to optimize for each game.
[But they did it for specific games in Windows/DX12 anyway](http://uk.download.nvidia.com/Windows/382.05/382.05-win10-win8-win7-desktop-release-notes.pdf), so I was just having a rant.
Yes. In Windows, the Nvidia driver has a huge list of games and you can configure the driver settings for each individual game...
In Linux, configuring the driver settings for each game is not that easy.
Quoting: spiffyk
Quoting: Guest
But where are our ****ing performance optimizations for specific games?
You mean where are the workarounds introduced to make up for a game abusing the API? To hell with that.
Actually, that's quite state of the art: game developers and driver developers working around special use cases where the available API implementation just sucks.
With each driver release, even on Windows, NVIDIA introduces optimizations for certain games. That should not be necessary. It seems it is... at least currently.
Quoting: Comandante Ñoñardo
Yes. In Windows, the Nvidia driver has a huge list of games and you can configure the driver settings for each individual game...
In Linux, configuring the driver settings for each game is not that easy.

It's not? Maybe not quite as easy, but there are Application Profiles in the NVIDIA X Server Settings that you can set up based on multiple triggers, process name being one of them.
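For the curious, those Application Profiles are stored as JSON (typically `~/.nv/nvidia-application-profiles-rc` for per-user profiles). A hypothetical Python sketch that builds a per-process profile; the key names follow the "Application Profiles" chapter of the NVIDIA driver README, so verify them against your driver version, and the game name below is made up:

```python
import json

def make_profile(process_name, settings):
    """Build an NVIDIA application-profile document keyed on process name.
    `settings` maps profile setting names to values, e.g.
    {"GLThreadedOptimizations": True}."""
    profile_name = f"{process_name}-profile"
    return {
        # A rule matches a running process by name and points at a profile.
        "rules": [{
            "pattern": {"feature": "procname", "matches": process_name},
            "profile": profile_name,
        }],
        # The profile carries the actual driver settings to apply.
        "profiles": [{
            "name": profile_name,
            "settings": [{"key": k, "value": v} for k, v in settings.items()],
        }],
    }

# Example: force threaded optimizations on for one game only.
doc = make_profile("supertuxkart", {"GLThreadedOptimizations": True})
print(json.dumps(doc, indent=2))
```

Write the resulting JSON to the profiles file (or manage it through the NVIDIA X Server Settings GUI, which edits the same data).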
Any time I've seen the 'game optimizations' they are just settings that nVidia thinks are the optimum. I thought all the settings within Ghost Recon: Wildlands were the same, but I swear it performs worse after I clicked the 'optimize' button in the Geforce Experience application.
Maybe I should take my fingers out of my butt and install the new Nvidia drivers.
Quoting: slaapliedje
Any time I've seen the 'game optimizations' they are just settings that nVidia thinks are the optimum. I thought all the settings within Ghost Recon: Wildlands were the same, but I swear it performs worse after I clicked the 'optimize' button in the Geforce Experience application.
This isn't the same thing as what we're referring to, although it's easy to get that confused since "optimize" can cover such a broad range of things. What we're referring to are compiled-in, driver-level optimizations that the user can't change, and they do make games perform quite a bit better. The "optimize" button in the GeForce Experience app simply changes game settings that are available to the end user and has nothing really to do with the driver at all.
And as a side note: I also agree, the "optimize" button in the GeForce Experience app has almost always given me poor performance settings as well, which I would need to dial back a bit to make games run properly :)
So is it only me having such problems in ARK with NVIDIA drivers? I haven't tested the latest driver version, but previous versions had these glitches in ARK, at least for me. Caves are completely unplayable. Whenever there is some smoke or fog, I see images like the one below.
ARK has had issues for years, and they also occur with AMD drivers, so I don't think this is a problem you can expect driver developers to fix.
I do not think it is ARK's fault. I have not seen any video where someone had similar issues, but I see them a lot. And ARK is not the only game with such glitches. I have a similar issue in Master of Orion, where the main character has no face, and the developers said this is the driver's fault.
Last edited by DMG on 10 May 2017 at 8:26 am UTC
Quoting: spiffyk
Quoting: Guest
But where are our ****ing performance optimizations for specific games?
You mean where are the workarounds introduced to make up for a game abusing the API? To hell with that.
It's more complicated than that in reality :) Your example is one thing that can happen in certain cases; however, you can also easily get the same effect for a number of other reasons.
The most likely cause on Linux is a game using newly implemented features that are not yet fully optimised in the drivers. Every new game has the potential to use an API in a completely legal way and still hit an unexpected inefficiency in the drivers for certain specific hardware, as real-life usage rarely matches benchmark sample code. Life Is Strange on the NV10x0 series, for example, had a massive performance boost because the game uncovered a use case that wasn't covered for that hardware when the game launched.
That means the end user might get a driver update that looks like performance optimisations for specific games, when it's actually a general driver optimisation that happens to impact a popular game using that area of the drivers or API. Often the issue in the driver was even highlighted by that game in the first place.
The open source Mesa drivers are littered with examples of this from the past two years. You can pick almost any game shipped within the last couple of years, benchmark it with two-year-old drivers and again now, and you'll likely see huge boosts, none of which are workarounds for API abuse. As more games came to Linux and more Mesa developers had real-life examples, it allowed them to optimise the drivers further, as well as to implement the most-requested new features.
The second reason is linked to various API, hardware and game engine design choices. When designing and optimising a game you can have multiple ways of doing things that *should* all run fast, but sometimes, due to how the drivers work or how the underlying silicon of a specific card is designed, they won't run as fast as expected.
This means you can end up with two (or more) different ways of optimising the same call in a graphics driver depending on how it's used, and neither usage is wrong. If different games follow these different philosophies, then to get the best performance the driver might need to analyse how the API is being used and pick the optimal path for that use case. Given the large differences between GPU ranges, and the even larger differences between vendors, you can end up with these compromises. A "game ready" driver in most cases is just a driver that is pre-loaded with information on the best decisions to make if it hits one of these use cases. Usually the driver tries to be smart and do this at run time, but that is not always possible.
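As a toy illustration of that last idea (not real driver code; the application and path names are all made up): a "game ready" driver ships a table of known-good decisions per application and only falls back to a run-time heuristic when it has no hint:

```python
# Hypothetical pre-loaded per-application hints, shipped with the driver.
PRELOADED_HINTS = {
    "game_a": "batched_upload",
    "game_b": "streamed_upload",
}

def pick_path(app_name, observed_calls):
    """Choose an optimisation path: pre-loaded hint first, heuristic second.

    `observed_calls` is a list of call-pattern labels gathered at run time,
    e.g. ["small_update", "large_update", ...].
    """
    if app_name in PRELOADED_HINTS:
        # "Game ready" case: the decision was made ahead of time.
        return PRELOADED_HINTS[app_name]
    # Run-time heuristic: many small updates favour streaming,
    # fewer large updates favour batching.
    if observed_calls.count("small_update") > observed_calls.count("large_update"):
        return "streamed_upload"
    return "batched_upload"
```

Both paths are legal uses of the API; the table just saves the driver from having to guess at run time for applications it already knows about.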
That is not to say drivers haven't had some very specific game changes that should be avoided, but I thought a little more detail might be interesting to readers. The common "abusing the API" meme doesn't really tell the entire story: it misses the reasons why things like game-specific drivers can exist, and how the line between optimisation and game-specific fixes can easily get blurred when you have extremely complex systems and sometimes ambiguous API definitions.
Last edited by edddeduck_feral on 10 May 2017 at 8:38 am UTC