NVIDIA are keeping up their great Vulkan support with yet another new build of their Vulkan beta driver for Linux.
Driver version 355.00.29 is now available:
Quote: March 2nd, Windows 356.45, Linux 355.00.29
Support Vulkan API version v1.0.4
Fix device lost issue with some MSAA resolves
Fix vkGetQueryPoolResults() when queryCount=0
Fix OpImageQuerySample with images
Fix OpVectorExtractDynamic issues with doubles
Fix handling of sparse image miptail when the whole image fits within a page
Improve vkAcquireNextImageKHR() conformance to WSI spec
Improve GL_KHR_vulkan_glsl compatibility when using GLSL directly
Improve GPU texturing performance in some cases
Improve vkAllocateDescriptorSets()/vkFreeDescriptorSets() performance in some cases
Improve vkCmdBindDescriptorSets() performance in some cases
Improve vkCmdCopyImage() performance in some cases
You can grab it here.
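A side note on the "Support Vulkan API version v1.0.4" line: Vulkan packs its API version into a single 32-bit integer (major in bits 31-22, minor in bits 21-12, patch in bits 11-0), which is what the `VK_MAKE_VERSION` macro in the spec does. A minimal sketch of that packing scheme:

```python
# Vulkan packs its API version into one 32-bit integer, per the
# VK_MAKE_VERSION macro: major in bits 31..22, minor in 21..12, patch in 11..0.
def vk_make_version(major, minor, patch):
    return (major << 22) | (minor << 12) | patch

def vk_version_string(version):
    return f"{version >> 22}.{(version >> 12) & 0x3FF}.{version & 0xFFF}"

# The "1.0.4" from the changelog corresponds to this packed value:
packed = vk_make_version(1, 0, 4)
print(packed, vk_version_string(packed))  # → 4194308 1.0.4
```

This is why drivers can report and compare API versions as plain integers.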
I hoped for wider GPU support, but again I was disappointed. We'll probably have to wait for the RC or the next branch; I expect they will soon support all cards capable of OpenGL 4.5.
With NVIDIA's Vulkan driver, does it treat every GPU you have as a resource? Do games benefit from multiple GPUs yet under Vulkan?
Just wondering, because SLI under Linux has always been a bit shaky at times (better than AMD, who don't support CrossFire or Vulkan atm).
Last edited by TheRiddick on 4 March 2016 at 8:40 pm UTC
I think SLI/CrossFire is down to the devs to implement. The drivers play a much smaller role with Vulkan/DX12, as the driver is a "thin" layer between the app and the hardware.
I may be totally wrong though....
The only thing I want is Vulkan moved to the main branch. The 35x.xx branches have issues with Steam: it crashes when adding a category for a game and on some other actions (sometimes just on checking notifications), and NS2 crashes randomly with them, while on 361.xx both the game and Steam are rock-solid. Moreover, 35x.xx relies on an older Xorg ABI, so I have to downgrade Xorg to 1.16. Too much hassle for testing it.
Quote: I have to downgrade Xorg to 1.16
1.16!? I don't think I'll try these beta drivers for a while.
No problems with Mint 17.3 KDE. Here are my Talos Principle benchmarks:
CPU: i7 6700 @ 4.3GHz
GPU: GTX 650 Ti Boost 2GB
OpenGL:
Duration: 59.6 seconds (4189 frames)
Average: 70.3 FPS (75.4 w/o extremes)
Extremes: 281.8 max, 6.4 min
Sections: AI=4%, physics=1%, sound=1%, scene=84%, shadows=9%, misc=2%
Highs: 85 in 0.6 seconds (134.4 FPS)
Lows: 339 in 9.0 seconds (37.5 FPS)
30-60 FPS: 22%
> 60 FPS: 78%
Vulkan:
59.9 seconds (4375 frames)
Average: 73.0 FPS (74.4 w/o extremes)
Extremes: 366.2 max, 4.5 min
Sections: AI=4%, physics=1%, sound=1%, scene=85%, shadows=6%, misc=3%
Highs: 70 in 0.4 seconds (169.1 FPS)
Lows: 78 in 2.7 seconds (28.9 FPS)
30-60 FPS: 8%
> 60 FPS: 91%
For the first time on my system, Vulkan is faster than OpenGL. It still mostly uses one CPU core, but this time at a few percent below 100%, while another core sits at about 10-15% instead of under 10% with the old driver.
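The averages in the post above can be cross-checked directly: average FPS is just total frames divided by run duration. A quick sanity check of those numbers:

```python
# Cross-check the benchmark averages: average FPS = frames / seconds.
# Frame counts and durations are taken from the post above.
runs = {
    "OpenGL": (4189, 59.6),
    "Vulkan": (4375, 59.9),
}
for name, (frames, seconds) in runs.items():
    print(f"{name}: {frames / seconds:.1f} FPS")  # → 70.3 and 73.0, matching the post

# Vulkan's edge on this run is roughly 4% on average frame rate.
print(f"speedup: {(4375 / 59.9) / (4189 / 59.6) - 1:.1%}")  # → 3.9%
```

The bigger difference is in the distribution: time spent below 60 FPS drops from 22% to 8%.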
Quoting: TheRiddick
With NVIDIA's Vulkan driver, does it treat every GPU you have as a resource? Do games benefit from multiple GPUs yet under Vulkan? Just wondering, because SLI under Linux has always been a bit shaky at times (better than AMD, who don't support CrossFire or Vulkan atm).
Vulkan supports multi-GPU setups -- it is up to the developer to use it in the engine. With Vulkan it is possible that, if you have an NVIDIA and an Intel GPU, one can be used to render the scene and the other for OpenCL computations (so physics etc.). However, both GPUs doing the rendering is not so viable atm, as the framebuffer needs to be copied from one card to the other (as far as I understand the specification, the framebuffer has to be copied via the CPU, so GPU framebuffer -> GPU -> PCI Express -> interrupt -> CPU cache -> CPU -> PCI Express -> GPU -> GPU framebuffer [kind of a long way :)]). With multi-monitor multi-GPU that would work perfectly (with a lot of work from developers :) ). The current implementation AFAIK is not able to take advantage of the SLI or CrossFire buses (probably there will be some vendor extensions).
Quoting: nocri
Quoting: TheRiddick
With NVIDIA's Vulkan driver, does it treat every GPU you have as a resource? Do games benefit from multiple GPUs yet under Vulkan? Just wondering, because SLI under Linux has always been a bit shaky at times (better than AMD, who don't support CrossFire or Vulkan atm).
Vulkan supports multi-GPU setups -- it is up to the developer to use it in the engine. With Vulkan it is possible that, if you have an NVIDIA and an Intel GPU, one can be used to render the scene and the other for OpenCL computations (so physics etc.). However, both GPUs doing the rendering is not so viable atm, as the framebuffer needs to be copied from one card to the other (as far as I understand the specification, the framebuffer has to be copied via the CPU, so GPU framebuffer -> GPU -> PCI Express -> interrupt -> CPU cache -> CPU -> PCI Express -> GPU -> GPU framebuffer [kind of a long way :)]). With multi-monitor multi-GPU that would work perfectly (with a lot of work from developers :) ). The current implementation AFAIK is not able to take advantage of the SLI or CrossFire buses (probably there will be some vendor extensions).
Shader compiling could be spread across all available GPUs and CPUs, though.
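That idea can be sketched in a few lines: independent shader sources are embarrassingly parallel, so a worker pool can compile them concurrently. This is an illustrative sketch only, not anything from the NVIDIA driver; `compile_shader` is a hypothetical stand-in for a real front-end such as glslangValidator.

```python
# Illustrative sketch: spread independent shader compile jobs over a
# pool of workers. compile_shader is a hypothetical stand-in for a real
# compiler front-end; here it just derives a deterministic "binary" blob.
from concurrent.futures import ThreadPoolExecutor
import hashlib

def compile_shader(source: str) -> bytes:
    # Stand-in for real compilation work.
    return hashlib.sha256(source.encode()).digest()

def compile_all(sources):
    # map() preserves input order, so blobs line up with their sources.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(compile_shader, sources))

binaries = compile_all([f"// shader {i}" for i in range(8)])
print(len(binaries))  # → 8, one blob per shader
```

A real driver would use native threads or processes rather than Python threads, but the scheduling idea is the same.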
Quoting: nocri
Vulkan supports multi-GPU setups -- it is up to the developer to use it in the engine. With Vulkan it is possible that, if you have an NVIDIA and an Intel GPU, one can be used to render the scene and the other for OpenCL computations (so physics etc.). However, both GPUs doing the rendering is not so viable atm, as the framebuffer needs to be copied from one card to the other (as far as I understand the specification, the framebuffer has to be copied via the CPU, so GPU framebuffer -> GPU -> PCI Express -> interrupt -> CPU cache -> CPU -> PCI Express -> GPU -> GPU framebuffer [kind of a long way :)]). With multi-monitor multi-GPU that would work perfectly (with a lot of work from developers :) ). The current implementation AFAIK is not able to take advantage of the SLI or CrossFire buses (probably there will be some vendor extensions).
I believe HSA addresses this limitation.
Quoting: srlsboy
Quote: I have to downgrade Xorg to 1.16
1.16!? I don't think I'll try these beta drivers for a while.
Actually, 1.17 may work as well. Debian has 1.18 in testing, which I use, and 1.16 in stable. It's safe to try for yourself: if the ABI isn't compatible, you'll see an appropriate message in the Xorg log and it won't start. There's also the option to add the -ignoreABI parameter (as suggested in that exact error message), and that works, but then Xorg crashes on game start.