
NVIDIA have a little present for Linux fans today, with the release of the 435.17 beta driver.

This is a beta driver and it includes quite the highlight: the addition of PRIME render offload support for Vulkan and OpenGL. This is where you might have your Intel GPU running most normal applications, with an NVIDIA chip then powering your games. It's a setup usually found in notebooks, and it's been a source of annoyance for NVIDIA notebook owners for a long time, so it's really pleasing to see proper progress like this.

It comes with some caveats though, as it needs a very up-to-date X.Org Server with git commits not yet available in a normal release. However, if you're on Ubuntu 19.04 or 18.04, NVIDIA have provided a PPA. There's a little additional work needed for now too; you can read more about the PRIME render offload support here.
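
Once the driver and the patched X.Org Server are in place, actually running something on the NVIDIA GPU comes down to a couple of environment variables, going by NVIDIA's documentation for the feature (glxgears and vkcube below are just stand-in examples for any application):

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears -info
__NV_PRIME_RENDER_OFFLOAD=1 vkcube

For OpenGL the extra __GLX_VENDOR_LIBRARY_NAME variable picks NVIDIA's GLX implementation, while Vulkan applications get steered to the NVIDIA GPU through the driver's own Vulkan layer.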

As for the rest of this new driver, it has the usual assortment of bug fixes plus "experimental support for runtime D3 (RTD3) power management on Turing notebook GPUs". The full changelog can be found here.
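
Worth noting that the RTD3 support isn't switched on by default either: going by the driver README it's controlled through a kernel module option, along these lines (the file name here is just an example, and the README lists some extra platform requirements on top of this):

# /etc/modprobe.d/nvidia-runtimepm.conf (example file name)
options nvidia NVreg_DynamicPowerManagement=0x02

The value picks how aggressively the GPU is allowed to power down when idle, so check the README for the exact meanings before relying on it.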


MrKiasu Aug 14, 2019
Quoting: Dunc
I wish there was an equivalent to PRIME render offloading on desktops. I have a GPU here on my motherboard that's literally never been used.

(And yes, I know it's an architectural limitation and there isn't really any way of using it and a PCIe card at the same time. But it's annoying all the same.)

It can work. I don't think any of the codepaths actually care whether your machine is a desktop or laptop. In my case, I've got a new Turing GPU and I configured the BIOS to enable both the external and internal GPUs (every BIOS I've ever seen has supported this), and then the Ubuntu PRIME support kicked in and worked the same as you'd expect on a laptop. It would come up on the iGPU and I could switch it to full-screen offload on the NVIDIA GPU.

On top of that, it can handle the system being booted with the NVIDIA GPU connected to the display (the EFI is smart enough to know which GPU has a display connected and makes that one primary on each boot). In that case it disables all the PRIME stuff and loads the NVIDIA driver normally.

I did have to make some additional changes to the PRIME support scripts to turn on uncertified G-SYNC, which requires extra xorg.conf content. But at the end of all this, I could switch between NVIDIA-primary and Intel-primary by changing which GPU the display is plugged into, and when it's plugged into the Intel GPU, I can use PRIME to activate the NVIDIA one.

Interestingly, with this setup, trying to use Wayland on the NVIDIA GPU actually causes the desktop to be rendered on the Intel GPU and then output on the NVIDIA GPU. It's slow and unresponsive, but then it isn't supposed to work at all, so that was amusing.

I expect this per-app offload functionality to also work fine on a desktop; I'll try it in the near future.
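
If you want to check whether both GPUs are actually visible to the system before trying any of this, a quick sanity check is something like the following (the exact output will vary from machine to machine):

lspci | grep -E 'VGA|3D'   # should list both the integrated GPU and the NVIDIA GPU
xrandr --listproviders     # with offload set up, both display providers show up here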
Luke_Nukem Aug 14, 2019
This is the example /etc/X11/xorg.conf I used:

Section "ServerLayout"
  Identifier "layout"
  Screen 0 "iGPU"
  Option "AllowNVIDIAGPUScreens"
EndSection

Section "Device"
  Identifier "iGPU"
  Driver "modesetting"
  BusID "PCI:00:02:0"
EndSection

Section "Screen"
  Identifier "iGPU"
  Device "iGPU"
EndSection

# May or may not need this section
#Section "Device"
#  Identifier "nvidia"
#  Driver "nvidia"
#  BusID "PCI:01:00:0"
#EndSection


I'm not able to get the power-off for Turing working though. Not sure why yet.


Last edited by Luke_Nukem on 14 August 2019 at 1:28 am UTC
Leopard Aug 14, 2019
Quoting: sigz
Quoting: Leopard
Bumblebee is trash and not necessary.

Don't say that... Bumblebee helped a lot in the past when there were no other solutions.

No? Bumblebee didn't help with anything. Prime was a better solution, at least it was reliable.

While Bumblebee was not.
Ivancillo Aug 14, 2019
Quote
This is where you might have your Intel GPU running most normal applications, with an NVIDIA chip then powering your games.

Does this mean that it only works on laptops with an Intel CPU?

What about Ryzen ones?
flesk Aug 14, 2019
Quoting: Leopard
Quoting: sigz
Quoting: Leopard
Bumblebee is trash and not necessary.

Don't say that... Bumblebee helped a lot in the past when there were no other solutions.

No? Bumblebee didn't help with anything. Prime was a better solution, at least it was reliable.

While Bumblebee was not.

Prime didn't become available until 2016 though, while Bumblebee has been around since at least 2011. For half a decade it was the only (good) option for utilizing dual graphics on Linux, and that's worth something. It was always a hassle to set up and tended to break with driver upgrades, so I'm glad there are better, official options now for those of us still stuck with Optimus laptops.
Gnomerick Aug 14, 2019
PRIME output capability was added to the NVIDIA driver on 2013-04-09, in version 319.12. 2016 was the year of PRIME Sync, i.e. tear-free vsync'd display.

Testing the render offload feature, I can say it's working fine; Steam Proton games work without a hitch. It might be interesting to run some benchmarks comparing PRIME output with render offload.

Pre-Turing GPUs always having to stay powered is a downside, of course, but I can understand why the NVIDIA devs made that decision. A lot of crappy notebook hardware has been sold over the years with broken ACPI/BIOSes and flawed Intel PCIe controllers.
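
For Steam specifically, the usual per-game launch options approach should do the trick once the driver side is set up. This is just the general render offload recipe applied to Steam's %command% placeholder, nothing Proton-specific:

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%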
Nanobang Aug 14, 2019
I gave up futzing with all this a long time ago, shortly after Primus came along. I set my (I-will-never-buy-another) Optimus laptop to "Nvidia" and keep it plugged in. The downside is that it sounds and feels like an idling Harrier Jump Jet. The upside is that it will probably die sooner, and the sooner it dies, the sooner I can look into non-Optimus options. :D
Dunc Aug 14, 2019
Quoting: MrKiasu
Quoting: Dunc
I wish there was an equivalent to PRIME render offloading on desktops. I have a GPU here on my motherboard that's literally never been used.

(And yes, I know it's an architectural limitation and there isn't really any way of using it and a PCIe card at the same time. But it's annoying all the same.)

It can work. I don't think any of the codepaths actually care whether your machine is a desktop or laptop. In my case, I've got a new Turing GPU and I configured the BIOS to enable both the external and internal GPUs (every BIOS I've ever seen has supported this)...
I'll have to check, but I'm pretty certain mine doesn't. And I have to say, I don't think I've ever seen one that does; that was part of the “architectural limitation” I was referring to. EFI, yes, but not an actual BIOS. (Maybe I should have been clearer on that. Having never owned an EFI machine, I tend to forget it exists even though it's more or less universal now. :) )

Thanks for the info, though. I'll have to look into it further.


You know what? I did look into it further, and after poking around my BIOS a bit, I discovered that yes, it does support both GPUs being enabled. The way it's worded in the menus is what confused me, but you're absolutely right and I have to eat my words.

Saying that, it turns out to be a 7000 series, which isn't supported by the proprietary driver any more, and it probably isn't worth trying to get it to work anyway. But still, I stand corrected.


Last edited by Dunc on 15 August 2019 at 1:04 am UTC
edo Aug 14, 2019
Quoting: Leopard
Quoting: sigz
Quoting: Leopard
Bumblebee is trash and not necessary.

Don't say that... Bumblebee helped a lot in the past when there were no other solutions.

No? Bumblebee didn't help with anything. Prime was a better solution, at least it was reliable.

While Bumblebee was not.

Prime requires you to switch sessions, while Bumblebee works in the main session. There is a bit of performance overhead, but at least there is no need to switch sessions, which is very annoying for those who don't use the PC only for gaming. And when you need to get full performance, there is always nvidia-xrun.
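
For reference, on Ubuntu that session-level switching is typically handled by the prime-select tool from the nvidia-prime package, and it only takes effect after logging out or rebooting, which is exactly the annoyance (other distros have their own equivalents, and the exact options vary a bit between releases):

sudo prime-select nvidia   # use the NVIDIA GPU for the next session
sudo prime-select intel    # switch back to the integrated GPU
prime-select query         # show which GPU is currently selected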
edo Aug 14, 2019
I have been waiting for this for so many years.