NVIDIA have a little present for Linux fans today, with the release of the 435.17 beta driver.
It's a beta, but it includes quite a highlight: the addition of PRIME render offload support for Vulkan and OpenGL. This is where your Intel GPU runs most normal applications, with an NVIDIA chip then powering your games. It's a setup usually found in notebooks, and it's been a source of annoyance for NVIDIA notebook owners for a long time, so it's really pleasing to see proper progress like this.
It comes with some caveats though, as it needs a very up-to-date X.Org Server with git commits not yet available in a normal release. However, if you're on Ubuntu 19.04 or 18.04, NVIDIA have provided a PPA. There's a little additional work needed for now too; you can read more about the PRIME render offload support here.
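Once the driver and X.Org bits are in place, offloading is opted into per-process with two environment variables. The variable names come from NVIDIA's 435.17 documentation; the `nv_offload` helper function is just an illustration of how you might wrap them:

```shell
# Run a single program on the NVIDIA GPU while the desktop stays on the
# integrated GPU. The two variables are the ones NVIDIA documents for
# PRIME render offload; the nv_offload helper name is my own.
nv_offload() {
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
}

# e.g. nv_offload glxgears, or use the variables directly in a Steam
# launch option. Sanity check that they reach the child process:
nv_offload env | grep __NV_PRIME_RENDER_OFFLOAD
```

For Vulkan applications, `__NV_PRIME_RENDER_OFFLOAD=1` alone is enough; the `__GLX_VENDOR_LIBRARY_NAME` variable is only needed to steer GLX applications to NVIDIA's libGL.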
For the rest of what's in this new driver, it has the usual assortment of bug fixes and "experimental support for runtime D3 (RTD3) power management on Turing notebook GPUs". The full changelog can be found here.
Quoting: Ivancillo
Quoting: qgnox
Quote: This is where you might have your Intel GPU running most normal applications, with an NVIDIA chip then powering your games.
Does this mean that it only works on laptops with an Intel CPU? What about Ryzen ones?
It works. I got it running on a Ryzen 3750H laptop with a Picasso GPU and an NVIDIA 1660 Ti; this is the /etc/X11/xorg.conf if you have one of those laptops:
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "amd"
    Inactive "nvidia"
    Option "AllowNVIDIAGPUScreens"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "1:0:0"
EndSection

Section "Device"
    Identifier "amd"
    Driver "modesetting"
    Option "TearFree" "true"
    Option "DRI" "3"
    BusID "5:0:0"
EndSection

Section "Screen"
    Identifier "amd"
    Device "amd"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection
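One gotcha with configs like the one above: xorg.conf's BusID values are decimal, while lspci prints the address in hex (e.g. `01:00.0`). A small helper to convert between the two — this is my own sketch, not part of any tool:

```shell
# Convert an lspci-style PCI address ("bus:device.function", hex) into
# the decimal "bus:device:function" form that xorg.conf's BusID expects.
to_busid() {
    bus=$(printf '%d' "0x${1%%:*}")   # hex bus -> decimal
    rest=${1#*:}
    dev=$(printf '%d' "0x${rest%%.*}") # hex device -> decimal
    fn=${rest#*.}
    printf '%s:%s:%s\n' "$bus" "$dev" "$fn"
}

to_busid "01:00.0"   # -> 1:0:0
to_busid "05:00.0"   # -> 5:0:0
```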
I just bought a new Ryzen 3700U + Vega 10 notebook, and I'm not looking back!
Thanks.
On the AMD side, are you using AMDGPU PRO or just AMDGPU?
Quoting: Ivancillo
amdgpu. You can also replace the modesetting driver in the xorg.conf with amdgpu to get less tearing in apps running on the AMD GPU, but it doesn't affect the NVIDIA offload.
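Ivancillo's tip amounts to swapping out the AMD Device section. An untested sketch of what that would look like, reusing the BusID from the config earlier in the thread:

```
Section "Device"
    Identifier "amd"
    Driver "amdgpu"
    Option "TearFree" "true"
    Option "DRI" "3"
    BusID "5:0:0"
EndSection
```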
Quoting: Nanobang
I gave up futzing with all this a long time ago, shortly after Primus came along. I set my (I-will-never-buy-another) Optimus laptop to "Nvidia" and keep it plugged in. The downside is that it sounds and feels like an idling Harrier Jump Jet. The upside is that it will probably die sooner, and the sooner it dies, the sooner I can look into non-Optimus options. :D
I'm in the same boat, although (at least in Fedora with the RPM Fusion drivers) you can disable the GPU at boot time in GRUB, so if I know I'm not going to game, I just use the Intel/Nouveau stack.
It would be really cool if updating the NVIDIA drivers created a new GRUB entry with NVIDIA disabled. I saw an open ticket (I don't remember in which project) asking for exactly this... but it didn't get much traction.
That said, I tried doing something myself, but I can't really seem to grok the grubby docs.
Laptop is an MSI GS65-Stealth RTX-2060. Also played a few games through Proton D9VK without issue too... I'm blown away...
Power use with power-management set up drops down to 7-10w for browsing etc. 7w just idling. 4-5w with screen off. Guesstimate 6-10 hours battery time depending on what I'm doing.
To get proper power management, I needed to run:
sudo tee /sys/bus/pci/devices/0000:01:00.0/power/control <<<auto
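The address `0000:01:00.0` above is specific to that machine. A generalized sketch (my own, not from NVIDIA's docs) that flips runtime power management to `auto` for every NVIDIA PCI function by matching the vendor ID instead of hard-coding the address:

```shell
# Set runtime PM to "auto" for all NVIDIA PCI functions under a sysfs
# root. 0x10de is NVIDIA's PCI vendor ID. Pass /sys/bus/pci/devices;
# taking the root as an argument also makes the function testable.
set_nvidia_runtime_pm() {
    root=$1
    for dev in "$root"/*; do
        [ -f "$dev/vendor" ] || continue
        if [ "$(cat "$dev/vendor")" = "0x10de" ]; then
            echo auto > "$dev/power/control"
        fi
    done
}

# Usage (as root): set_nvidia_runtime_pm /sys/bus/pci/devices
```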
Quoting: Leopard
Quoting: sigz
Quoting: Leopard
Bumblebee is trash and not necessary.
Don't say that... Bumblebee helped a lot in the past, when there were no other solutions.
No? Bumblebee didn't help with anything. PRIME was a better solution; at least it was reliable, while Bumblebee was not.
It seems you never knew the time when there was no PRIME. Bumblebee was a solution, not a perfect one, but it was here before PRIME. You can't say it's trash.
Last edited by sigz on 15 August 2019 at 12:27 pm UTC
Does anyone know if the VRAM is separate as well? If so, that'd be a major boon, especially for AI work. Right now around half of my VRAM can be eaten up just by basic multitasking. I'd love to be able to offload this to system memory and have my VRAM reserved for processes that actually need the performance.
Another interesting thing would be the possibility of discrete GPU driver updates without having to reload X.
As long as your onboard graphics are good enough for your basic desktop tasks, being able to pick and choose which applications use your discrete GPU seems like a major win for desktop users just as much as laptop users.
Quoting: Munk
It would be interesting to experiment on desktop with offloading the overhead of desktop rendering to the capable-enough onboard graphics, which otherwise just goes unused. Unless there's some large overhead to this, which I don't see why there would be, I would expect modest performance gains, especially when running multiple high-resolution displays where only one is used for gaming.
This is unworkable due to hardware differences. Laptops use a mux to direct graphics output and are fairly tightly integrated, whereas a desktop has these as very separate components with separate outputs and memory. Copying from the discrete card's VRAM to the iGPU's memory would be hideously slow.
But you could hook up two displays :shrug:
I have no hard numbers nor do I know how to get them, but it almost seems like the stream from the discrete card is being slowed in some way depending on the circumstances. Something just feels very off and it doesn't happen when I just boot my DE with nvidia-xrun.