Something I wrote about last year was that NVIDIA were working towards better support for NVIDIA Optimus on Linux. It seems another step is being made towards that end!
The proposal, sent to the xorg development mailing list by NVIDIA's Kyle Brenneman, goes over how they would expect it to work:
For GPU offloading in libglvnd, where individual clients can run with an alternate GPU and client-side vendor library, we'd need some way for that alternate vendor library to communicate with its server-side counterpart. Normally, the server's GLXVND layer would dispatch any GLX requests to whichever driver is running an X screen. This is a GLX extension that allows a client to tell the server to send GLX requests to a different driver instead.
The proposal also includes a draft of the extension spec. You can find the WIP (work in progress) merge request here on the xserver GitLab as well, where Brenneman is hoping for more feedback, as the proposal currently has no replies.
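To give a rough idea of what per-application offloading looks like from the user's side, here's a minimal sketch. It assumes the __NV_PRIME_RENDER_OFFLOAD and __GLX_VENDOR_LIBRARY_NAME environment variables NVIDIA has used for PRIME render offload; the exact interface that comes out of this proposal may differ.

```python
import os
import subprocess

def run_on_discrete_gpu(cmd):
    """Launch a single program on the discrete GPU while the desktop
    keeps running on the integrated one.

    Assumes the PRIME render offload environment variables; the final
    interface for this GLX extension may end up looking different.
    """
    env = os.environ.copy()
    env["__NV_PRIME_RENDER_OFFLOAD"] = "1"        # request the offload GPU
    env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"   # pick the NVIDIA client-side vendor library
    return subprocess.run(cmd, env=env)

if __name__ == "__main__":
    run_on_discrete_gpu(["glxgears"])
```

The point of the extension is that only the launched process talks to the NVIDIA driver; everything else keeps going through whichever driver owns the X screen.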
This has been a pain point for Linux laptop users for far too long, so perhaps there's finally a light at the end of the tunnel. It would certainly make choosing a laptop a whole lot easier in future if this all works out.
Thanks, Phoronix.
Quoting: cprn
Actually, since I do have a CPU built-in AGPU that's never used and a GTX 1070 that's currently my main GPU but could easily be used as a discrete one, is there any reason to use Optimus on PC? Can I somehow grab extra f/s out of it?
You could spare yourself and the environment some Watts. I doubt using both GPUs together would be worth any effort for speed.
I am one of those lucky bastards who got his T430 with HD4000 only swapped for a T430 with Optimus.
It's the biggest crap I've ever seen.
And with all the new systemd enhancements, the crap pile only rises... (On Ubuntu Bionic LTS, you can't turn off Optimus anymore once you've touched it, because PID 1 has an open file descriptor to the DRI device.)
It would have been acceptable if I could just switch between the HD4000 (with *better* support) and the NVIDIA GPU (to get the Mini DisplayPort working) on a per-reboot basis, but alas, that does not work.
And due to systemd (fixed in a newer release of systemd, so I have to wait 2 years or go for something like Arch), the laptop has early thermal issues. The unused GPU just adds 6W of heat to the CPU (shared heatsink) and can't be shut off.
</rant>
I would love to own an i7-8809G and have the freedom to select or turn off the GPU I don't need.
Or even go eGPU and then turn off the eGPU before undocking, which of course can only be an AMD one due to these kinds of problems.
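As a side note on the "PID 1 holds an open file descriptor to the DRI" complaint above: you can check for yourself what's pinning a GPU device. Here's a small sketch that walks /proc and lists processes with open file descriptors to /dev/dri nodes (run as root for a complete picture); it's illustrative only, not part of the proposal.

```python
import glob
import os

def dri_fd_holders(dri_prefix="/dev/dri/"):
    """Return {pid: set of /dev/dri paths} for processes holding DRI devices open."""
    holders = {}
    for fd_path in glob.glob("/proc/[0-9]*/fd/*"):
        try:
            target = os.readlink(fd_path)
        except OSError:
            continue  # process exited or permission denied
        if target.startswith(dri_prefix):
            pid = int(fd_path.split("/")[2])
            holders.setdefault(pid, set()).add(target)
    return holders

if __name__ == "__main__":
    for pid, devices in sorted(dri_fd_holders().items()):
        print(pid, ", ".join(sorted(devices)))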
Quoting: Eike
Quoting: cprn
Actually, since I do have a CPU built-in AGPU that's never used and a GTX 1070 that's currently my main GPU but could easily be used as a discrete one, is there any reason to use Optimus on PC? Can I somehow grab extra f/s out of it?
You could spare yourself and the environment some Watts. I doubt using both GPUs together would be worth any effort for speed.
Well, I already throttle the clocks to about 10% when I don't run anything GPU intensive to slow down the fans, etc. I don't think AGPU can beat it wattage-wise when it's downclocked like that. Might be wrong. Am I wrong? I just feel like that AGPU lies there to waste.
Quoting: cprn
Well, I already throttle the clocks to about 10% when I don't run anything GPU intensive to slow down the fans, etc. I don't think AGPU can beat it wattage-wise when it's downclocked like that. Might be wrong. Am I wrong? I just feel like that AGPU lies there to waste.
That's an interesting question and I'd be interested to know if you happen to check. I'd think AGPU would win due to better efficiency, but that's just a wild guess.
I know the feeling of wasted... transistors, if you will. I'd prefer the CPU without graphics circuits if it's economically reasonable.
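If anyone does want to check the wattage comparison discussed above, here's a rough sketch that reads the board power draw reported by nvidia-smi (assuming an NVIDIA card that exposes power reporting). The integrated GPU side would need a separate tool, such as RAPL counters or intel_gpu_top, so this only covers half of the comparison.

```python
import subprocess

def gpu_power_draw_watts():
    """Return the current board power draw (in watts) reported by the NVIDIA driver."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; take the first card.
    return float(out.strip().splitlines()[0])

if __name__ == "__main__":
    print(f"Discrete GPU power draw: {gpu_power_draw_watts():.1f} W")
```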