So my NVIDIA GTX 1050 Mobile with the 550 drivers got a score of 1211 in glmark2, while my integrated Intel GPU got 3854.
My NVIDIA config file and my offload script are:
If you have any idea how to fix this, please tell me. Thanks in advance for any help.
- Are you sure it's specific to Wayland? It sounds like you switched distributions too? Can you test with X11 temporarily?
- Might glmark2 be using nouveau or possibly software rendering? (See the quick check after this list.)
- Any difference between OpenGL and Vulkan?
- What sort of behavior do you get in "normal", non-benchmark applications, like games?
...not sure how helpful this is, but at least you get a thread bump!
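On the nouveau / software-rendering point, a quick way to check (just a suggestion, assuming mesa-utils is installed) is to compare the OpenGL renderer string with and without the offload variables:
# Renderer used by default (likely the Intel iGPU, or llvmpipe if it's software rendering)
glxinfo -B | grep "OpenGL renderer"
# Renderer used with PRIME render offload pointed at the NVIDIA GPU
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo -B | grep "OpenGL renderer"
glmark2 also prints a GL_RENDERER line at the top of its output, so you can see directly which device each run actually used.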
- Are you sure it's specific to Wayland? It sounds like you switched distributions too? Can you test with X11 temporarily?
I did test X11 too, but the GPU score only went up by about 200 points, so it's basically nothing.
- Might glmark2 be using nouveau or possibly software rendering?
- Any difference between OpenGL and Vulkan?
- What sort of behavior do you get in "normal", non-benchmark applications, like games?
Well, I'm having the issue in everything I've tried it on, and the NVIDIA GPU always performs worse than the integrated graphics.
Still, thanks.
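(For the OpenGL vs. Vulkan question above, a similar quick check, assuming vulkan-tools is installed, is to see which physical devices Vulkan exposes with and without the NVIDIA Optimus layer:
# Devices Vulkan can see by default
vulkaninfo --summary | grep deviceName
# With the Optimus layer forcing NVIDIA only, just the GeForce should be listed
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only vulkaninfo --summary | grep deviceName
If your vulkaninfo is too old for --summary, plain vulkaninfo works too, it just prints a lot more output.)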
I think the issue lies somewhere else.
Looking over your configs, nothing stands out except your offloading script:
export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0
Not sure where you got that from, but it is not required for offloading to work. Try removing that environment variable and testing again.
For offloading to work, this is all you need:
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia
Note 1: __VK_LAYER_NV_optimus is only required for Vulkan-based applications.
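As a concrete example (just a sketch, using glmark2 and the vkcube demo from vulkan-tools), an offloaded run would look like:
# OpenGL benchmark offloaded to the NVIDIA GPU
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glmark2
# Vulkan application offloaded to the NVIDIA GPU
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only vkcube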
Note 2: You cannot mix Zink with offloading, as Zink requires __GLX_VENDOR_LIBRARY_NAME to be set to mesa, which will un-offload your application. Do you happen to have Zink enabled somewhere, like:
__GLX_VENDOR_LIBRARY_NAME=mesa MESA_LOADER_DRIVER_OVERRIDE=zink GALLIUM_DRIVER=zink
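A quick way to see whether Zink is sneaking in somewhere (again, just a suggestion):
# If Zink is active, the OpenGL renderer string will mention "zink"
glxinfo -B | grep -i renderer
# Check whether any Zink-related variables are set in your session
env | grep -iE 'zink|GALLIUM_DRIVER|MESA_LOADER_DRIVER_OVERRIDE|__GLX_VENDOR_LIBRARY_NAME'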
- Which desktop environment / window manager do you run?
- Do you use something other than systemd as your init system?
Since you run NixOS, I think the other configs might be of interest here too.
If you run nvidia-smi while benchmarking, which performance level does the driver report?
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.107.02             Driver Version: 550.107.02     CUDA Version: 12.4     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 3080        Off |   00000000:01:00.0  On |                  N/A |
|  0%   57C    P8             41W / 320W  |    1021MiB / 10240MiB  |      2%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
See the P* performance state beside the GPU temperature (P8 in this example, i.e. idle).
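If the full table is too noisy, you can also just poll the performance state while glmark2 is running, for example:
# Print P-state, GPU utilisation and SM clock once per second
nvidia-smi --query-gpu=pstate,utilization.gpu,clocks.sm --format=csv -l 1
If the card stays in P8 during the benchmark, the work is not actually landing on the NVIDIA GPU.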
Last but not least, as I am not a NixOS expert: does NixOS use glibc or musl?
Last edited by Vortex_Acherontic on 23 September 2024 at 10:20 pm UTC