Wolves of Midgard
groundhog_day86 Sep 27, 2017
Started Vikings - Wolves of Midgard tonight & noticed it was running funny. First, Specs as follows...

Debian Stretch 9.1
Gnome DE
Nvidia Driver 375.66
Asus B75 mobo
i5 3570
8gb ddr3
gtx 960 reference

Now what I found odd, using V-Sync & 'Low' graphics settings...

VRAM usage around 50-60% with GPU utilization steady at 80%, peaking to 100% and causing frame rate issues.

I have come to assume nothing in the world of Linux gaming, but this seems odd to me. Shouldn't VRAM max out well before total utilization? I have seen many reports of frame rate issues on the Linux and Windows versions, but have yet to find any reasons or 'solutions'.

Curious what comments this forum might have on this game's performance and how it uses Nvidia GPUs.

Btw, 1st time poster, this site is great!
Liam Dawe Sep 27, 2017
Well, utilization and VRAM are separate, so hitting max utilization does make sense. It means it's doing the job. VRAM is used to store various things, so the game just might not have a lot of stuff to store in it. That's about as simple as I can make it, hope it makes sense.

Welcome! :)
groundhog_day86 Sep 27, 2017
I've just yet to see such a ratio between VRAM and utilization while playing a native Linux game. Tomb Raider & Shadow of Mordor do great with no frame rate issues. Just found it odd is all.
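For anyone wanting to watch the two numbers being discussed side by side, NVIDIA's nvidia-smi can report GPU utilization and VRAM usage as CSV. Below is a minimal sketch: the commented command is the real query interface, while the `sample` line holds made-up values standing in for one line of its output.

```shell
# Live monitoring (prints one CSV line per second on a real NVIDIA system):
#   nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total \
#              --format=csv,noheader,nounits -l 1
#
# Parse one such line into a readable summary. These sample values are
# hypothetical, not taken from the game in question.
sample="80, 1100, 2048"   # util %, VRAM used MiB, VRAM total MiB (made up)
util=$(echo "$sample" | cut -d',' -f1 | tr -d ' ')
used=$(echo "$sample" | cut -d',' -f2 | tr -d ' ')
total=$(echo "$sample" | cut -d',' -f3 | tr -d ' ')
echo "GPU utilization: ${util}% | VRAM: ${used}/${total} MiB ($((100 * used / total))%)"
```

With the sample numbers above this prints roughly the ratio described in the thread: high utilization with only about half the VRAM in use.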