I'm really excited to try Magicka 2, as I thought the first one was quite fun. The developer has been keeping everyone up to date on progress in their Steam forum. Sadly, for AMD GPU users it looks like it won't run as well as on Nvidia hardware.

A developer wrote this on their official Steam forum:
We've had discussions on how to support non-XInput controllers and we found that just using SDL's input subsystem would probably solve that. Since there are controller emulators already, we consider it a 'would be nice' feature.

I took the liberty of replacing the Nvidia GTX 660 in my Linux machine with our Radeon 270X and ran some tests. On Ubuntu I tested both the open source drivers and fglrx, and both worked fine; I think the open source drivers have slightly better performance. Switching drivers around kinda broke my setup though, so I installed Debian 8 and did some tests there, where the only issue I had was decals getting a slight rectangular outline.

Overall, the biggest issue in my tests with AMD cards is that the performance on Linux & Mac OS X feels like it's halved compared to a corresponding Nvidia card. I've added graphics settings to control how many decals/trees/foliage the game draws, which helps a bit, but it would've been better if this was not necessary.
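On the controller note at the start of that quote: the developer doesn't show any code, but as a rough illustration of what "just using SDL's input subsystem" usually means, here's a minimal SDL2 sketch (my own assumption of the approach, not Magicka 2's actual code):

```c
/* Minimal SDL2 game controller enumeration. SDL presents XInput and
 * non-XInput pads through the same SDL_GameController API, which is
 * why it solves the problem in one place. Illustrative only. */
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Any device with an entry in SDL's controller mapping database
     * shows up here, regardless of the underlying driver protocol. */
    for (int i = 0; i < SDL_NumJoysticks(); i++) {
        if (SDL_IsGameController(i)) {
            SDL_GameController *pad = SDL_GameControllerOpen(i);
            if (pad != NULL)
                printf("Opened controller: %s\n", SDL_GameControllerName(pad));
        }
    }

    SDL_Quit();
    return 0;
}
```

The appeal for a port like this is that the game keeps one input path instead of maintaining XInput plus a per-vendor fallback.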


It's fantastic to see them actually implement something to help with the issue though, and I'm sure many AMD GPU users will be pretty happy about that. It's not all doom and gloom either, since the developer mentioned it will even work on the open source driver for AMD users, so that's great too. They must be one of very few developers testing their port this thoroughly; it's quite refreshing to see.

I just want to get my hands on it already!
52 comments
Page: «4/6»

tuubi Oct 15, 2015
Quoting: alex
Why? Well because it's impossible. That's why. Just like all this pseudo-programming bullshit
Obviously it is impossible, as proven beyond any doubt by your most enlightening anecdote. If they don't detect every programming mistake or lost optimization opportunity imaginable, surely it is impossible to hack around anything at all. The rest of us are simply talking out of our asses. Glad that's settled then.
Guest Oct 15, 2015
At least I have some real examples and not just unconfirmed bullshit. You are just rehashing things you have heard from others, but since you don't understand the topic, you just spew out tons of bullshit. You might have some things more or less correct, but the way you explain it just marks you with this gigantic neon-lit sign: "incompetent".

The mirv example was well documented, and if it's true then yes, that's a good and specific example. But when you explain in terms of "magic" and such, it's just completely obvious you don't know anything about software development.
alexThunder Oct 15, 2015
What "real" examples (you have) are you referring to?
tuubi Oct 15, 2015
Quoting: alex
The mirv example was well documented, and if it's true then yes, that's a good and specific example. But when you explain in terms of "magic" and such, it's just completely obvious you don't know anything about software development.
Damn. Busted. I wonder how our software design business lasted for ten years before anyone found out I have no idea what I'm doing. Please don't tell our clients. :'(
alexThunder Oct 15, 2015
Quoting: tuubi
Quoting: alex
The mirv example was well documented, and if it's true then yes, that's a good and specific example. But when you explain in terms of "magic" and such, it's just completely obvious you don't know anything about software development.
Damn. Busted. I wonder how our software design business lasted for ten years before anyone found out I have no idea what I'm doing. Please don't tell our clients. :'(

So you're working for Microsoft? ( ͡° ͜ʖ ͡°)
Guest Oct 15, 2015
The restrict pointer example is a real-world example of how limited optimization is. We are basically discussing compiler optimization passes now, and that's not something I'm going to do on a random forum filled with people who never wrote a single line of OpenGL yet are experts on the Nvidia driver.
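The restrict example itself was posted on an earlier page of this thread, so it isn't visible above. Purely as an illustration (the functions are hypothetical, not the commenter's actual snippet), this is the textbook C99 aliasing case the restrict qualifier exists for:

```c
#include <stddef.h>

/* Without restrict, the compiler must assume dst and src could overlap,
 * meaning a store through dst might change what src points at. It has
 * to emit conservative code, or guard any vectorized version with
 * runtime overlap checks. */
void scale_plain(float *dst, const float *src, float k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}

/* With restrict, the programmer promises the two arrays don't overlap.
 * That fact is exactly what the optimizer cannot infer on its own; it
 * has to be marked by hand, which is the point being argued here. */
void scale_restrict(float *restrict dst, const float *restrict src,
                    float k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}
```

Compiling both at -O3 and comparing the output typically shows the difference: the unmarked version carries extra aliasing guards or stays scalar.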

If the driver were this magical optimizer that fixes bad behavior automagically, then why are people discussing "approaching zero driver overhead" so much now?

http://gdcvault.com/play/1020791/

See this talk; you will see that in order to fix bad behavior you simply don't use Nvidia - you rewrite the code! (Which, of course, would be blatantly clear if you knew anything about the limitations a compiler faces.)
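That link is the GDC 2014 "Approaching Zero Driver Overhead" talk named above. One of the rewrite-the-code techniques it pushes is persistently mapped buffers in place of per-frame uploads; a rough sketch of the idea, assuming GL 4.4 / ARB_buffer_storage and an extension loader (this is not Magicka 2's code, just an outline of the technique):

```c
#include <GL/glcorearb.h>  /* assumes a loader has resolved these entry points */

static GLuint vbo;
static float *mapped;  /* mapping stays valid for the buffer's lifetime */

/* Create immutable storage and map it exactly once. Per-frame code then
 * writes vertex data straight through `mapped` rather than calling
 * glBufferSubData each frame, cutting out driver validation work. */
void create_persistent_buffer(GLsizeiptr size)
{
    const GLbitfield flags = GL_MAP_WRITE_BIT
                           | GL_MAP_PERSISTENT_BIT
                           | GL_MAP_COHERENT_BIT;

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferStorage(GL_ARRAY_BUFFER, size, NULL, flags);
    mapped = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);
}

/* A real renderer also needs fences (glFenceSync / glClientWaitSync) so
 * the CPU never overwrites data the GPU is still reading; omitted here. */
```

The gains come from restructuring how the application talks to the API, which is the commenter's point: it's not something a driver can retrofit onto the old call pattern.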
Guest Oct 15, 2015
Quoting: tuubi
Quoting: alex
The mirv example was well documented, and if it's true then yes, that's a good and specific example. But when you explain in terms of "magic" and such, it's just completely obvious you don't know anything about software development.
Damn. Busted. I wonder how our software design business lasted for ten years before anyone found out I have no idea what I'm doing. Please don't tell our clients. :'(

So you write OpenGL there? No? Right.

Language? Hello Kitty script?
alexThunder Oct 15, 2015
But that wasn't the example "you have", was it? And why does it show how limited optimization is? We're talking about things ranging from simple type conversions to replacing whole shaders. You just provided a for-loop, which doesn't tell us anything.

In general, you're not doing much more than telling us how stupid we all are. Other than insults, you haven't contributed much so far.

Quoting: alex
See this talk; you will see that in order to fix bad behavior you simply don't use Nvidia - you rewrite the code! (Which, of course, would be blatantly clear if you knew anything about the limitations a compiler faces.)

How is this different from what I suggested in the first place?


Last edited by alexThunder on 15 October 2015 at 10:28 pm UTC
Guest Oct 15, 2015
Whatever man, this whole thing is retarded. Impossible to get through when you don't know shit.

The loop was an extremely simple counter-argument showing that, given very simple code, the compiler cannot magically optimize things; you need to manually mark things. It's a counter-argument to this whole "the Nvidia compiler automagically fixes everything", "you can write shitty code and it gets optimized, boom bang yeah!" attitude. Yet back in reality land, the compiler cannot even figure out that the for loop could run much faster if marked restrict.
alexThunder Oct 15, 2015
And how does this example show that compilers don't do any optimizations elsewhere?

(Yes, it's a trap)


Last edited by alexThunder on 15 October 2015 at 10:42 pm UTC