I'm really excited to try Magicka 2, as I thought the first one was quite fun. The developer has been keeping everyone up to date on progress in their Steam forum. Sadly, for AMD GPU users it looks like it won't run as well as on Nvidia hardware.
A developer wrote this on their official Steam forum:
Quote: We've had discussions on how to support non-XInput controllers and we found that just using SDL's input subsystem would probably solve that. Since there are controller emulators already, we consider it a 'would be nice' feature.
I took the liberty of replacing the Nvidia GTX 660 in my Linux machine with our Radeon 270X and ran some tests. On Ubuntu I tested both the open source drivers and fglrx, and both worked fine; I think the open source drivers have slightly better performance. Switching drivers around kinda broke my setup though, so I installed Debian 8 and ran some tests there, where the only issue was decals getting a slight rectangular outline.
Overall, the biggest issue in my tests with AMD cards is that performance on Linux and Mac OS X feels roughly halved compared to a corresponding Nvidia card. I've added graphics settings to control how many decals/trees/foliage the game draws, which helps a bit, but it would have been better if this was not necessary.
It's fantastic to see them actually implement something to help with the issue though, and I'm sure many AMD GPU users will be pretty happy about that. It's not all doom and gloom either, since the developer mentioned it will even work on the open source driver, which is great for AMD users. They must be one of very few developers testing their port this thoroughly; it's quite refreshing to see.
I just want to get my hands on it already!
Quoting: alex
Language? Hello kitty script?
Yes, our main product is a virtual knickers-untwister written in Hello kitty script. Although personally I prefer to prevent having my knickers in a twist by going commando.
(In case you're not getting it, I'm not interested in a shit-flinging match with you. You don't seem interested in a civil, productive discussion so just take a deep breath, assume I'm a troll and/or an idiot and go about your business.)
Last edited by tuubi on 15 October 2015 at 11:05 pm UTC
0 Likes
That's funny
0 Likes
Cheezus, so the guy pulls out his Nvidia card to test and fix the build on AMD hardware and gets slammed by AMD supporters while triggering the usual AMD vs Nvidia shitstorm.
No good deed goes unpunished.
1 Like
This isn't about slamming anyone - neither the dev nor Nvidia.
2 Likes
This whole driver-patching business is the biggest nonsense ever invented (the same thing happens on the web: do you know how many sites don't deliver correct HTML? Way more than half!).
And I really "fear" this will be what hinders Vulkan from breaking through. Because if devs are too lazy to learn OpenGL why will Vulkan be any different?
Apart from bigger studios I fear it will not take off (or in the end we'll see the same situation and another layer that patches the whole incompetence of the devs)
1 Like
Quoting: alex
At least I have some real examples and not just unconfirmed bullshit. You are just rehashing things you have heard from others, but since you don't understand the topic you just spew out tons of bullshit. You might have some things more or less correct, but the way you explain it just marks you with this gigantic neon-lit sign: "incompetent".
The mirv example was well documented, and if it's true then yes, that's a good and specific example. But when you explain things in terms of "magic" and such, it's just completely obvious you don't know anything about software development.
Ah, that's why there are always specific driver releases for games where the changelog says:
"Increased performance in [GameX] by 30%"
Of course, one cannot possibly work around the quirks of a game, so these changelogs are just bullshit, right?
0 Likes
Quoting: dubigrasu
Cheezus, so the guy pulls out his Nvidia card to test and fix the build on AMD hardware and gets slammed by AMD supporters while triggering the usual AMD vs Nvidia shitstorm.
No good deed goes unpunished.
As said, it's bullshit to just develop on one card.
This is no small one-man show, this is a studio. I'm pretty sure they tested their DX implementation on AMD before launch - or did they not?
Hell, I wrote a 3D engine from scratch for a university course with a colleague, and even though both of us only had access to AMD cards we still tested on Nvidia, which meant taking a day off or juggling our free time to get to the university (during opening hours) and test on the Nvidia machine there (if it was free).
In the end the game held to the spec and ran on both AMD and Nvidia, although I must admit we had to fix some things, because even though we both used AMD, some smaller stuff didn't work on one card or the other (both AMD, mind).
Now if I was to develop a game in a studio I would absolutely make sure that this stuff runs on AMD and nvidia, even though AMD has less marketshare, because, guess what: Else a shitstorm is coming (and rightfully so).
If you argue that nvidia has the bigger marketshare... well, why are we even here then? Windows has the bigger marketshare (on the desktop that is weisenheimer!), so why even bother with Linux-porting at all?
1 Like
I develop on the AMD and Intel Mesa stack; if it works well there, it damn well better work great on the AMD and Nvidia proprietary drivers.
"If you argue that nvidia has the bigger marketshare... well, why are we even here then? Windows has the bigger marketshare (on the desktop that is weisenheimer!), so why even bother with Linux-porting at all?"
I may be wrong, but if I am, why is it that people like Liam didn't seem to know how to use ~ to indicate home? I bet the vast majority of users here spend as little time in a terminal as possible and probably don't use any of the benefits Linux actually has to offer. I think it's more a dislike of Windows than a love of *nix that brings most people here; they want to play games on their new-found platform, and I honestly can't blame them - they just want things to work. Sadly Nvidia offers the better experience, though of course much of this is developers coding explicitly for Nvidia's broken GL implementation and then blaming AMD when it doesn't work on their drivers, but whatever.
Last edited by Xzyl on 16 October 2015 at 11:46 am UTC
1 Like
Quoting: Guest
Quoting: Xzyl
I develop on AMD and Intel mesa stack, if it works well there it damn well better work great amd and nvidia proprietary.
Maybe. You might need to modify a few things, depending what you've done and how close you are to spec.
Just as an example, I had code that ran fine with Mesa, but not on anything else. It was entirely my own fault of course (things running in the wrong thread, oops), and fortunately nothing too difficult to change considering it was caught early, but it does highlight that it's not purely an nvidia blob issue - it's a sticking to one driver only for most of your development issue.
I probably should have said that so far it has worked without issue. Sadly, as awful as this is, I need to buy another Nvidia card as my Kepler test card died... I swear the computer gods are determined to make me hate Nvidia.
0 Likes
Quoting: Xzyl
I develop on AMD and Intel mesa stack, if it works well there it damn well better work great amd and nvidia proprietary.
You're probably thinking of my UT post with the dirty launcher. I am fully aware of what ~ is, but it didn't work correctly when I tried, so I didn't include it in the article. And yes, I don't like Windows, and I don't use what most people think are the advantages of being on Linux. I just want a free OS to play with and tinker on. But why is this argument in here? It's nothing to do with the article :p
As for marketshare stuff, developers go for Nvidia first on Windows just as much as they do on Linux.
Last edited by Liam Dawe on 17 October 2015 at 7:58 am UTC
1 Like
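As a footnote to the ~ tangent above: in POSIX shells, ~ is expanded by the shell itself, and only when it is unquoted, which is the usual reason it "doesn't work" when a launcher or script quotes its arguments. A minimal sketch (the save path below is hypothetical, purely for illustration):

```shell
# ~ is tilde expansion: the shell rewrites it to $HOME before the command runs
echo ~       # prints the current user's home directory, same as $HOME

# Quoting suppresses the expansion, so a quoted ~ is passed through literally;
# launchers and scripts that quote their arguments trip over this constantly
echo "~"     # prints a literal ~

# Inside scripts, $HOME is the unambiguous way to spell the same path
echo "$HOME/.local/share/magicka2"   # hypothetical save path, for illustration
```

So if a game launcher receives "~/saves" as a quoted string, it gets a literal tilde rather than a real path, which matches the "it didn't work correctly" experience described above.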