
I'm really excited to try Magicka 2, as I thought the first one was quite fun. The developer has been keeping everyone up to date on progress in their Steam forum. Sadly, for AMD GPU users it looks like it won't run as well as on Nvidia hardware.
A developer wrote this on their official Steam forum:
We've had discussions on how to support non-XInput controllers and we found that just using SDL's input subsystem would probably solve that. Since there are controller emulators already we consider it a 'would be nice' feature.
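As an aside, SDL's game controller API is exactly the sort of thing that makes supporting non-XInput pads straightforward. Here's a rough sketch of the idea, assuming SDL2 (this is not their code, just an illustration of the approach):

```c
/* Minimal sketch, assuming SDL2; not Magicka 2's actual code. Opens every
 * attached device that SDL recognises as a game controller, XInput or not. */
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    for (int i = 0; i < SDL_NumJoysticks(); ++i) {
        if (SDL_IsGameController(i)) {
            SDL_GameController *pad = SDL_GameControllerOpen(i);
            if (pad)
                printf("Opened controller %d: %s\n", i, SDL_GameControllerName(pad));
        }
    }

    /* A real game would then read SDL_CONTROLLERBUTTONDOWN and
     * SDL_CONTROLLERAXISMOTION events in its main loop and close each pad
     * with SDL_GameControllerClose() on shutdown. */
    SDL_Quit();
    return 0;
}
```

Anything SDL recognises as a controller comes through the same code path, which is presumably the appeal. The developer also shared some notes from testing on AMD hardware: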
I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests. On Ubuntu I tested both the open source drivers and fglrx and both worked fine. I think the open source drivers have slightly better performance. Switching drivers around kinda broke my setup though so I installed Debian 8 and did some tests there and only had issues with decals getting a slight rectangular outline.
Overall the biggest issue in my tests with AMD cards is that the performance on Linux & MacOSX feels like it's halved compared to a corresponding Nvidia card. I've added graphics settings to control how many decals/trees/foliage the game draws that helps a bit but it would've been better if this was not necessary.
It's fantastic to see them actually implement something to help with the issue though, and I'm sure many AMD GPU users will be pretty happy about that. It's not all doom and gloom, since that developer mentioned it will even work on the open source driver for AMD users, so that's great too. They must be one of the very few developers testing their port this thoroughly; it's quite refreshing to see.
I just want to get my hands on it already!
Can't wait for this one. The first game was absolutely mental in multiplayer. One of those games that you can't really "come back" to though. You learn these crazy 6-button combos for massive damage, healing, shields, or whatever, then you don't play it for a month, sit back down, and realise that you can't remember any of it!
I think this new version allows you to store macros to certain spells? That might solve that problem.
1 Like
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
5 Likes
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
No, it means he is doing thorough testing. You can only pick one card to test at a time, so they will obviously go with the more supported & used card.
Did you read what else he said? He's even been testing it on open source drivers too.
4 Likes
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
Possibly - I'm not sure how much you "code" for a specific manufacturer though. Also, the R9 270 is about a hundred dollars more expensive than the GTX card (http://www.hwcompare.com/15671/geforce-gtx-660-vs-radeon-r9-270x/). You certainly wouldn't expect "half" the performance.
Great to see such diverse testing though.
0 Likes
The good news is that Nvidia users (the biggest part of the market), going by the comments, get good performance; it just needs to be released.
As for AMD, nobody should expect anything different given the current driver situation.
However, a performance comparison between Catalyst and the open source driver might be more interesting.
^_^
Last edited by mrdeathjr on 15 Oct 2015 at 10:09 am UTC
0 Likes
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
And what did you expect? Develop on AMD and then see that performance on Nvidia, where most of your profit will come from, is subpar? I don't think so. Of course you develop on the card that will bring you the most profit, and then test on secondary cards that would be nice to have working but aren't mandatory.
4 Likes
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
I'm not a game developer here, so this is more of a question than a statement. But wouldn't coding for a particular card only be the case when you're adding functionality specific to that card (proprietary things like Nvidia's PhysX)? Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
0 Likes
Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
My understanding is that Nvidia hacks around a lot of common mistakes and bad practices in OpenGL code. Basically it'd be better for compatibility if games were primarily (but not exclusively) tested and optimized on an AMD gpu, as this would - at least in theory - result in cleaner code that would most likely run just fine on Nvidia's drivers as well. Although coding strictly to spec without great documentation and tooling (test suites etc.) is really hard and time-consuming.
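To make that concrete with a small, hypothetical example: a commonly cited case is GLSL that drifts slightly outside the spec, such as an implicit int-to-float conversion under an old #version, which a lenient compiler may accept while a stricter one rejects. The usual way to catch it is to check the compile status and info log on every driver you test. A minimal C sketch, assuming a current GL context and GL 2.0+ prototypes (on Linux/Mesa via GL_GLEXT_PROTOTYPES; in practice you'd use a loader like GLEW or glad):

```c
/* Illustrative sketch only: compile a shader and always dump the info log,
 * so GLSL that one driver tolerates but another rejects shows up early. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>
#include <stdio.h>

/* Made-up shader: 'float a = 1;' relies on an implicit int->float conversion
 * that #version 110 does not allow, yet some compilers accept it anyway. */
static const char *frag_src =
    "#version 110\n"
    "void main() { float a = 1; gl_FragColor = vec4(a); }\n";

GLuint compile_checked(GLenum type, const char *src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);

    char log[2048];
    GLsizei len = 0;
    glGetShaderInfoLog(shader, sizeof log, &len, log);
    if (ok != GL_TRUE || len > 0)
        fprintf(stderr, "shader compile %s:\n%.*s\n",
                ok == GL_TRUE ? "warnings" : "FAILED", (int)len, log);

    if (ok != GL_TRUE) {
        glDeleteShader(shader);
        return 0;
    }
    return shader;   /* e.g. compile_checked(GL_FRAGMENT_SHADER, frag_src) */
}
```

None of this says whose compiler is right in a given case; the point is just that code which has only ever seen one vendor's compiler can hide issues like this until someone runs it elsewhere.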
2 Likes
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
No, it means he is doing thorough testing. You can only pick one card to test at a time, so they will obviously go with the more supported & used card.
Did you read what else he said? He's even been testing it on open source drivers too.
Yes, I did. But doesn't it read like he did so *after* they ported the game?
So what's your point about my question?
2 Likes
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
And what did you expect? Develop on AMD and then see that performance on Nvidia, where most of your profit will come from, is subpar? I don't think so. Of course you develop on the card that will bring you the most profit, and then test on secondary cards that would be nice to have working but aren't mandatory.
This way of thinking got us this far...
Secondary cards? There's no such thing as secondary hardware, there's only miscellaneous hardware, and AMD is still a long way from getting kicked out of the market.
AMD support not being mandatory now? I honestly can't believe I'm reading this; it almost sounds like a fanboy reply.
With such a consumer-friendly comment coming from the keyboard of a Linux (minority) user, I'm starting to wonder how devs still release Linux games for us to play...
I agree with sub on this one: they do indeed seem to be developing specifically for Nvidia. Then they complain about AMD, and after a while they decide to drop AMD support in their games; that's always the plan. Let's applaud them (everyone) for it, shall we?
I'm not saying AMD (fglrx/radeonsi) can currently compete with Nvidia when it comes to which driver offers the most OpenGL performance (spec-breaking implementation or not), and to a great extent that's down to AMD's own support. But developing a Linux game without ever testing on AMD (if that was indeed the case) and expecting it to reach its full, already lesser, potential is infuriating to say the least.
They don't have to test on a single system throughout development, and in any case switching between Nvidia and AMD once in a while is not that hard either. It's their job, after all, right?
Last edited by kon14 on 15 Oct 2015 at 2:02 pm UTC
1 Like
I want to buy an AMD card next time I upgrade, but they have a LOT more work to do on their Linux drivers. Trying to shift the blame for poor performance onto ALL of the game devs is pretty stupid. NVIDIA makes a better driver, closer to parity with Windows. AMD have struggled with Linux since the ATI days, and continue to do so. I hope it will change soon.
2 Likes
Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
My understanding is that Nvidia hacks around a lot of common mistakes and bad practices in OpenGL code. Basically it'd be better for compatibility if games were primarily (but not exclusively) tested and optimized on an AMD gpu, as this would - at least in theory - result in cleaner code that would most likely run just fine on Nvidia's drivers as well. Although coding strictly to spec without great documentation and tooling (test suites etc.) is really hard and time-consuming.
You nailed it. If you're primarily developing with Nvidia's drivers and then intend to get your code working on other hardware as well, you're only making things more difficult for yourself. As with multiplatform development in general, you should not start by targeting a specific platform and then try to port your software - go multiplatform from the beginning!
You could argue that starting development on AMD (GPUs) is safest, because they stick to the OpenGL specifications. However, it's not as if their drivers work perfectly either (surprise!) ;) What worked best for me is developing on AMD and testing things very frequently on Nvidia and Intel during development. Developing on Nvidia and making it work on AMD and Intel afterwards was the worst.
1 Like
alexThunder: What worked best for me is developing on AMD and testing things very frequently on Nvidia and Intel during development. Developing on Nvidia and making it work on AMD and Intel afterwards was the worst.
Maybe this should be the way things are done, starting with Feral, so they achieve AMD support in their games. It sounds like a better idea to develop on AMD and just test on Nvidia; after all, AMD respects the OpenGL specifications.
0 Likes
This happened to me too when I had an AMD 5850.
From the developer:
"switching drivers around kinda broke my setup though so I installed Debian 8 "
He was switching from open source to closed source drivers. I don't want to remember how much time I spent troubleshooting and switching between drivers when I used AMD. I understand the support for open source, but what's the point of supporting it if it can't provide a usable gaming computer? I don't want to spend $500 on a video card for mediocre performance. I am a gamer at heart who has been Windows free for over a year now. I have made the switch full time and, while I do miss some games, I now have plenty of other games to play.
Plus I think AMD slapped all the open source fanatics with their Vulkan development. You will need to use their proprietary drivers if you want to use Vulkan. My 2 cents.
1 Like
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
No, it means he is doing thorough testing. You can only pick one card to test at a time, so they will obviously go with the more supported & used card.
Did you read what else he said? He's even been testing it on open source drivers too.
Yes, I did. But doesn't it read like he did so *after* they ported the game?
So what's your point about my question?
You're thinking about this the wrong way around. They are testing, and clearly rather thoroughly, going as far as the open source drivers and implementing extra features to help AMD users. Is that not a great thing?
A game being "ported" isn't some magical milestone; it means it compiles and runs on Linux, and that IS when the proper testing begins. They can only test one GPU at a time, so unless they can do both at once, there is always going to be one that comes first.
1 Like
I think people are overreacting here. Think about it: this dev tested on BOTH the AMD proprietary blob and the open driver. That alone says something about their testing methodology, and I wouldn't worry about AMD not being developed on directly. This is still the testing phase, and they can still make changes to the engine should they find some Nvidia-specific GL in the code. This testing should be seen as something very positive instead of being painted as some kind of hatred towards AMD. Remember, a lot of devs won't even look at AMD, and if they do, they will only look at the proprietary driver. This developer is already better than that.
1 Like
Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
My understanding is that Nvidia hacks around a lot of common mistakes and bad practices in OpenGL code. Basically it'd be better for compatibility if games were primarily (but not exclusively) tested and optimized on an AMD gpu, as this would - at least in theory - result in cleaner code that would most likely run just fine on Nvidia's drivers as well. Although coding strictly to spec without great documentation and tooling (test suites etc.) is really hard and time-consuming.
Your understanding is incorrect. Nvidia has always been the top quality company when it comes to OpenGL implementation.
https://sv.dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame
OpenGL is not some pizza dough you can just squish in any way you want and get things to work. That's not how software development works.
1 Like
Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
My understanding is that Nvidia hacks around a lot of common mistakes and bad practices in OpenGL code. Basically it'd be better for compatibility if games were primarily (but not exclusively) tested and optimized on an AMD gpu, as this would - at least in theory - result in cleaner code that would most likely run just fine on Nvidia's drivers as well. Although coding strictly to spec without great documentation and tooling (test suites etc.) is really hard and time-consuming.
Your understanding is incorrect. Nvidia has always been the top quality company when it comes to OpenGL implementation.
https://sv.dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame
OpenGL is not some pizza dough you can just squish in any way you want and get things to work. That's not how software development works.
His understanding seems quite right. OpenGL is a specification; its implementations are not strictly controlled by anyone other than the vendors themselves.
Nvidia does use some shady techniques to achieve greater performance "in exchange" for breaking OpenGL's specification.
Most devs are developing with Nvidia in mind, and thus Nvidia ends up "defining" the actual protocol in use.
In the end, Nvidia ends up screwing everybody over...
Here's a nice read: http://www.extremetech.com/gaming/182343-why-we-cant-have-nice-things-valve-programmer-discusses-wretched-state-of-opengl
Last edited by kon14 on 15 Oct 2015 at 7:11 pm UTC
1 Like
A developer: I took the liberty of replacing the Nvidia 660GTX in my Linux machine for our Radeon 270X and ran some tests.
Doesn't this statement, again, show that many developers develop on Nvidia first, then try with an AMD card and are disappointed - or even blame AMD for bad performance?
I'm not a game developer here, so this is more of a question than a statement. But wouldn't coding for a particular card only be the case when you're adding functionality specific to that card (proprietary things like Nvidia's PhysX)? Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
The OpenGL functions as called from the CPU side should do the same things. Most likely what is fast on Nvidia will be fast on AMD, but in theory, if you base decisions on findings from an Nvidia card, you might make things slow for AMD. I have a hard time imagining an example, though. Extensions?
When it comes to shaders, which are general programs that execute on the actual GPU, differences in architecture of course matter. But again, I have trouble imagining differences that huge.
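On the extensions point specifically, one concrete way an "Nvidia-first" habit can leak into a codebase is assuming an extension is available because the development machine happened to expose it. A small hypothetical sketch of guarding against that at runtime (assuming a current GL 3.0+ context; the extension name is only an example):

```c
/* Hypothetical sketch: ask the driver for an extension instead of assuming
 * the GPU you developed on is representative. Assumes a current GL 3.0+
 * context and GL prototypes (e.g. Mesa's GL_GLEXT_PROTOTYPES or a loader). */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

int has_gl_extension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

void pick_buffer_path(void)
{
    /* Only take the optional fast path when the driver actually offers it. */
    if (has_gl_extension("GL_ARB_buffer_storage")) {
        /* persistent-mapped buffer path */
    } else {
        /* portable fallback path */
    }
}
```

The same idea applies to shader features and limits: query what the driver reports and keep a fallback for the drivers that lack it.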
0 Likes
Other than that, shouldn't common OpenGL functions work the same on both cards? These are open standards that both drivers should meet...right?
My understanding is that Nvidia hacks around a lot of common mistakes and bad practices in OpenGL code. Basically it'd be better for compatibility if games were primarily (but not exclusively) tested and optimized on an AMD gpu, as this would - at least in theory - result in cleaner code that would most likely run just fine on Nvidia's drivers as well. Although coding strictly to spec without great documentation and tooling (test suites etc.) is really hard and time-consuming.
Your understanding is incorrect. Nvidia has always been the top quality company when it comes to OpenGL implementation.
https://sv.dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame
OpenGL is not some pizza dough you can just squish in any way you want and get things to work. That's not how software development works.
His understanding seems quite right. OpenGL is a specification; its implementations are not strictly controlled by anyone other than the vendors themselves.
Nvidia does use some shady techniques to achieve greater performance "in exchange" for breaking OpenGL's specification.
Most devs are developing with Nvidia in mind, and thus Nvidia ends up "defining" the actual protocol in use.
In the end, Nvidia ends up screwing everybody over...
Here's a nice read: http://www.extremetech.com/gaming/182343-why-we-cant-have-nice-things-valve-programmer-discusses-wretched-state-of-opengl
So all I found was this:
What most devs use because this vendor has the most capable GL devs in the industry and the best testing process. It's the "standard" driver, it's pretty fast, and when given the choice this vendor's driver devs choose sanity (to make things work) vs. absolute GL spec purity.
Yes, they might do minor tweaks, but they are still highly regarded for their OpenGL work. I've been unable to find any example of what they tweak, though. You also have to take into account that this guy is an OpenGL hater; I've read about him before and I don't see it the same way.
2 Likes