
The developer of Smith and Winston made an interesting blog post about supporting multiple platforms

Last updated: 3 Feb 2020 at 2:40 pm UTC

I recently talked about the Steam release of Smith and Winston, but I didn't realise until late last night that the developer actually made a very interesting blog post about supporting multiple platforms.

Interesting enough that it warranted an extra post to talk a little about it. Why? Well, a bit of a situation started when game porter Ethan Lee made a certain Twitter post, a joke aimed at developers who see Linux as "Too niche" while practically falling over themselves to get their games on every other new platform that appears. This Twitter post was shared around (a lot), some developers (like this) ended up mentioning how Linux doesn't sell a lot of games, and it continued spreading like wildfire.

There have been a lot of counter-arguments for Linux too, like this and this and this and this, plus a nice one thrown our way too. Oh, and we even spoke to Tommy Refenes, who said the next SMB should come to Linux too, so that was awesome. Additionally, Ethan Lee also wrote up a post about packaging Linux games, worth a read if you're new to packaging for Linux.

Where was I again? Right, the blog post from the developer of Smith and Winston about how they support Windows, Mac and Linux. They go over details about how they do so, from using SDL2 which they say "takes 90% of the pain away from platform support" to the cross-platform rendering library bgfx. It's just a really interesting insight into how developing across multiple platforms doesn't have to be overly difficult.

I especially liked these parts:

I’ve been writing games and engines for 30+ years, so none of this is new; I have a lot of experience. But you only get the experience by doing it and not making excuses.

By forcing the game through different compilers (Visual C++, Clang and GCC) you find different code bugs (leave the warnings on max). By forcing the runtime to use different memory allocators, threading libraries and rendering technologies you find different runtime bugs. This makes your code way more stable on every platform. Even if you never deploy your code to a non-Windows platform, just running it on Linux or macOS will expose crashes instantly that rarely appear on Windows. So delivering a quality product on the dominant platform is easier if you support the minor platforms as well.

They also clearly mention that they might not even make their money back on the Linux port of Smith and Winston. However, they're clear that the other reasons (code quality, easier porting to other platforms and so on) help make up for it. This is a similar point the people from Stardock also made on Reddit.

See the post here in full. If you wish to check out their game, Smith and Winston, it's available on itch.io and Steam in Early Access.

Tags: Misc | Apps: Smith and Winston
About the author
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly checked on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. You can also follow my personal adventures on Bluesky.
15 comments

Beamboom 10 Jan 2019
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?
TheSHEEEP 10 Jan 2019
  • Supporter Plus
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?
It doesn't work that easily in programming, especially in C/C++, which is what pretty much all gaming is based on.
The same code simply might behave ever so slightly differently under different compilers.
Not major differences, mind you - that would be absurd. But small ones that can actually help uncover otherwise hard-to-find bugs.

But he also mentions different libraries, like different threading libs per platform. That might be a more obvious example, where one library functions in a way that makes a possible deadlock in your code occur more often on that platform, so finding that bug earlier helps tremendously.
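
A minimal sketch of that kind of lock-order bug (made-up names, nobody's actual code):

#include <mutex>
#include <thread>

std::mutex render_lock;
std::mutex audio_lock;

void thread_a() {
    std::lock_guard<std::mutex> r(render_lock);  // A takes render first...
    std::lock_guard<std::mutex> a(audio_lock);   // ...then audio
}

void thread_b() {
    std::lock_guard<std::mutex> a(audio_lock);   // B takes audio first...
    std::lock_guard<std::mutex> r(render_lock);  // ...then render
}

int main() {
    // If the scheduler runs each thread's first lock before either takes
    // its second, both block forever. One platform's scheduler/threading
    // library may hit that interleaving almost never, another constantly.
    std::thread a(thread_a), b(thread_b);
    a.join();
    b.join();
}

(The usual fix is to always take the locks in one global order, or to take both at once with std::scoped_lock.)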


Last edited by TheSHEEEP on 10 Jan 2019 at 11:25 am UTC
Marc Di Luzio 10 Jan 2019
  • Game Dev
  • Supporter Plus
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?

In general, this is a bunch of things:

1. Different compilers have different sets of warnings; one may silently do something unexpected, another may say "Uh, look here mate, a bad" (MSVC is known for being very lenient, Clang is very strict, so Clang may find the source of bugs for you).
2. "Undefined Behaviour" is common in C/C++ and compilers may handle it differently. One compiler may cause an immediate crash (easy to debug) but another may overwrite a random bit of RAM somewhere (very hard to debug). There's a sketch of this below.
3. Two compilers may share the same compiler bugs or odd behaviours (in the old days the PS3 used GCC), so compiling for one platform may be doing a chunk of the work of resolving issues you would have found on another platform in the future (a Linux GCC bug may also have been found on PS3 GCC).

There's more than just those, but hopefully that's a taste.
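
To illustrate point 2, a tiny made-up example:

#include <cstdio>

// Undefined behaviour: 'total' is never initialized. One compiler/platform
// may happen to hand you a zeroed stack slot so it "works", another gives
// you garbage, and an optimizer is free to mangle the loop entirely.
// Whether you even get told also differs: GCC has -Wmaybe-uninitialized,
// Clang has -Wsometimes-uninitialized, MSVC has C4700.
int sum_to(int n) {
    int total;                   // bug: should be 'int total = 0;'
    for (int i = 0; i < n; ++i)
        total += i;
    return total;
}

int main() {
    std::printf("%d\n", sum_to(10));  // may print 45, may print anything
    return 0;
}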
BielFPs 10 Jan 2019
Hey Liam,

Have you thought about consulting some Linux developers to make a kind of guide of "Best practices for making a multi-platform game"?

Like pointing out some tips about which engines, libraries and sound libraries they should use, and which they should avoid...

I know that most developer companies don't support Linux because of business decisions, but there are a lot of devs who are a little "clueless" about how Linux works. This could also help some users give devs feedback about using resources that make porting / emulation easier for us.

Anyway, it's just a suggestion :)
metaldazza 10 Jan 2019
  • Game Dev
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?

This is a good question. At a very basic level, different compilers warn about different things: although the C/C++ standard defines very clearly what is an error, warnings are left up to each compiler. Often this isn't that important and isn't going to save your bacon, but a good developer listens to their compiler. There is a reason the compiler isn't happy, it can reveal false assumptions, and you'll find yourself saying "I have no idea how that ever ran".

Different compilers do have different standard libraries though, and they can be more or less forgiving. The Visual C++ STL (Standard Template Library) has extensive debugging output in debug builds that catches errors quickly and precisely, at the expense of debug builds being very slow. This also has the effect of using more memory, and changed memory usage can also hide or expose different kinds of bugs. So macOS and Linux not having these is "good" as in different, and in this case difference is good, but not better or worse (I am not saying one compiler is better than another, just different).

Another big difference between standard libraries is the way they allocate memory within the application heap. The OS gives your application a chunk of memory that only it can use, called a heap, and the application allocates and frees blocks within that heap. Each standard library has different algorithms for how allocate and free work, and you can even replace them with your own if you are brave/foolish/clever/stupid/genius. So when you allocate a piece of memory, use it, free it and then illegally use this piece of freed memory, you get different behaviour. Anecdotally, on Windows you get away with this a lot more often than on UNIX, where you will more often (but not always) crash almost instantly, making it easier to track down the problem. On embedded platforms (consoles for example) where memory is tighter you also get different behaviour, as the OS vendor will tweak the memory allocator to be more aggressive about recycling memory than on a desktop where "memory is limitless".

Hope that helps and vaguely makes sense?
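
A tiny sketch of that use-after-free point (illustrative only, not from any real codebase):

#include <cstdio>
#include <cstdlib>

int main() {
    int* score = static_cast<int*>(std::malloc(sizeof(int)));
    *score = 1234;
    std::free(score);

    // Use-after-free: undefined behaviour. Windows' debug CRT fills freed
    // blocks with 0xDD so this prints an obviously bogus value or traps;
    // another allocator may leave the old bytes alone or scribble its own
    // bookkeeping over them, so it "works" until the block gets recycled
    // and something else quietly corrupts.
    std::printf("%d\n", *score);
    return 0;
}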
Beamboom 10 Jan 2019
different threading libs per platform. That might be a more obvious example, where one library functions in a way that makes a possible deadlock in your code occur more often on that platform, so finding that bug earlier helps tremendously.

Absolutely more obvious, but like I say, how can a library-specific bug really help on other platforms, where other libraries without that particular issue are used for the same function?

Admittedly I've only got experience with higher-level languages, and will totally accept that as the reason for my lack of comprehension. :)
Tchey 10 Jan 2019
  • Supporter
Nice blog article, @metaldazza - thanks for writing it and sharing your experience and thoughts.

I didn't regret buying S&W, but now I regret it even less. So, I sub-regret it?
metaldazza 10 Jan 2019
  • Game Dev
I didn't regret buying S&W, but now I regret it even less. So, I sub-regret it?

If we ever do a boxed version, this quote is going on it :D
Beamboom 10 Jan 2019
Hope that helps and vaguely makes sense?

Absolutely, thanks!
wintermute 10 Jan 2019
  • Supporter
but like I say, how can a library-specific bug really help on other platforms, where other libraries without that particular issue are used for the same function?

The library-specific bug could expose a problem in your algorithm that you've mostly been able to ignore up until now because the bug only occurs in 0.01% of cases with the other library, so you've never been able to replicate it.
cprn 10 Jan 2019
TL;DR: Using only one environment gives you an idea of what works in that specific environment. Only by fully understanding another environment can you formulate an opinion about how good whatever you've used until now really is, and which of your practices are good in general as opposed to just good in isolation.

I worked on a 2D Python pet project with a weird glitch that every now and then made all animations jump a pixel for 2-3 frames. IMHO it looked good and gave the game a unique feeling, but the original developer wasn't happy about how it occasionally f*cked up his precious pixel-perfect collisions. I ported the rendering bits of what he used to PySDL2 for an entirely unrelated reason - to see how it'd impact performance - but surprisingly that not only got rid of the glitch but also gave a bunch of nice warnings revealing his XY math returning floats (and as we all know, or at least should know, floats that are supposed to represent integers aren't always equal to those integers).

I'm not going to dis the formerly used rendering code, which is part of a pretty okay library, but if I had never ported it to something else we wouldn't know what caused the glitch, because the formerly used library just floored all pixel coordinates. The argument my colleague gave for not going with SDL2 from the very beginning was the one of limited resources in a one-man-army studio, mostly time, and a reluctance to waste them on learning "new stuff" when he could just re-use those little bits he wrote himself for some other game, thus having something he knew by heart and could debug easily, etc.
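
That float detail is worth a concrete illustration (a minimal sketch in C++, not the actual project code):

#include <cstdio>

int main() {
    // 0.1 has no exact binary representation, so accumulating it 100
    // times does not land exactly on 10: a float that is "supposed" to
    // be an integer, but isn't equal to it.
    float x = 0.0f;
    for (int frame = 0; frame < 100; ++frame)
        x += 0.1f;

    std::printf("x = %.7f, (x == 10.0f) is %s\n",
                x, (x == 10.0f) ? "true" : "false");  // prints false
    return 0;
}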
F.Ultra 10 Jan 2019
  • Supporter
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?

This is a good question. At a very basic level, different compilers warn about different things: although the C/C++ standard defines very clearly what is an error, warnings are left up to each compiler. Often this isn't that important and isn't going to save your bacon, but a good developer listens to their compiler. There is a reason the compiler isn't happy, it can reveal false assumptions, and you'll find yourself saying "I have no idea how that ever ran".

Different compilers do have different standard libraries though, and they can be more or less forgiving. The Visual C++ STL (Standard Template Library) has extensive debugging output in debug builds that catches errors quickly and precisely, at the expense of debug builds being very slow. This also has the effect of using more memory, and changed memory usage can also hide or expose different kinds of bugs. So macOS and Linux not having these is "good" as in different, and in this case difference is good, but not better or worse (I am not saying one compiler is better than another, just different).

Another big difference between standard libraries is the way they allocate memory within the application heap. The OS gives your application a chunk of memory that only it can use, called a heap, and the application allocates and frees blocks within that heap. Each standard library has different algorithms for how allocate and free work, and you can even replace them with your own if you are brave/foolish/clever/stupid/genius. So when you allocate a piece of memory, use it, free it and then illegally use this piece of freed memory, you get different behaviour. Anecdotally, on Windows you get away with this a lot more often than on UNIX, where you will more often (but not always) crash almost instantly, making it easier to track down the problem. On embedded platforms (consoles for example) where memory is tighter you also get different behaviour, as the OS vendor will tweak the memory allocator to be more aggressive about recycling memory than on a desktop where "memory is limitless".

Hope that helps and vaguely makes sense?

Matches my experience from 30+ years of coding 100%. Especially fun with the "change of memory regions" is when the code crashes reliably, but when you add some simple printf()s to write out some values just before the crash happens, it stops crashing :)

And I'd say that the single most important thing that happened to me when I switched from Windows to Linux back in the day was getting access to Valgrind. That is one very very good tool!
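
For anyone who hasn't tried it: Memcheck, Valgrind's default tool, needs no changes to your code at all - just build with debug symbols. A deliberately leaky sketch:

// Build and run under Valgrind, e.g.:
//   g++ -g -O0 leak.cpp -o leak
//   valgrind --leak-check=full ./leak
#include <cstdlib>

int main() {
    void* buf = std::malloc(64);  // never freed: Memcheck reports
    (void)buf;                    // "64 bytes ... definitely lost" with a stack trace
    return 0;
}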
aluminumgriffin 10 Jan 2019
I don't understand how different compilers can expose different bugs in the same(?) code. I mean, a bug is a bug, isn't it? Or is it because the use of different libraries exposes bugs caused by those particular libraries/APIs? If so, how will the code run more smoothly on a different set of libraries if the bug is related to that other library?

I don't get this?

Each dialect, compiler and platform behaves somewhat differently - this is pretty much why each coder has their favourite compiler.

Three things that tend to bite people hard if they don't regularly jump platforms are memory alignment, the size of datatypes, and bit-order/byte-order.

Here's a bit more in-depth explanation of two of the issues:

Datatypes:
The "int" datatype. Depending on which compiler, dialect, version of language, and platform you're on it varies in size. It historically was "the native word size of the platform". Which means that on 16bit machines it should be 16bit, on 32bit platforms it should be 32bit, on 64bit platforms it should be 64bit. However this does not hold true today since it now is defined to only be guaranteed to hold -32768 to 32767 (16bit, signed).
Do note that this already makes it weird it for machines with a word size smaller than 16bit, and also to make it even funnier on 32bit machines where it can be either 16bit or 32bit, and on 64bit machines it is normally (almost - but not quite - always) 32bit as well.
So, an "int" can only be assumed to be "at least 16bit", and on embedded you really should read the datasheets and specs anyways.
Now add to that you often also can change the compiler behaviour by selecting different methods of packing.
And yes, "int" is the most common datatype.

Memory alignment:
Take the simple declaration "int a, b;" and tell me how that is arranged in memory. Is it 4 bytes (16bit*2), is it 8 bytes (32bit*2), or is it 16 bytes (64bit*2)? (The last might happen either if the int is 8 bytes or if it is aligned to match memory boundaries.) Also, does 'a' come before 'b' in physical memory? Is there something between them (padding and such)? And overflowing 'a' in such a way that it would not be caught - will that alter 'b', or will it cause a memory access violation?

Even funnier is when the runtime of your compiler does not exactly match the settings of the specific build of the libraries you're using (so yes, you can end up with b=a+a; working inline, but calling a function that does b=a+a; crashing - even when fed the exact same datatypes and values).

Long story short - every place where you've made an assumption can bite you when you jump platforms (this is why you often see a "datatypes.h" in multi-platform projects)

How this makes stuff run smoother: assume an overflowing variable isn't caught by the runtime of one compiler but instead overwrites whatever is next in memory. With this compiler it will corrupt the data in the following variable, and that corruption can cause undesired behaviour in a place that isn't even near the code that overflowed (even worse, the undesired behaviour can arise in perfectly correct code). But if you try the same code in a compiler with a stricter runtime, it will crash at the overflow itself.
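
And a small sketch of the padding point (a hypothetical struct, just to show the mechanics):

#include <cstddef>
#include <cstdint>
#include <cstdio>

struct Packet {
    std::uint8_t  tag;    // 1 byte...
                          // ...then the compiler may insert padding here
    std::uint32_t value;  // so this lands on a 4-byte boundary
};

int main() {
    // 5 bytes of data, but sizeof(Packet) is typically 8 - and a different
    // compiler, packing setting, or platform can change both numbers,
    // which is why dumping structs to disk/network byte-for-byte breaks.
    std::printf("sizeof(Packet)  = %zu\n", sizeof(Packet));
    std::printf("offsetof(value) = %zu\n", offsetof(Packet, value));
    return 0;
}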
fabertawe 11 Jan 2019
I liked the look of this game anyway and will definitely be buying when it's out of early access (or earlier, funds and time permitting) ^_^
F.Ultra 12 Jan 2019
  • Supporter
Also 100% hilarious to read all the mansplaining tweets from Windows devs to Ethan Lee "informing" him of how impossible it is to develop for Linux and how fragmented it is. Having ported games to Linux for over a decade has probably not given him any form of experience in the field, so let's inform him!!!