Recently, users on more bleeding-edge Linux distributions noticed that after updating, Easy Anti-Cheat no longer worked on Linux. The culprit was glibc, and now a Valve developer has spoken out about it.
Writing in a small thread on Twitter, Valve developer Pierre-Loup Griffais said:
Unfortunate that upstream glibc discussion on DT_HASH isn't coming out strongly in favor of prioritizing compatibility with pre-existing applications. Every such instance contributes to damaging the idea of desktop Linux as a viable target for third-party developers.
Our thoughts on the topic from this prior compatibility issue in BlueZ apply more than ever: https://github.com/bluez/bluez/commit/35a2c50437cca4d26ac6537ce3a964bb509c9b62#commitcomment-56028543
It is unfortunately yet another entry in a growing list over the years. We understand that working with a focus on compatibility requires more resources and more engineering trade-offs, but strongly believe it is nonetheless the way to go. We are very interested in helping with any underlying resource constraints.
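For those curious about the technical detail: the dispute is over glibc 2.36 being linked with only the newer GNU-style symbol hash table (DT_GNU_HASH), dropping the classic SysV DT_HASH table that Easy Anti-Cheat apparently read directly. Here's a minimal sketch (our own illustration, not Valve's or glibc's code) that walks every loaded ELF object and reports which hash tables it advertises:

/* Minimal sketch: report which symbol hash tables each loaded ELF object
 * advertises in its PT_DYNAMIC segment. On glibc 2.36 (built with
 * --hash-style=gnu), libc.so.6 carries DT_GNU_HASH but no longer DT_HASH,
 * which is what broke consumers that parsed DT_HASH directly.
 * Build: gcc -Wall dt_hash_check.c */
#define _GNU_SOURCE
#include <elf.h>
#include <link.h>
#include <stdio.h>

static int report(struct dl_phdr_info *info, size_t size, void *data)
{
    (void)size; (void)data;
    for (int i = 0; i < info->dlpi_phnum; i++) {
        if (info->dlpi_phdr[i].p_type != PT_DYNAMIC)
            continue;
        /* The dynamic section is an array of (tag, value) entries. */
        ElfW(Dyn) *dyn = (ElfW(Dyn) *)(info->dlpi_addr +
                                       info->dlpi_phdr[i].p_vaddr);
        int has_hash = 0, has_gnu_hash = 0;
        for (; dyn->d_tag != DT_NULL; dyn++) {
            if (dyn->d_tag == DT_HASH)     has_hash = 1;
            if (dyn->d_tag == DT_GNU_HASH) has_gnu_hash = 1;
        }
        printf("%-40s DT_HASH=%d DT_GNU_HASH=%d\n",
               info->dlpi_name[0] ? info->dlpi_name : "(main program)",
               has_hash, has_gnu_hash);
    }
    return 0; /* 0 = keep iterating over loaded objects */
}

int main(void)
{
    dl_iterate_phdr(report, NULL);
    return 0;
}

The same tables are also visible statically with readelf -d /usr/lib/libc.so.6.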
This prompted CodeWeavers (who work on Wine and with Valve on Proton) developer Arek Hiler to write a blog post titled "Win32 Is The Only Stable ABI on Linux", and its closing statement is something people should think on:
I think this whole situation shows why creating native games for Linux is challenging. It’s hard to blame developers for targeting Windows and relying on Wine + friends. It’s just much more stable and much less likely to break and stay broken.
Hiler certainly isn't the only one to think like that, with another CodeWeavers developer, Andrew Eikum, mentioning on Hacker News some time ago:
As a long-time Linux dev (see my profile), I have also found this to be true. Linux userland APIs are unstable and change all the time. Some transitions that come to mind that have affected me personally: ALSA->pulse; libudev->libudev2->systemd; gstreamer 0.10->1.0. All of those changes required modifications to my software, and the backwards-compat tools that are provided are buggy and insufficient. Meanwhile, you can still write and run winmm[1] applications on Windows 10, and they will work in almost all cases. It's simply the case that the win32 API is more stable than Linux userland APIs, so it's entirely plausible that games will run better in Wine, which shares that stable ABI, than they will on Linux, especially as time goes on and Linux userland shifts yet again.
[1] winmm dates to the Windows 3.x days!
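To make that concrete, here's a minimal sketch (our illustration, not Eikum's code) of a winmm call that still builds and runs on a current Windows toolchain; it assumes MinGW or MSVC with winmm linked in:

/* Minimal sketch: timeGetTime() comes from winmm and dates back to the
 * Windows 3.x multimedia extensions, yet still links and runs on
 * Windows 10/11. Build (MinGW): gcc winmm_demo.c -lwinmm */
#include <windows.h>
#include <mmsystem.h>   /* the winmm API: timeGetTime, PlaySound, ... */
#include <stdio.h>

int main(void)
{
    DWORD start = timeGetTime();   /* milliseconds since system start */
    Sleep(100);
    DWORD elapsed = timeGetTime() - start;
    printf("~%lu ms elapsed via a 1990s-era API\n", (unsigned long)elapsed);
    return 0;
}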
Situations like this can be pretty messy, and this is not a case of open source versus secret closed-source anti-cheat stuff either, since the glibc issue also affected a native Linux game (Shovel Knight) and the Linux software libstrangle. No doubt there are other things yet to be discovered that were broken by the change.
It is of course also a case that Linux distributions need to ensure they do quality assurance testing, especially for gaming, which can show up issues quite easily. Bleeding-edge distributions can, and clearly do, end up breaking things by pulling in new software so quickly.
Quoting: dibz
Quote: "...on more bleeding-edge Linux distributions..."
Rarely do I run into issues on "bleeding edge" distributions. It's also important to note that "bleeding edge" distributions aren't generally using bleeding-edge versions of software. Latest upstream stable versions? Yes.
Granted, when I think bleeding edge, I think software in the beta stage (complete but still testing). There is a HUGE problem with certain distributions still using versions of upstream software that the upstream project has made end-of-life (Python 2.x, anyone?), which is not a good practice in today's security environment.
Quoting: minidou
Nothing got broken. A function with two decades of deprecation got removed, but nothing broke. Or do we just expect everything to be maintained forever?
As Arek Hiler mentioned in his blog post, that "two decade deprecation" was very poorly documented, and warnings were not easily seen, found, or displayed.
For those 16 years, it was glibc that provided the compatibility and overrode the defaults for everyone, and there never were any easy-to-spot deprecation warnings.
It’s also unrealistic to expect every ELF consumer to keep up with undocumented developments of the format.
Quote
I don't expect anyone to check, I expect a CI or a quality gate to stop them from shipping.
Can't have a CI check if the change is poorly documented and not made glaringly obvious to the developer. Working as a software dev myself, I generally do not go to project pages and discussions to find out more about the library I'm using. I rely on the documentation and any output the library gives me. I don't care if it was deprecated for almost 20 years: if it wasn't well documented where developers could see it AND warnings were obviously placed, then I'm probably not going to see it.
Even then, glibc made a breaking change without bumping the major version. That's a big break in protocol. In most versioning schemes, the major version is used to denote breaking changes. Regardless of whether a change will actually break anything for a given user, if it breaks any existing functionality, it's a breaking change.
Quote
I'll call it bad practice, or just not being up to 2022 standards.

I'm sorry, but my terminal generally doesn't have enough scrollback to see all the deprecation warnings in a compilation, let alone an easy way to highlight them well. Not to mention there are a LOT of deprecation warnings in compiled software, to the point where developers get so overloaded by warnings that amount to nothing that they just start ignoring them.
A better solution is to update the documentation and put out a big notification ahead of releasing a breaking change.
Last edited by EagleDelta on 17 August 2022 at 2:32 pm UTC
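For what it's worth, the compile-time signal being asked for above does exist in the C toolchain. Here's a minimal sketch (hypothetical function names, not glibc code) using GCC/Clang's deprecated attribute, which warns at every call site; note that DT_HASH is a linker-produced data structure rather than a C function, which is precisely why no warning like this ever fired for it:

/* Minimal sketch of a visible deprecation: marking a function with the
 * deprecated attribute makes the compiler warn at each point of use.
 * old_lookup/new_lookup are hypothetical names for illustration.
 * Build: gcc -Wall deprecation_demo.c */
#include <stdio.h>

__attribute__((deprecated("use new_lookup() instead; removal planned")))
static int old_lookup(int key) { return key * 2; }

static int new_lookup(int key) { return key * 2; }

int main(void)
{
    /* gcc -Wall prints: warning: 'old_lookup' is deprecated ... */
    printf("%d\n", old_lookup(21));
    printf("%d\n", new_lookup(21));
    return 0;
}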
Whether this is appropriate should be, and will be, discussed in glibc's channels, and depends only on their goals as API developers. It is very much fine to break your ABI _if_ that is the kind of API you are providing. The kernel is not, but glibc's developers have to decide whether their "backwards compatible, portable, and high performance ISO C library" is and should be.
Depending on the outcome of that, the distro or runtime maintainers (e.g. Valve) will discuss whether or how glibc is appropriate to use for their goals.
I personally hope that something changes in one of these points.
To add a personal note from a non-game-developer: I'm also annoyed by glibc but don't know if we'd fare better with others like musl (any experiences?). There have been breaking changes in the past even for applications that don't use it directly. We keep recompiling a lot after every dependency change, against multiple versions and their compatibility with others, in order to ensure everything still works. For example, I currently cannot release any binaries compiled on my machine because of breaking changes when using OpenSSL with glibc (whoops!). As a maintainer I've had lots of integration failures because of glibc. This costs a lot of time, especially when testing (and waiting for all the tests to release a security fix that needs to be pushed NOW). So I can relate to any game developer who just wants to fire and forget their finished fun little puzzle game.
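On the "binaries built on my machine won't run elsewhere" point: glibc symbols are versioned, and linking on a newer system records the newest version node, which an older glibc then refuses to load. Here's a minimal sketch (an assumption about an x86-64 target, where GLIBC_2.2.5 is the baseline version node) of the classic, admittedly fragile, workaround of pinning an older node with .symver:

/* Minimal sketch: pin memcpy to an old glibc version node so the binary
 * also loads on systems with an older glibc. GLIBC_2.2.5 is the x86-64
 * baseline; adjust (or drop this entirely) for other targets.
 * Build: gcc -Wall symver_demo.c */
#include <string.h>
#include <stdio.h>

/* Tell the assembler to bind our memcpy references to the old node. */
__asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

int main(void)
{
    char dst[16];
    memcpy(dst, "portable", 9);   /* resolves to memcpy@GLIBC_2.2.5 */
    puts(dst);
    return 0;
}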
Case A:
At one point a Windows Server update contained a breaking change that crashed our test machine; we reported it to Microsoft, and two days later we got a new update reverting the previous one. Business as usual. We were just one company out of the many, many companies that run on Microsoft's platforms, yet they immediately gave us a quick line to their experts and made a revert, because breaking applications is not OK.
Case B:
We had another issue with a small application that relied on an external device to access a proprietary system. The driver for said device only supported Windows 2000, so that machine had to be completely quarantined and we had to keep using an old Win2k system just to keep the application running. We could not get the hardware devs to ship updated drivers, or the application devs to ship an update with support for newer hardware, because both companies had gone bankrupt and disappeared a decade earlier. A decade later, I expect that application is still crucial, even though it was only ever used for one small function.
This is reality. This is how the situation is for professionals, whether you are talking about IT systems or heck even infrastructure.
PS. Yeah, my customers were not your average Pa & Ma's Baking Corner but a global enterprise, so yes, we had a bit of weight to throw around in case A. And they absolutely had the finances to make replacement software in case B, but that would have taken years of development and testing to make sure it still worked the same, and since it was a proprietary system they might not even have been legally able to do it.
PPS. We have seen an analog version of this recently with trains to/from Ukraine and the different rail track gauges. I found a good map on jakubmarian.com. Now replace that with glibc versions.
Quoting: EagleDelta
Quoting: dibz
Quote: "...on more bleeding-edge Linux distributions..."
Rarely do I run into issues on "bleeding edge" distributions. It's also important to note that "bleeding edge" distributions aren't generally using bleeding-edge versions of software. Latest upstream stable versions? Yes.
Granted, when I think bleeding edge, I think software in the beta stage (complete but still testing). There is a HUGE problem with certain distributions still using versions of upstream software that the upstream project has made end-of-life (Python 2.x, anyone?), which is not a good practice in today's security environment.
Honestly, when I saw the headline with Valve and glibc, my very first thought before reading the article was "that's what you get for choosing Arch for newer packages", which was their stated reason for switching to Arch in the past. Once I read the article I found out that wasn't the case, and they were just commenting on the situation.
I'll admit my idea of bleeding edge may also be an outdated view of the difference, which would be a shame if that's the case. Why on earth would people blur that line on purpose? It's silly, because the term becomes meaningless if people put it on everything. Bleeding edge used to mean, essentially, nightly: not only was compatibility not guaranteed between updates, things could completely break.
Personally, I still prefer how Debian labels things with stable / testing / unstable. It's clear what it is and isn't; I would consider unstable to be their bleeding edge. I'm not advocating using Debian or anything, it's just a good example.
Quoting: dibz
Personally, I still prefer how Debian labels things with stable / testing / unstable. It's clear what it is and isn't; I would consider unstable to be their bleeding edge. I'm not advocating using Debian or anything, it's just a good example.
The problem with that is that, as great as Debian is, "stable" runs a LOT of software that the upstream developers no longer support at all, and haven't for years.
Which, IMO, is very different from what's happening here. glibc accidentally broke some software and games, but refuses to revert the change. Most projects don't take that kind of stance: they revert breaking changes and work on a future resolution to the problem instead of pointing blame at someone else.
As for comments on Valve using an Arch base: that's needed for what they do, as fixes for gaming specifically are only found in newer software, drivers, and Linux kernels. Running something like Debian stable would be a nightmare for gaming, as things go out of date and become unusable for a lot of games really fast, especially newer games, and especially if those games (or Wine/DXVK/etc.) rely on newer driver features that may require newer kernel features to function.
Which is where part of the issue lies with gaming and Linux. A lot of "vocal" Linux users and devs want game devs to develop for Linux, but ALSO to conform to what those "vocal" users/devs think is the "right way", and people don't work like that. Linux has to go to THEM, make their lives easier when working with Linux, and do so in a way that THEY are familiar with, or they will just nope out and not care. And that will happen, because the perception is that we, the Linux community, don't care about their perspective... so why should they care about us?
I don't know if anyone is right or wrong on this particular topic, but it shows one thing: Linux on the desktop is several products pieced together (hence why it's called a "distribution"), so sometimes when a project goes its own way (and rightly so, it's their own work we're talking about), the other actors of said Linux desktop don't always take the time or resources to work on the issues that could potentially come from it (and rightly so, for the exact same reason).
I think it's part of how things are on Linux; it's not necessarily good or bad, and I've learned to live with it as a matter of fact. Things could certainly be improved, though.
Last edited by omer666 on 17 August 2022 at 5:43 pm UTC
Quoting: EagleDelta
Quoting: dibz
Personally, I still prefer how Debian labels things with stable / testing / unstable. It's clear what it is and isn't; I would consider unstable to be their bleeding edge. I'm not advocating using Debian or anything, it's just a good example.

The problem with that is that, as great as Debian is, "stable" runs a LOT of software that the upstream developers no longer support at all, and haven't for years.

Which, IMO, is very different from what's happening here. glibc accidentally broke some software and games, but refuses to revert the change. Most projects don't take that kind of stance: they revert breaking changes and work on a future resolution to the problem instead of pointing blame at someone else.

As for comments on Valve using an Arch base: that's needed for what they do, as fixes for gaming specifically are only found in newer software, drivers, and Linux kernels. Running something like Debian stable would be a nightmare for gaming, as things go out of date and become unusable for a lot of games really fast, especially newer games, and especially if those games (or Wine/DXVK/etc.) rely on newer driver features that may require newer kernel features to function.

Which is where part of the issue lies with gaming and Linux. A lot of "vocal" Linux users and devs want game devs to develop for Linux, but ALSO to conform to what those "vocal" users/devs think is the "right way", and people don't work like that. Linux has to go to THEM, make their lives easier when working with Linux, and do so in a way that THEY are familiar with, or they will just nope out and not care. And that will happen, because the perception is that we, the Linux community, don't care about their perspective... so why should they care about us?
I wasn't actually advocating using Debian stable (which I noted); it's simply a good example of a distro where updates are extremely unlikely to break it. Not that it's impossible: stable distros still receive security updates while they're supported, and you never know, some app or another might depend on the patched behaviour.
Similar with Arch. I probably needed a /s for sarcasm. It's fine if you use Arch, by the way.
Yeah, one of the best and worst things about Linux is the vocal community. And let's be honest, the people that flock to it tend to have very strong personalities -- borderline autistic at times. Bad advice is extremely rampant, oftentimes based simply on personal preference, OR it puts the advisor in a position of power where the advice seeker is forced to continue asking for help. It can be difficult for first-timers to tell the difference. On the other hand, if one is able to navigate those waters, it's an amazing community with many different perspectives and oftentimes intelligent, reasoned discourse. Heck, look at the comment section on this site sometimes; it's rarely to the point of "requiring moderation" so to speak, but it can get fairly intense at times.
Unfortunately, and this isn't fair to engineers, but I tend to call the behaviour you describe the "engineer attitude". I'm a sysops guy, and I run into it a lot when dealing with purely-engineer folks, especially in recent years (the last 5, maybe 10). The my-problem/not-my-problem strictness, which in the workplace can absolutely be necessary, but which can be, and often is, abused as well. Some will recognise this as strongly related to common workflows in practice today: you're a lot more productive the more work that isn't your problem, lol. Honestly, it might be a little fairer to ascribe this to project managers, but certain personalities blend really well with it.
Last edited by dibz on 17 August 2022 at 3:38 pm UTC
Quoting: shorberg
Case A:
At one point a Windows Server update contained a breaking change that crashed our test machine; we reported it to Microsoft, and two days later we got a new update reverting the previous one. Business as usual. We were just one company out of the many, many companies that run on Microsoft's platforms, yet they immediately gave us a quick line to their experts and made a revert, because breaking applications is not OK.
Being an admin too, I can relate to that. The only "critique" I can maybe make is that you are talking about Windows Server here, so I don't know if Microsoft reverting to the previous version was due to the company you worked for being huge, or the problem being on Windows Server. On the server side, stability is an important thing.
This year, an important piece of software at a company I used to work for randomly ceased to work on Windows 10. Fortunately, they produced a patch rapidly. "Something" was changed in Windows 10, and they had to update some of their code for it.
Both you and I have stories about zombie systems still maintained because something was broken in the Microsoft APIs. So, when I hear Windows APIs are more stable, I'm a bit sceptical. Particularly when I read people writing things like "I can still run software from Windows 3.x on Windows 10". Yes, they can run some, but the vast majority won't work any more. I think 3.x software started to break around the Windows XP era. Even as gamers, how many of our 90s games can we still start today on Windows 10/11? Not many, and the same is happening or has already happened with 00s games. And I think I have a couple of 2010s games that are starting to act weird.
For me, the only thing we can hold the glibc devs accountable for is the poor communication around the deprecation of this function (proprietary software is no better on that one). But let's take it with a grain of salt, because anyone who has been asked to write documentation will tell you nobody likes writing it and nobody wants to write it. If the communication had been done properly, this wouldn't have happened.
On Linux, we are not yet used to running zombie systems, but I feel it will soon come, as there are not only zombie Windows machines out there. I'll now say monster names that will give some sysadmins cold sweats: AS/400, Sun Solaris, OS/2, HP-UX, DOS. They are still out there, waiting in the shadows for your helpdesk to be at its most vulnerable. And then they strike: they have a hiccup, making some big company piss their pants because their whole business relies on them.
Anyway, when I looked into the reason behind the issue... yes, it is hard to blame either the distro or glibc.
Humans will invent general artificial intelligence, and maybe go to Mars, but, if and when that happens, games will still be using 32 bit libraries and running on x86.
Quoting: dibz
Quoting: EagleDelta
Which is where part of the issue lies with gaming and Linux. A lot of "vocal" Linux users and devs want game devs to develop for Linux, but ALSO to conform to what those "vocal" users/devs think is the "right way", and people don't work like that. Linux has to go to THEM, make their lives easier when working with Linux, and do so in a way that THEY are familiar with, or they will just nope out and not care. And that will happen, because the perception is that we, the Linux community, don't care about their perspective... so why should they care about us?
Yeah, one of the best and worst things about Linux is the vocal community. And let's be honest, the people that flock to it tend to have very strong personalities -- borderline autistic at times. Bad advice is extremely rampant, oftentimes based simply on personal preference, OR it puts the advisor in a position of power where the advice seeker is forced to continue asking for help. It can be difficult for first-timers to tell the difference. On the other hand, if one is able to navigate those waters, it's an amazing community with many different perspectives and oftentimes intelligent, reasoned discourse. Heck, look at the comment section on this site sometimes; it's rarely to the point of "requiring moderation" so to speak, but it can get fairly intense at times.

An interesting fact is that the very reason why we can talk about these issues in the first place is precisely because of Linux's open source nature. A Windows user could say "Windows Update broke my toy", but on Linux it's much easier to point out one (group of) person's fault. It is a double-edged sword, really. We are a very tech-savvy, software-engineering-curious community, but like every enthusiast population, we can have very strong-minded opinions.
Last edited by omer666 on 17 August 2022 at 4:43 pm UTC
This is not about Arch, or about Arch's bleeding edge nature.
I don't use Arch and I'm not the kind of guy who wants to run bleeding edge. But if anything, Arch here functioned as a useful canary in the coal mine. If a new glibc had contained bad behaviour that the glibc people just hadn't caught yet, and they had quickly rushed out a bugfix point release that no longer did that, and the only distro that pushed out that first, buggy glibc without waiting was Arch, then it would be about Arch. But that's not the story here. The story here is that the glibc people created the breakage deliberately and are hesitant about the idea of un-creating it. Whether the breakage is justified or not (I'm leaning not), that means everyone, all distros, would in the end find themselves distributing a glibc that creates that breakage. Arch was just the first to expose the issue.
This is doubly not about Valve using Arch as the base for SteamOS, because SteamOS is not a bleeding-edge distro in any way. It just uses a heavily curated snapshot of Arch as a base, but there is no way it would find itself inheriting a problem like this.
Quoting: Jpxe
Isn't Flatpak the solution to this?
I think libc is low-level enough to be considered a fundamental library, even for the above.
Windows has nearly the entire market, so people must adapt.
Also, at one Polish university a device was created, and they had problems writing drivers for it on Windows; they wrote drivers for Linux without problems. Again, this device did not become very popular, because Windows had the entire market, and it stopped existing. Simple: nobody bought it, so nobody produced it.
Quoting: Purple Library Guy
Um, how many of these other thingies are we typically talking about existing at a time?

Run ldd on a binary and count the number of libraries that are referenced.
Quoting: DerpFox
Being an admin too, I can relate to that. The only "critique" I can maybe make is that you are talking about Windows Server here, so I don't know if Microsoft reverting to the previous version was due to the company you worked for being huge, or the problem being on Windows Server. On the server side, stability is an important thing.

Fair critique.
Quoting: DerpFox
This year, an important piece of software at a company I used to work for randomly ceased to work on Windows 10. Fortunately, they produced a patch rapidly. "Something" was changed in Windows 10, and they had to update some of their code for it.

Ain't that a blast. Gotta love the "something" part.
Quoting: DerpFox
Both you and I have stories about zombie systems still maintained because something was broken in the Microsoft APIs. So, when I hear Windows APIs are more stable, I'm a bit sceptical. Particularly when I read people writing things like "I can still run software from Windows 3.x on Windows 10". Yes, they can run some, but the vast majority won't work any more. I think 3.x software started to break around the Windows XP era. Even as gamers, how many of our 90s games can we still start today on Windows 10/11? Not many, and the same is happening or has already happened with 00s games. And I think I have a couple of 2010s games that are starting to act weird.

I'm not sure if that is because of the Win32 API or some other APIs, but it was definitely already the case with XP, as you noted. For the uninitiated (I envy you), that was when consumer Windows stopped being built on top of DOS and moved to the NT platform.
And to add to what DerpFox said, it has also been a big pain point for many gamers with the switch to Windows 10, as $SEARCH_ENGINE can tell you.
Quoting: DerpFox
I'll now say monster names that will give some sysadmins cold sweats: AS/400, Sun Solaris, OS/2, HP-UX, DOS.

Geez... Language, man!
Since you love to throw around scary words like that at poor unsuspecting strangers, I'm gonna have to pay you back and ask if you have read the fine blog by Raymond Chen?
Last edited by Cloversheen on 17 August 2022 at 5:56 pm UTC