
After Canonical announced earlier this year that they would be ending 32-bit support, and then adjusted their plans following the backlash, they've now posted which packages they intend to continue supporting.

Canonical's Steve Langasek posted a list on their Discourse forum of the packages for which they "have been able to determine there is user demand based on the feedback up to this point" and which they will "carry forward to 20.04", along with any other packages those depend on that aren't directly named in the list.

Additionally, their methodology for picking the packages included ensuring that some well-known applications continue working, like Unity, Godot, printer drivers and more. The list includes noteworthy items such as SDL 2, Wine, DXVK, Steam, some Mesa packages, a few open source games and so on.

See the full post here, where Langasek asked for feedback if you feel essential 32-bit packages are missing from the list. It's good to see some clarity surrounding it; hopefully this won't cause any issues now.

Shmerl Sep 17, 2019
The dumb thing is these packages are mostly handled by the build system. So there isn't actually a person who manually builds these, they just hit a build server, and you know Debian isn't going to drop 32bit support anytime soon, Ubuntu is just trying to be like Apple.

Problems might start creeping in when upstream (i.e. library developers) decides that supporting 32-bit is too much of a burden. It's usually not as simple as "just build it".


Last edited by Shmerl on 17 September 2019 at 7:46 pm UTC
Shmerl Sep 17, 2019
As far as I remember then libraries that interface with drivers will have to be the same version, so nvidia and mesa will have to be a current version in the container. The programs in the container run on the same kernel.
And in the Ubuntu world frozen does not mean unmaintained, even if it sometimes seems like it or even is. Security and other bugfixes will be backported to the "frozen" version.

Why do you need containers then, if libraries there will be recent? Multiarch already works fine for that. Containers make sense for frozen case, when there are no more upstream updates coming.


Last edited by Shmerl on 17 September 2019 at 7:45 pm UTC
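For readers who haven't used it, the multiarch mechanism Shmerl refers to is dpkg's support for co-installing packages from a foreign architecture next to the native ones. A minimal sketch of how it is typically enabled on a Debian/Ubuntu system follows (run as root; the SDL library installed at the end is only an illustrative example, not anything specific to the Ubuntu plan discussed here):

```python
# Minimal sketch of enabling dpkg multiarch on a Debian/Ubuntu system.
# Requires root privileges; the package name at the end is only an example.
import subprocess

def run(cmd):
    """Run a command and fail loudly if it errors."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Report which foreign architectures dpkg already knows about.
out = subprocess.run(["dpkg", "--print-foreign-architectures"],
                     capture_output=True, text=True, check=True)
print("Foreign architectures:", out.stdout.strip() or "(none)")

if "i386" not in out.stdout:
    run(["dpkg", "--add-architecture", "i386"])  # register i386 as a foreign arch
    run(["apt-get", "update"])                   # refresh indexes for the new arch

# Example: install a 32-bit library alongside its 64-bit counterpart.
run(["apt-get", "install", "-y", "libsdl2-2.0-0:i386"])
```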
Redface Sep 17, 2019
The dumb thing is these packages are mostly handled by the build system. So there isn't actually a person who manually builds these, they just hit a build server, and you know Debian isn't going to drop 32bit support anytime soon, Ubuntu is just trying to be like Apple.

The build system is set up to support 32 bit installs on 32-bit i386 processors, which Debian supports for all distributions, and Ubuntu still does for LTS 16.04 and LTS 18.04. But since there is no 32 bit installer for 18.04 or newer, and no upgrade path for those on 32 bit processors, most 32 bit packages since 18.10 will never be installed by a user.

But they still can fail to build and a maintainer has to look into why.

There are also really a lot of those. Ubuntu has around 200 packages on the list for now; let's say that grows to 500.
I earlier counted 33365 available 32 bit packages on 19.04, so call it roughly 28000 packages' worth of wasted effort and resources. And Ubuntu always has around 5 different releases supported or under development, so we are now at around 100000 packages built that no one uses.

Apart from maintainer time this uses electricity, storage and bandwidth, which all could be used for something useful instead.

Apple is, AFAIK, completely disabling running 32 bit programs except from inside virtual machines, and soon there will be no macOS version with 32 bit support that is still supported. Ubuntu wants to stop building packages no one uses and, in the future, find another solution for the few hundred packages left that are needed to run all kinds of 32 bit programs. But even in the first plan they wanted to make sure users could still run 32 bit programs; the solution for that just wasn't ready yet, so they came up with this plan.
So really not like Apple.
How is that similar?
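To make the package-count estimate above explicit, here is the same back-of-envelope arithmetic written out; the figures are the ones quoted in the comment, everything else is rounding:

```python
# Back-of-envelope check of the figures quoted above (all numbers are rough).
available_i386 = 33365   # 32-bit packages counted as available on 19.04
kept           = 500     # generous guess for the curated list growing from ~200
releases       = 5       # Ubuntu releases typically supported or in development at once

unused_per_release = available_i386 - kept
print(f"roughly {unused_per_release:,} unused 32-bit builds per release")
print(f"roughly {unused_per_release * releases:,} across {releases} releases")
# The comment rounds these down to ~28,000 and ~100,000; the order of
# magnitude is the point, not the exact totals.
```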
Redface Sep 17, 2019
Why do you need containers then, if libraries there will be recent? Multiarch already works fine for that. Containers make sense for frozen case, when there are no more upstream updates coming.

Mixed cases, some libraries do not always stay backward compatible, while the kernel tries to never break userspace.

So the games might work with current graphics drivers but not with some network library or whatever.

Did you never encounter games or other programs that have problems with newer libraries?

I am not convinced that this is the way forward to run old games, but it is better than virtual machines since containers can get access to all the parts of the system you want.

And containers were the original plan; for now, until at least 20.04, Ubuntu will build all requested packages for 32 bit, and who knows, maybe for the next distribution after that too, but I understand they do not want to commit to that now. 20.04 will be supported until April 2025, so that is already a long time they will stick to this new scheme, for that release at least.


Last edited by Redface on 17 September 2019 at 8:16 pm UTC
slaapliedje Sep 17, 2019
In reply to Redface's comment above:

It is similar because they wanted to ditch 32 bit outright until everyone threatened to stop using it
slaapliedje Sep 17, 2019
In reply to Redface's comment above:

By the way, the i386 packages and the packages for the 32bit version of the distribution are the same thing, so it is the fault of Ubuntu itself for ditching the 32bit installer ISO. Debian still supports it (as well as many other architectures) and Ubuntu just rebuilds the Debian packages. So is it REALLY that much more effort for them to continue to do so?
tonR Sep 17, 2019
Additionally, their methodology for picking the packages included ensuring some well-known apps continue working like Unity, Godot, printer drivers and more.
Well, to me it sounds like Canonical wanted to try a "pay to maintain" model with some "profitable entities".

We need to understand that the 32 bit architecture and its applications are (and remain) a unique situation in computer history: they grew in popularity as the PC market boomed in the late 1990s / early 2000s, helped by the popularity and reliability of Win XP, which no Windows before or after has matched. Much like Android apps today, many software developers at the time built their programs around Win XP and 32 bit. A lot of that software is still in use, reliable, sometimes even up-to-date, and more importantly much of it runs perfectly under Wine, better than on some modern Windows.

So, let's say 32bit "things" (and all the arguments for their existence) will stay around for a very long time.
slaapliedje Sep 17, 2019
In reply to the build-system comment and Shmerl's response above:

Never actually seen it that way. I have seen the opposite, where things won't build on 64bit because of machine code and such, but most languages are abstracted enough for that, and since 64bit CPUs are just an extension of the old x86, I can't imagine there being many cases of "we can't compile this on 32bit, but it works on 64bit just fine!" Unless they start requiring more than 4GB of memory, there is no reason for it.
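As a small aside on the 32-bit versus 64-bit build question, the practical difference for most programs is pointer width and the resulting 4 GiB address-space ceiling; a quick, purely illustrative way to check which kind of process you are running:

```python
# Purely illustrative: report whether the running Python interpreter is a
# 32-bit or 64-bit build, and the address-space ceiling that implies.
import platform
import struct

pointer_bits = struct.calcsize("P") * 8   # width of a C pointer in this build
print(f"this interpreter is a {pointer_bits}-bit build "
      f"({platform.architecture()[0]})")

if pointer_bits == 32:
    # A 32-bit process can address at most 2**32 bytes, i.e. 4 GiB, which is
    # the usual practical reason a project ends up 64-bit only.
    print(f"address space ceiling: {2**32 // 2**30} GiB")
```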
Shmerl Sep 18, 2019
In theory, but in practice it's not always that simple.
Phlebiac Sep 18, 2019
Canonical's resources could be used more productively

Canonical has never seemed too interested in using their resources productively; rather they do things like Mir, Unity, Snap... although recently one of their developers *has* been submitting GNOME patches. A pleasant change to see them actually contributing to upstream projects.
Arten Sep 18, 2019
In all honesty, 32 bit stuff DOES need to go at some point. I mean, for how long is Linux supposed to carry on that old baggage?

That Steam (which is one of the most important Linux applications there is, and is maintained by a multi-billion dollar business) STILL doesn't have a 64 bit client is quite frankly unforgivable.

I would really think they should agree on a reasonable grace period and then elbow people into finally updating their legacy 32 bit apps. If after that date, people still -really- insist on running decades-old software or even older hardware, they can still maintain and build these packages themselves. It's open source software, after all.

A couple of years longer than Windows, if we want Linux to be widely used as a desktop OS. You know that on Windows development still happens in Visual Studio 2019, which is partially a 32bit application? (The main process is 32bit.) Lots of 32 bit apps are still being developed and no one has the courage to change that, because the transition can break too many things and cost millions in damages (medical software, accounting software). And there is one more thing: lots of games, even new 64bit games, use a 32bit launcher. If you want to run a new game on Linux, you still need to run 32bit Wine.


Last edited by Arten on 18 September 2019 at 6:10 am UTC
oldrocker99 Sep 18, 2019
I moved from 11 years of Ubuntu to Manjaro, so I could have programs which Canonical has, over the years, decided to remove, because science.

All the software that Canonical has removed from their repos is still available in the AUR. It's also faster than any Ubuntu flavor, running ~20% of the daemons that Ubuntu does at boot, idle.

And you only need to install it once. After 11 years of installing new versions of Ubuntu, I've finally moved to a rolling release.

You don't get "RTFM" answers for your questions, unlike the rather arrogant Arch forums; it's pretty much like the Ubuntu Forums. Manjaro does test new programs from Arch, so no Win10-like breaking of a system after an update. Besides, it's easy to downgrade a problematic file.

It's as easy to install as Ubuntu, and it is as suitable for rank n00bz as for old hands.

HIGHLY recommended.
Redface Sep 18, 2019
It is similar because they wanted to ditch 32 bit outright until everyone threatened to stop using it

That is not true; do not believe online trolls or blog posts based on them. Granted, the communication from Canonical could have been a lot better, but much of the controversy was based on misunderstandings or outright lies.

From the original announcement at https://discourse.ubuntu.com/t/intel-32bit-packages-on-ubuntu-from-19-10-onwards/11263/2

Q. Doesn’t Steam use 32 bit libraries? How can I play my games?

Steam itself bundles a runtime containing necessary 32-bit libraries required to run the Steam client. In addition each game installed via Steam may ship 32-bit libraries they require. We’re in discussions with Valve about the best way to provide support from 19.10 onwards.

It may be possible to run 32 bit only games inside a lxd container running a 32 bit version of 18.04 LTS. You can pass through the graphics card to the container and run your games from that 32bit environment.

Q. How can I run 32-bit Windows applications if 32-bit WINE isn’t available in the archive?

Try 64-bit WINE first. Many applications will “just work”. If not use similar strategies as for 32 bit games. That is use an 18.04 LTS based Virtual Machine or LXD container that has full access to multiarch 32-bit WINE and related libraries.

Q. I have a legacy proprietary 32-bit Linux application on my 64-bit installation. How can I continue running it.

Run an older release of Ubuntu which supports i386, such as 16.04 LTS or, preferably 18.04 LTS in a Virtual Machine or LXD container as above.

And if that is not enough look what Valve wrote: https://steamcommunity.com/app/221410/discussions/0/1640915206447625383/
There's a lot more to the technical and non-technical reasons behind our concerns, but the bottom line is that we would have had to drop what we're doing and scramble to support the new scheme in time for 19.10. We weren't confident we could do that without passing some of the churn to our users, and it would not solve the problems for third-party software outside of Steam upon which many of our users rely.

So they did not like that, and I and many others did not either, but it still would have been possible to run Steam and other 32bit programs.
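The container route described in the quoted FAQ isn't spelled out step by step anywhere in the announcement; below is a hedged sketch, using Python only as a thin wrapper around the lxc command line, of what launching an 18.04 i386 LXD container with GPU and X11 access could look like. The container name, image alias and device names are assumptions for illustration, not taken from the announcement.

```python
# Hedged sketch of the LXD approach in the quoted FAQ: a 32-bit Ubuntu 18.04
# container with the host GPU and X11 socket shared in. Container name, image
# alias and device names below are illustrative assumptions.
import subprocess

def lxc(*args):
    """Run an lxc subcommand and stop on failure."""
    print("+ lxc", " ".join(args))
    subprocess.run(["lxc", *args], check=True)

NAME = "bionic32"  # hypothetical container name

# Launch an i386 18.04 container (check `lxc image list ubuntu: 18.04` first;
# the availability of this image alias is an assumption).
lxc("launch", "ubuntu:18.04/i386", NAME)

# Pass the host GPU through so 32-bit games inside get hardware acceleration.
lxc("config", "device", "add", NAME, "gpu", "gpu")

# Share the host X11 socket so graphical applications in the container can display.
lxc("config", "device", "add", NAME, "x11", "disk",
    "source=/tmp/.X11-unix", "path=/tmp/.X11-unix")

# From here you would install Steam or 32-bit Wine inside the container, e.g.:
#   lxc exec bionic32 -- apt-get install -y wine32
```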
slaapliedje Sep 19, 2019
In reply to Redface's comment above:

It isn't that I believe online trolls; I still see their original post as "just use snaps", because that's exactly what it says. It also says that if you have 32bit hardware, sorry bro, move to something else or stick with 18.04 until it's EOL. Granted that's 5 more years, and yes, you probably shouldn't be running 32bit hardware at this point, but looking at it from another point of view, they're not vulnerable to the whole speculative execution bullshit :P
slaapliedje Sep 19, 2019
In reply to oldrocker99's comment above:

To be fair to the Arch forums, they do have REALLY good manuals, probably the best I've seen for any distribution. Debian needs more volunteers to keep theirs up to date; most of them still only cover releases that are three or more versions old. Debian also has some largely unused forums. But then most people running Debian usually know what they're doing and only occasionally need some advice.

Ubuntu's forums were so good because it was the easy Debian for such a long time, and it took off really well. I stopped using it once they started diverging from mainline Debian so much, with upstart, then Unity, and so on. I wanted to like it again when they dropped Unity and went back to standard GNOME, but then they started this crap with snaps and removing a bunch of things from their repos, so I just went back to Debian, which I've been using for a little more than two decades.