There's been an urgent security bulletin circulating in a few places in the Linux sphere today, relating to the XZ tools and liblzma libraries, as certain versions have been compromised.
From the OpenWall security list:
After observing a few odd symptoms around liblzma (part of the xz package) on Debian sid installations over the last weeks (logins with ssh taking a lot of CPU, valgrind errors) I figured out the answer:
The upstream xz repository and the xz tarballs have been backdoored.
At first I thought this was a compromise of debian's package, but it turns out to be upstream.
From what they say, the issue is present in versions 5.6.0 and 5.6.1 of the libraries.
This has led to Red Hat putting up an urgent blog post on the matter, noting that so far Fedora Linux 40 is okay but you should "immediately stop usage of any Fedora Rawhide instances" as they were updated but they're going to be reverting to an older version.
For those not clear on what it is, as Red Hat noted: "xz is a general purpose data compression format present in nearly every Linux distribution, both community projects and commercial product distributions. Essentially, it helps compress (and then decompress) large file formats into smaller, more manageable sizes for sharing via file transfers".
Red Hat also noted the "malicious build interferes with authentication in sshd via systemd" and so "Under the right circumstances this interference could potentially enable a malicious actor to break sshd authentication and gain unauthorized access to the entire system remotely".
Debian also has a security advisory up on it noting that "no Debian stable versions are known to be affected" but the compromised packages were part of "Debian testing, unstable and experimental distributions" which they have reverted as well.
On the Ubuntu side they have a Discourse forum post noting the affected package was removed from "Ubuntu 24.04 LTS (Noble Numbat) proposed builds" and they're continuing to investigate.
It has been assigned CVE-2024-3094 and is rated as a critical issue.
So you'll want to ensure any XZ packages are not at version 5.6.0 or 5.6.1, and check the news directly from your chosen distribution for updates on it.
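A quick shell sketch like the one below can flag the known-bad versions. The version-string parsing here is an assumption (it matches the usual `xz (XZ Utils) X.Y.Z` output); adjust for your distribution's package manager if needed:

```shell
# Flag the two known-compromised xz releases.
is_affected() {
    case "$1" in
        5.6.0*|5.6.1*) return 0 ;;  # compromised upstream releases
        *)             return 1 ;;
    esac
}

# First line looks like "xz (XZ Utils) 5.4.2" -> take the last field.
ver="$(xz --version 2>/dev/null | head -n1 | awk '{print $NF}')"
if is_affected "$ver"; then
    echo "WARNING: xz $ver is a known-compromised release"
else
    echo "xz $ver is not one of the known-bad releases"
fi
```

Note this only checks the version number; your distribution's advisory remains the authoritative source on whether your specific package build was affected.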
Update 02/04/24: the Binarly Research Team announced a new free tool that scans ELF binaries to detect the XZ backdoor.
Quoting: pleasereadthemanual
Gentoo has gone so far as to downgrade to 5.4.2, which is the last release signed by the previous maintainer.
Thanks for the analysis and scrutiny. I did read the news announcement on their website. I don't have an informed opinion at this time, but it's good to know that using the Arch Linux Archive you can go back to that version like this:
pacman -U https://archive.archlinux.org/packages/x/xz/xz-5.4.2-1-x86_64.pkg.tar.zst
Maybe it's time to get into Gentoo and cut the middle man out.
Edit: There is discussion about whether 5.4.x is old enough and some people suggest going back to 5.3.1 in the thread linked in the comment above.
Last edited by ElectricPrism on 30 March 2024 at 10:51 am UTC
Quoting: kaktuspalme
Even worse, Gentoo had marked a potentially affected version as "stable" and rolled it out to their regular (as in: not testing) user-base...
Quoting: bonkmaykr
Gentoo users probably having a field day laughing at us right now compiling all their stuff from scratch.
Doesn't change anything if the sources are fetched from the tarball.
At least they are doing the sane thing and rolling back to the last version from the previous maintainer.
Quoting: pleasereadthemanual
My issue is that Arch, unlike openSUSE, Gentoo, Nix, Fedora Rawhide, Fedora 40/41, Kali Linux, and Homebrew, is continuing to use the 5.6.1 release, just pulling directly from GitHub rather than the binary tarballs which were compromised.
So you are saying they should use the compromised binary tarballs instead? You said it twice already.
I'm not sure you understand the situation and are basically copy-pasting stuff you find elsewhere here pretending to know a thing.
https://bbs.archlinux.org/viewtopic.php?pid=2160841#p2160841
Last edited by sudoer on 30 March 2024 at 10:35 am UTC
1. Why in the hell is anyone still using Github for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
If you can't see the source, how are you going to verify (I) A Signature, and (II) What the software does and does not do.
I could see it being Legally Mandated that ALL SOFTWARE is required to publish their source code to ensure that malicious foreign actors haven't hidden things in the software.
__BINARY CANNOT BE TRUSTED__ -- of course most of you already know this -- Easy-Anti-Cheat? What does it really do? Denuvo? What does it really do? How do we know that it's safe? How can we verify that software is safe?
Binary Cannot Be Trusted. Fuck GitHub.
Does this mean every package ending with .tar.xz is at risk?
Quoting: ElectricPrism
Having read and reflected more, I feel like there are at least 2 points to drive home.
1. Why in the hell is anyone still using Github for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
If you can't see the source, how are you going to verify (I) A Signature, and (II) What the software does and does not do.
I could see it being Legally Mandated that ALL SOFTWARE is required to publish their source code to ensure that malicious foreign actors haven't hidden things in the software.
__BINARY CANNOT BE TRUSTED__ -- of course most of you already know this -- Easy-Anti-Cheat? What does it really do? Denuvo? What does it really do? How do we know that it's safe? How can we verify that software is safe?
Binary Cannot Be Trusted. Fuck GitHub.
I agree with most of what you say. It's why I am trying to use my system with the libre kernel as much as possible, to limit my closed-source footprint from binary blobs that can be updated on a whim and include who knows what in the kernel.

Using the law is not a solution in my eyes though. I don't feel like I have the right to kill someone in order to enforce FOSS, which is exactly what you're asking for if you want to create laws. As soon as you make a law you have to be willing to have people killed to enforce it. If you aren't, then you can't have the law. Now I'm fine with that for things like rape and murder, but do I want it to happen over someone wanting to use software that I don't agree with? No. How about we as people support projects that further the ideals we want, and actually put our money where our mouths are.
Quoting: sudoer
An OS like Haiku for example, multi-user sure, multi-threaded sure, memory-protected sure, "network-aware" sure, but not including ssh and other server/corporate functionality.
A FLOSS OS just for the PC.
You are seriously telling us you have never used SSH in a home environment?
I guess Windows 95 was a server/corporate OS for including Network Neighborhood too.
We are going to see a LOT more of this going forward. Be on your toes.
Quoting: ElectricPrism
1. Why in the hell is anyone still using Github for FOSS? Projects should go independent or literally anywhere else.

In principle I agree with you, and a lot of people agree with you.
In practice it's not so simple. There are good reasons for well-established large and important projects to remain on Github and not migrate to another host (like Gitlab or self-hosting). Some of it is politics (office politics, mailing list politics, or straight up Washington DC politics). Some of it is feasibility. Some of it is because projects are just unbothered by Microsoft owning Github. Some of it is laziness.
In this particular case, I'm not convinced that hosting XZ on GitHub was a factor in this specific supply chain attack.
Quoting: ElectricPrism
Having read and reflected more, I feel like there are at least 2 points to drive home.
1. Why in the hell is anyone still using Github for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
If you can't see the source, how are you going to verify (I) A Signature, and (II) What the software does and does not do.
I could see it being Legally Mandated that ALL SOFTWARE is required to publish their source code to ensure that malicious foreign actors haven't hidden things in the software.
__BINARY CANNOT BE TRUSTED__ -- of course most of you already know this -- Easy-Anti-Cheat? What does it really do? Denuvo? What does it really do? How do we know that it's safe? How can we verify that software is safe?
Binary Cannot Be Trusted. Fuck GitHub.
While there is some truth to what you're saying, this is mostly unrelated to the issue at hand:
1. This would have happened whether the project source code was hosted on GitHub or not. The malicious archives were also available from xz.tukaani.org and the project also has its own git hosting at git.tukaani.org. How does that help mitigate the issue in any way?
2. The affected tarballs were source archives, not binaries. They do happen to contain binary test data, because xz is a tool that reads/writes binary data. This test data is not supposed to be used at build time, only during tests by developers or on a CI setup, and it's not supposed to be executable. The malicious maintainer introduced some compressed scripts and executable code masquerading as test data in there some time ago, and in the last 2 releases they added a build step that reads and executes it at build time. That extra build step, which only exists in the source archives, is human-readable code, not binary. No one trusted the binary that gets loaded in the end, because no one knew it even existed.
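The pattern is easier to see with a toy reproduction. This is purely illustrative -- the filenames and the trivial base64 encoding are made up and share nothing with the real payload, which used doctored xz-compressed files and obfuscated m4/Makefile logic:

```shell
# Toy illustration of the attack pattern -- NOT the real backdoor.
# A file posing as inert test data is actually an encoded script,
# and a build step quietly decodes and executes it.

mkdir -p tests/files

# Step 1: the "test data" the attacker commits to the repository.
printf 'echo COMPROMISED' | base64 > tests/files/good-data.bin

# Step 2: the extra line hidden in the tarball-only build script.
# During a normal build this just looks like test-fixture handling.
base64 -d tests/files/good-data.bin | sh
```

The point being: everything a reviewer would read in the build script looks like plain text, and the payload hides in a file nobody expects to be executable.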
Source Archives Cannot Be Trusted. Fuck Tar.
Quoting: Nic264
Source Archives Cannot Be Trusted.

I'm not convinced this is 100% correct. It is still possible to inject malicious code whether you are downloading the source code directly or via an archive file.
Reproducible builds plus reproducible archives PLUS much better source code auditing is necessary. The first two alone (reproducible builds plus reproducible archives) still leave open the possibility of malicious code (as an openSUSE dev confirmed).
Basically, to prevent this going forward, we will need a paradigm shift in source code handling and security best practices, and a LOT more eyeballs on these critical projects that have the potential to sabotage the entire ecosystem.
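As a sketch of what "reproducible archives" would buy: if release tarballs were always derivable from a signed git tag, anyone could regenerate the archive and diff it against what upstream shipped. The toy below fabricates a throwaway local repo so it is self-contained; a real check would target the upstream repository and its signed tags:

```shell
# Toy demo of checking a release archive against the git tag it
# claims to come from. Uses a throwaway local repo; repo name,
# tag, and file contents are all made up for illustration.
set -e
work=$(mktemp -d)
cd "$work"

git init -q demo
cd demo
echo 'hello' > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm 'init'
git tag v1.0

# The archive upstream should ship: generated straight from the tag.
git archive --format=tar --prefix=demo-1.0/ v1.0 > ../release.tar

# Anyone can regenerate it from git and diff the extracted trees.
mkdir ../from-git ../from-tarball
git archive --format=tar --prefix=demo-1.0/ v1.0 | tar -x -C ../from-git
tar -xf ../release.tar -C ../from-tarball
diff -r ../from-git ../from-tarball && echo "archive matches the git tag"
```

A clean diff would have caught the xz case, where the shipped tarball contained a build step that was never in git -- though, as noted above, it would not catch malicious code committed to the repository itself.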
Last edited by sprocket on 30 March 2024 at 4:19 pm UTC
Quoting: PublicNuisance
As soon as you make a law you have to be willing to have people killed to enforce it. If you aren't then you can't have the law.

You're an American, aren't you? Just a wild guess.
Quoting: Purple Library Guy
Quoting: PublicNuisance
As soon as you make a law you have to be willing to have people killed to enforce it. If you aren't then you can't have the law.
You're an American, aren't you? Just a wild guess.

Last week, I stationed my car at a parking spot without paying for a parking ticket. I was therefore shot by a cop five minutes later.
Advisory: Pay for your parking tickets!
Quoting: Hamish
I guess Windows 95 was a server/corporate OS for including Network Neighborhood too.
Windows 95 was the continuation of 3.11 (a tiling window-manager) on top of MS-DOS, which was developed strictly with the PC in mind: single-user, single-tasking and not network-aware. Then they started adding mutually incompatible systems on top, creating the
Seeing all that, FreeBSD and Haiku make more sense. But they are not "plug'n play"-ready because they lack that manpower.
Last edited by sudoer on 30 March 2024 at 9:38 pm UTC
Quoting: sudoer
GNU/Linux as it is now is a chaotic system developed mainly for server use, a victim of antagonizing corporations, each one with their "own" technologies, simply adopted by the PC, us

Citation needed. And I would suspect that Stallman and Torvalds both would have objections to the statement that the GNU/Linux ecosystem is mainly for server use.
Quoting: sudoer
Seeing all that, FreeBSD and Haiku make more sense.

I've tried to use FreeBSD as a daily desktop driver, and it utterly fails to embrace the advances that desktop Linux has pushed over the last decade. As a server OS it is fantastic at what it does, though, provided the applications you want to use are available.
What does this have to do with the XZ tools issue? Nothing. FreeBSD used the same code as everyone else, and had to do the same audits to determine whether it affected them or not.
Quoting: sprocket
Quoting: sudoer
GNU/Linux as it is now is a chaotic system developed mainly for server use, a victim of antagonizing corporations, each one with their "own" technologies, simply adopted by the PC, us
Citation needed. And I would suspect that Stallman and Torvalds both would have objections to the statement that the GNU/Linux ecosystem is mainly for server use.
Quoting: sudoer
Seeing all that, FreeBSD and Haiku make more sense.
I've tried to use FreeBSD as a daily desktop driver, and it utterly fails to embrace the advances that desktop Linux has pushed over the last decade. As a server OS it is fantastic at what it does, though, provided the applications you want to use are available.
What does this have to do with the XZ tools issue? Nothing. FreeBSD used the same code as everyone else, and had to do the same audits to determine whether it affected them or not.
Torvalds cares only about the kernel; you can find many citations easily. And having 15 (as a figure of speech) different compression methods to choose from, instead of one that would be properly maintained by a core team and others, as with many, many other subsystems: compilers, musl vs. glibc, 15 different bootloaders, 15 different init systems, 15 different package managers, non-FHS-respecting systems like NixOS, a dead LSB (Linux Standard Base) attempt... that is not exactly what one would call "non-chaotic".
Have you looked into the kernel lately (decades)? The stuff that goes there every month is not for your PC or mine. 99% of servers worldwide are running Linux and our PCs on x86-64 with some petty CPU/GPU/RAM for office applications, multimedia and games are 2-3% of the total PC market.
Last edited by sudoer on 30 March 2024 at 10:21 pm UTC
Quoting: sudoer
Windows 95 was the continuation of 3.11 (a tiling window-manager) on top of MS-DOS, which was developed strictly with the PC in mind: single-user, single-tasking and not network-aware.

I don't want to argue about off-topic issues in article comments, but what the heck have you been smoking and where can I get some? Come on dude. Nobody here likes M$ or Windoze but let's stick with facts.
Quoting: sudoer
So you are saying they should use the compromised binary tarballs instead? You said it twice already.

I'm not. In the interest of caution, I downgraded to an earlier version of xz. I would have felt more reassured had Arch done the same, as every other affected distribution has done. I overstated that in my initial comment, I admit. Hopefully it turns out there was no need for Arch to go further back.
What worries me is that other parts of xz may have been sabotaged. Lasse recently reverted a commit that disabled the Landlock sandbox, for example, but that one is benign: https://git.tukaani.org/?p=xz.git;a=commitdiff;h=f9cf4c05edd14dedfe63833f8ccbe41b55823b00
Quoting: sudoer
I'm not sure you understand the situation and are basically copy-pasting stuff you find elsewhere here pretending to know a thing.

I'm doing my best to understand the situation. I'm not a security researcher, packager, or developer; just a user. Downgrading to an earlier version of xz sounds like the most reasonable thing to do based on what Nix and Gentoo are doing.
https://bbs.archlinux.org/viewtopic.php?pid=2160841#p2160841
Quoting: pleasereadthemanual
Downgrading to an earlier version of xz sounds like the most reasonable thing to do based on what Nix and Gentoo are doing.

Thinking about it some more, this could also be a bad idea if you downgrade to previous Arch packages. They were built from tarballs signed by the maintainer, which could be hiding more surprises. You would want to make doubly sure that the release you are using was based on tarballs signed by Lasse and not Jia Tan.
At least with the 5.6.1-2 release, you can be assured that Arch is building the package directly from git, with no binary surprises in the tarballs.
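For anyone unfamiliar with how that looks in practice, a hypothetical PKGBUILD fragment sketches the difference (the URLs and variable values here are illustrative; the real Arch packaging may differ):

```shell
# Hypothetical PKGBUILD fragment -- illustrative only.
pkgver=5.6.1

# Tarball-based source (the compromised path), for comparison:
#source=("https://github.com/tukaani-project/xz/releases/download/v${pkgver}/xz-${pkgver}.tar.gz")

# Git-based source: the build is generated from the tagged commit,
# so the tarball-only malicious build step never enters the picture.
source=("git+https://github.com/tukaani-project/xz.git#tag=v${pkgver}")
```

The trade-off, as discussed above, is that building from git only protects against tarball-only additions; anything committed to the repository itself still gets built.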