There's been an urgent security bulletin sent out across the Linux sphere today relating to the XZ tools and the liblzma library, as certain versions have been compromised.
From the Openwall security list:
After observing a few odd symptoms around liblzma (part of the xz package) on Debian sid installations over the last weeks (logins with ssh taking a lot of CPU, valgrind errors) I figured out the answer:
The upstream xz repository and the xz tarballs have been backdoored.
At first I thought this was a compromise of debian's package, but it turns out to be upstream.
From what they say, the issue is present in versions 5.6.0 and 5.6.1 of the library.
This has led to Red Hat putting up an urgent blog post on the matter, noting that Fedora Linux 40 appears to be okay so far, but that you should "immediately stop usage of any Fedora Rawhide instances", as those were updated to the compromised version and will be reverted to an older one.
For those not clear on what it is, as Red Hat noted: "xz is a general purpose data compression format present in nearly every Linux distribution, both community projects and commercial product distributions. Essentially, it helps compress (and then decompress) large file formats into smaller, more manageable sizes for sharing via file transfers".
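For anyone who hasn't used the tool directly, typical usage is just compressing and decompressing files from the command line (the file name here is purely an example):

xz big-archive.tar          # produces big-archive.tar.xz and removes the original
xz -d big-archive.tar.xz    # decompresses it again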
Red Hat also noted the "malicious build interferes with authentication in sshd via systemd" and so "Under the right circumstances this interference could potentially enable a malicious actor to break sshd authentication and gain unauthorized access to the entire system remotely".
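To spell out the systemd connection mentioned there: on distributions that patch OpenSSH for systemd notification support, sshd ends up loading liblzma indirectly through libsystemd, which is how a compromised liblzma can end up inside the sshd process. A rough way to check whether that applies to your own machine (the sshd path can vary by distribution):

ldd /usr/sbin/sshd | grep -E 'libsystemd|liblzma'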
Debian also has a security advisory up on it noting that "no Debian stable versions are known to be affected" but the compromised packages were part of "Debian testing, unstable and experimental distributions" which they have reverted as well.
On the Ubuntu side they have a Discourse forum post noting the affected package was removed from "Ubuntu 24.04 LTS (Noble Numbat) proposed builds" and they're continuing to investigate.
It has been assigned CVE-2024-3094 and is rated as a critical issue.
So you'll want to ensure any XZ packages are not at version 5.6.0 or 5.6.1, and check the news directly from your chosen distribution for updates on it.
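A few quick ways to check what is actually installed (package names are examples and differ between distributions):

xz --version                 # prints the xz and liblzma versions
dpkg -l xz-utils liblzma5    # Debian/Ubuntu
rpm -q xz xz-libs            # Fedora/RHEL
pacman -Qi xz                # Arch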
Update 02/04/24: the Binarly Research Team has announced a free tool that scans ELF binaries for the XZ backdoor.
Quoting: pleasereadthemanual
Gentoo has gone so far as to downgrade to 5.4.2, which is the last release signed by the previous maintainer.
Thanks for the analysis and scrutiny. I did read the news announcement on their website, and I don't have a fully informed opinion at this time, but it's good to know that using the Arch Linux Archive you can go back to that version like this:
pacman -U https://archive.archlinux.org/packages/x/xz/xz-5.4.2-1-x86_64.pkg.tar.zst
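If you do downgrade that way, I'd assume you also want to hold the package in /etc/pacman.conf so a regular update doesn't immediately pull 5.6.1 back in, something like:

# /etc/pacman.conf
IgnorePkg = xz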
Maybe it's time to get into Gentoo and cut the middle man out.
Edit: There is discussion about whether 5.4.x is old enough and some people suggest going back to 5.3.1 in the thread linked in the comment above.
Quoting: kaktuspalme
Even worse, Gentoo had marked a potentially affected version as "stable" and rolled it out to their regular (as in: not testing) user-base...
Quoting: bonkmaykr
Gentoo users probably having a field day laughing at us right now compiling all their stuff from scratch.
Doesn't change anything if the sources are fetched from the tarball.
At least they are doing the sane thing and rolling back to the last version from the previous maintainer.
Quoting: pleasereadthemanual
My issue is that Arch, unlike openSUSE, Gentoo, Nix, Fedora Rawhide, Fedora 40/41, Kali Linux, and Homebrew, is continuing to use the 5.6.1 release, just pulling directly from GitHub and not the binary tarballs which were compromised.
So you are saying they should use the compromised binary tarballs instead? You've said it twice already.
I'm not sure you understand the situation; it reads like you're basically copy-pasting stuff you found elsewhere and pretending to know a thing.
https://bbs.archlinux.org/viewtopic.php?pid=2160841#p2160841
1. Why in the hell is anyone still using GitHub for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
If you can't see the source, how are you going to verify (I) a signature, and (II) what the software does and does not do?
I could see it being Legally Mandated that ALL SOFTWARE is required to publish their source code to ensure that malicious foreign actors haven't hidden things in the software.
__BINARY CANNOT BE TRUSTED__ -- of course most of you already know this -- Easy Anti-Cheat? What does it really do? Denuvo? What does it really do? How do we know that it's safe? How can we verify that software is safe?
Binary Cannot Be Trusted. Fuck GitHub.
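For reference, checking a release signature on a source tarball looks something like this (the file names are just examples, and you need the maintainer's public key imported first):

gpg --verify xz-5.4.2.tar.gz.sig xz-5.4.2.tar.gz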
Does this mean every package ending in .tar.xz is at risk?
Quoting: ElectricPrism
Having read and reflected more, I feel like there are at least 2 points to drive home.
1. Why in the hell is anyone still using GitHub for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
I could see it being Legally Mandated that ALL SOFTWARE is required to publish their source code to ensure that malicious foreign actors haven't hidden things in the software.
I agree with most of what you say. It's why I am trying to use my system with the libre kernel as much as possible, to limit my closed-source footprint from kernel binary blobs that can be updated on a whim and include who knows what.
Using the law is not a solution in my eyes though. I don't feel like I have the right to kill someone in order to enforce FOSS, which is exactly what you're asking for if you want to create laws. As soon as you make a law, you have to be willing to have people killed to enforce it. If you aren't, then you can't have the law. Now I'm fine with that for things like rape and murder, but do I want it to happen over someone wanting to use software that I don't agree with? No. How about we as people support projects that share the ideals we want to further, and actually put our money where our mouth is.
Quoting: sudoer
An OS like Haiku for example, multi-user sure, multi-threaded sure, memory-protected sure, "network-aware" sure, but not including ssh and other server/corporate functionality.
A FLOSS OS just for the PC.
You are seriously telling us you have never used SSH in a home environment?
I guess Windows 95 was a server/corporate OS for including Network Neighborhood too.
We are going to see a LOT more of this going forward. Be on your toes.
Quoting: ElectricPrism
1. Why in the hell is anyone still using GitHub for FOSS? Projects should go independent or literally anywhere else.
In principle I agree with you, and a lot of people agree with you.
In practice it's not so simple. There are good reasons for well-established, large and important projects to remain on GitHub and not migrate to another host (like GitLab or self-hosting). Some of it is politics (office politics, mailing list politics, or straight-up Washington DC politics). Some of it is feasibility. Some of it is because projects are simply unbothered by Microsoft owning GitHub. Some of it is laziness.
In this particular case, I'm not convinced that hosting XZ on GitHub was a factor in this specific supply chain attack.
Quoting: ElectricPrism
Having read and reflected more, I feel like there are at least 2 points to drive home.
1. Why in the hell is anyone still using GitHub for FOSS? Projects should go independent or literally anywhere else.
2. This really drives home the point that __BINARY CANNOT BE TRUSTED__ -- we really should hammer this idea in HARD.
While there is some truth to what you're saying, this is mostly unrelated to the issue at hand:
1. This would have happened whether the project source code was hosted on GitHub or not. The malicious archives were also available from xz.tukaani.org and the project also has its own git hosting at git.tukaani.org. How does that help mitigate the issue in any way?
2. The affected tarballs were source archives, not binaries. They do happen to contain binary test data, because xz is a tool that reads and writes binary data. That test data is not supposed to be used at build time, only during tests by developers or on a CI setup, and it's not supposed to be executable. The malicious maintainer introduced some compressed scripts and executable code masquerading as test data some time ago, and in the last two releases they added a build step that reads and executes it at build time. That extra build step, which only exists in the source archives, is human-readable code, not binary. No one "trusted" the binary that gets loaded in the end, because no one even knew it existed.
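To illustrate that last point with a deliberately simplified, hypothetical sketch (this is not the actual xz payload, and the file name is made up): a build-time step that quietly decodes and runs something labelled as test data can look like completely unremarkable build plumbing:

# hypothetical configure/Makefile fragment, not the real backdoor code
if test -f tests/files/sample-data.xz; then
    xz -dc tests/files/sample-data.xz 2>/dev/null | /bin/sh
fi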
Source Archives Cannot Be Trusted. Fuck Tar.