As suggested by stan, a file-system with transparent compression should be a much easier, and overall better, option for on-the-fly game decompression. You can get this through a fuse-based file-system like fusecompress, or natively if you have a btrfs partition.
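For anyone who wants to go the btrfs route, transparent compression is basically just a mount option (or a per-directory attribute). A minimal sketch, assuming you already have a btrfs partition; the device and mount point here are obviously placeholders:
# Mount an existing btrfs partition with transparent lzo compression (zlib also works)
mount -o compress=lzo /dev/sdXN /mnt/games
# Alternatively, flag a directory so that new writes to it get compressed
chattr +c /mnt/games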
Original Post:
Some might find this useful, especially if you, like me, enjoy maximizing space usage efficiency.
I've been working on ways to keep the more compressible games in my collection in archives and decompress them on-the-fly when I want to play them, in a way similar to how some emulators treat ROMs in zip files.
Originally I was just making use of the /tmp/ folder, but I figured having to decompress things at least once per session is a bit of an IO waste, especially since I was using an SSD. So I decided to make a cache for it, and I just released it on GitHub:
https://github.com/ntfwc/decompression_cache
I'm currently using it with 9 games on my system, and with an emulator to add archived ISO support. I've been pretty happy with the results, and it could have applications in other, more general tasks as well.
I'll put two example wrapper scripts below, one for a game and one for an emulator:
Cave Story+ Wrapper:
#!/bin/bash
# Path to the archived game and the directory name the archive extracts to
ARCHIVE=/home/ntfwc/Games/CaveStory+/CaveStory+.tar.gz
EXPECTED_OUTPUT_DIR=CaveStory+
# Ask the cache for a decompressed copy; it prints the directory it extracted into
DECOMPRESSED_CONTENT_DIR=$(/home/ntfwc/Programs/cached_decompression/cached_decompress "$ARCHIVE")
cd "$DECOMPRESSED_CONTENT_DIR/$EXPECTED_OUTPUT_DIR" || exit 1
./CaveStory+_64
PCSX Emulator Wrapper:
#!/bin/bash
# Takes the path to a compressed disc image as the first argument
GAME="$1"
# Get the filename without the archive extension
FILENAME=$(basename "$GAME")
OUTPUT_FILE=${FILENAME%.*}
# Ask the cache for a decompressed copy; it prints the directory it extracted into
DECOMPRESSED_CONTENT_DIR=$(/home/ntfwc/Programs/cached_decompression/cached_decompress "$GAME")
OUTPUT_PATH="$DECOMPRESSED_CONTENT_DIR/$OUTPUT_FILE"
pcsx -nogui -cdfile "$OUTPUT_PATH"
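In case it isn't obvious, the emulator wrapper just takes the path to the archived image as its only argument, so you'd launch it with something like the following (the script name and the game path are just placeholders for whatever you saved it as):
./pcsx_wrapper.sh /home/ntfwc/Games/PSX/SomeGame.iso.gz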
Edit: Darn smilies.
Good question. Adding up the uncompressed vs compressed sizes among the 9 games and the 3 ISOs, then subtracting my current cache size setting (1.4 GB), the savings come to 281 MB, or about 7.8%. A bit disappointing. I chose 1.4 GB specifically for the ISOs, so I could have at least 2 in the cache.
A more conservative cache size of 0.7 GB would be at least a 28% savings in my case. The percentage would also be better if I had more items, as that would reduce the significance of the cache overhead. Adding another ISO, for example, could bump the savings up another 200 MB.
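If it helps, the figure above is just: savings = (total uncompressed size) - (total compressed size) - (cache size). A tiny sketch of that calculation with made-up numbers, purely illustrative:
# Made-up sizes in MB, only to show the arithmetic
UNCOMPRESSED_TOTAL=4000
COMPRESSED_TOTAL=2000
CACHE_SIZE=1400
echo $(( UNCOMPRESSED_TOTAL - COMPRESSED_TOTAL - CACHE_SIZE ))  # MB actually saved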
Update:
Adding a few more PlayStation ISOs got it up to 15.7%, at about 1 GB saved. So it's definitely something best used at scale. I should note that, for the vast majority of things, I'm using gzip, as it is quite fast.
lzop is impressively fast. And yeah, transparent compression at the file-system level would be a much better solution, really. Personally, I haven't done anything with btrfs yet, since I've heard it's still considered experimental and that its performance is a bit worse than ext4's. There are some fuse-based file systems that do transparent compression though, so I think I'll try one of those out.
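If anyone wants to check the speed difference on their own data, timing both compressors on the same file is a quick-and-dirty comparison (the filename is a placeholder):
time gzip -c SomeGame.iso > /dev/null
time lzop -c SomeGame.iso > /dev/null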
I could still see a decompression cache finding more general use on low-power machines, where real-time decompression is a bit too much for the archive format of choice and where only a small percentage of the archives is needed most of the time. But for modern machines you would play games on, a transparently compressed file-system is going to be more convenient, and it probably won't take a noticeable hit for it. There could still be the rare case where you can't get permission to use FUSE on a shared computer and want to fit a bunch of games into a limited space. Regardless, I had fun programming the cache, so no regrets.
While trying fusecompress out, I noticed rather long pauses at certain points, where it is apparently struggling to keep up with the emulator seeking around the disk image. fusecompress operates on whole files rather than blocks, so random access is unfortunately less efficient there. It's something to keep in mind if you run into similar behaviour.