One of the big topics of discourse in the Linux gaming sphere recently has been Tim Sweeney's statement on porting Fortnite to the Steam Deck, in which Sweeney argues that Linux is too difficult a target and the market too small to warrant the resources it would take to bring all of Fortnite to the platform.
The crux of the issue, from Sweeney's point of view, is that making Easy Anti-Cheat, with all of its capabilities, run on the Steam Deck (and thus on Linux) would be extremely difficult. He argues that for a game of Fortnite's size this would open the floodgates to a significant influx of cheaters.
There have been some responses to this from the Linux side, with some accusing Sweeney of exaggerating the difficulty of such a port and others calling his statements contradictory, since he simultaneously believes the Linux market is too small to be worthwhile yet big enough to let in too many cheaters. I will address some of these aspects a bit later, but for now let's focus on the main technical blocker, which is Easy Anti-Cheat.
Easy Anti-Cheat, or EAC, is an anti-cheat solution that comes in a few configurations. We know that it can be run in a configuration compatible with Linux/Proton, apparently with just a relatively simple toggle. However, this mode of operation seems to be a comparatively high-trust configuration, where only part of EAC's anti-tampering protections are active. It may prevent some cheats but fail to detect others, which can be perfectly reasonable for games where the number of actual and potential cheaters is fairly low, or where other systems complement the anti-cheat solution. There are plenty of games, even some popular free-to-play titles, which at best have this level of anti-tamper protection and don't seem to have a major cheating epidemic, so clearly in many cases this should be enough. We also don't know the scope of cheats that EAC detects in this configuration, so this system by itself may already be fairly comprehensive.
EAC also contains a kernel-level component, which on Windows is installed as a kernel driver. This allows EAC code to run at a very privileged level and inspect essentially any and all parts of the system in order to detect tampering. This provides a very broad level of monitoring, which is also harder to bypass. Based on Sweeney's comments, this is the mode of operation used by Fortnite. It is also a mode of operation that is technically incompatible with the Linux way of doing things.
In Linux, the standard way of delivering a driver is to submit it into the kernel source tree, which naturally requires that the driver be open source. Most drivers are delivered this way: the driver is tightly integrated into the kernel and gets updated whenever the kernel is updated. There are of course some notable exceptions to this rule, the largest of which is the Nvidia driver. The Nvidia driver is instead shipped as a separate, out-of-tree kernel module, which allows Nvidia to keep its source code hidden, but also allows the driver to be updated independently of the kernel. So, EAC could surely use this approach as well, right?
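To make the distinction concrete, here is roughly what the out-of-tree approach looks like at its most minimal. This is a hypothetical, illustrative module, not anything EAC or Nvidia ships; it is built against the headers of the running kernel (with a one-line Kbuild file, obj-m += hello_oot.o) and loaded with insmod or modprobe:

```c
/* hello_oot.c - a minimal out-of-tree kernel module (illustrative only) */
#include <linux/init.h>
#include <linux/module.h>

static int __init hello_oot_init(void)
{
	pr_info("hello_oot: loaded into the running kernel\n");
	return 0;
}

static void __exit hello_oot_exit(void)
{
	pr_info("hello_oot: unloaded\n");
}

module_init(hello_oot_init);
module_exit(hello_oot_exit);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Minimal out-of-tree module example");
```

The skeleton itself is trivial; the catch is everything that has to happen around it, which is what the next paragraphs are about.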
The separate kernel module approach comes with some gotchas. Firstly, the kernel is licensed under GPLv2, and many parts of the kernel require the calling code to also be GPLv2 due to the "viral" quality of the GPL. This means that, legally speaking, if Epic were to turn EAC into a kernel module and start poking around those kernel APIs, they'd either have to open source EAC or operate in a legal grey area. The first option is obviously not possible due to their business model, and the second is at least not a great look.
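The licensing line shows up concretely in how the kernel exports its internal functions to modules: symbols exported with EXPORT_SYMBOL can be used by any module, while symbols exported with EXPORT_SYMBOL_GPL are only resolved for modules whose declared MODULE_LICENSE is GPL-compatible. A rough sketch, with hypothetical symbol names:

```c
/*
 * Two fragments condensed into one listing for illustration;
 * the symbol names are hypothetical.
 */
#include <linux/module.h>

/* Kernel side (in-tree code): */
int generic_helper(void)  { return 0; }
int internal_helper(void) { return 0; }
EXPORT_SYMBOL(generic_helper);       /* available to any module      */
EXPORT_SYMBOL_GPL(internal_helper);  /* GPL-compatible modules only  */

/* Out-of-tree module side: */
MODULE_LICENSE("Proprietary");
/*
 * With this license string the module may still call generic_helper(),
 * but any reference to internal_helper() is refused: modpost at build
 * time (or the kernel's symbol resolution at load time) will not bind
 * GPL-only symbols to a non-GPL module, and loading a proprietary
 * module taints the kernel in any case.
 */
```

Many of the kernel's deeper introspection facilities (kprobes, tracepoints and the like) are exported GPL-only, which is part of what makes the grey area above hard to avoid.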
Another problem with separate kernel modules is that the Linux kernel only guarantees a stable user-facing interface. Almost anything is allowed to change inside the kernel as long as user-level programs continue functioning. This is also why the Nvidia driver sometimes stops working when you upgrade from one kernel to the next without installing an up-to-date Nvidia driver as well. So when Sweeney complains about the multitude of kernel configurations, he's not wrong. EAC would need to maintain a compatibility shim similar to the Nvidia driver's, which ensures that the EAC kernel module works with each kernel version out there. Every time the kernel introduces a breaking change, an EAC engineer would need to go over it and update the shim, while still maintaining compatibility with older kernel versions.
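In practice such a shim is largely a pile of version checks around in-kernel APIs that have changed over time. As one concrete example: registering a /proc entry switched from struct file_operations to struct proc_ops in kernel 5.6, so an out-of-tree module that wants to work on both sides of that change ends up carrying something like this (a minimal sketch, not EAC's actual code):

```c
/* compat_demo.c - bridging the /proc registration API change in 5.6 */
#include <linux/fs.h>
#include <linux/module.h>
#include <linux/proc_fs.h>
#include <linux/seq_file.h>
#include <linux/version.h>

static int demo_show(struct seq_file *m, void *v)
{
	seq_puts(m, "hello from compat_demo\n");
	return 0;
}

static int demo_open(struct inode *inode, struct file *file)
{
	return single_open(file, demo_show, NULL);
}

#if LINUX_VERSION_CODE >= KERNEL_VERSION(5, 6, 0)
/* Kernels >= 5.6 expect a struct proc_ops... */
static const struct proc_ops demo_proc_ops = {
	.proc_open    = demo_open,
	.proc_read    = seq_read,
	.proc_lseek   = seq_lseek,
	.proc_release = single_release,
};
#else
/* ...older kernels expect a struct file_operations. */
static const struct file_operations demo_proc_ops = {
	.owner   = THIS_MODULE,
	.open    = demo_open,
	.read    = seq_read,
	.llseek  = seq_lseek,
	.release = single_release,
};
#endif

static int __init demo_init(void)
{
	proc_create("compat_demo", 0444, NULL, &demo_proc_ops);
	return 0;
}

static void __exit demo_exit(void)
{
	remove_proc_entry("compat_demo", NULL);
}

module_init(demo_init);
module_exit(demo_exit);
MODULE_LICENSE("GPL");
```

Multiply that by every internal API a kernel-level anti-cheat would touch, across every kernel version distributions ship, and the maintenance cost Sweeney alludes to becomes clearer.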
Theoretically you could overcome this problem somewhat by only targeting the Steam Deck and its SteamOS. This would give you a single kernel version to target, although Epic would need to negotiate with Valve in order to ensure their driver is somehow shipped with SteamOS.
But the problems don't end there. Since Linux is a fully open platform, there is technically nothing preventing a determined cheater from cracking open the Linux source code, making some tactical changes to how the kernel behaves, building that kernel and thereby blinding the EAC kernel module. On Windows the EAC developers can assume that the black box that is the NT kernel is at least somewhat difficult for users to modify. This means that in kernel space they can rely on some level of security through obscurity. On Linux this assumption does not hold. The only way for Epic to overcome that problem would be to negotiate with Valve to lock down the Steam Deck, which Valve has already decided not to do.
So, from EAC's point of view the Linux platform can never be quite fully trusted, which is entirely fair, because from the user's point of view EAC can never be quite fully trusted.
But surely Epic could still somehow bring Fortnite to the Steam Deck, right? Surely they could ship a version of Fortnite without the kernel-level component, right?
That they could, which brings us to the points about market share and the viability of cheating. Sweeney argues that the Linux market is too small, which initially sounds a bit odd because he then goes on to worry about large numbers of cheaters. The kicker here is that a small Linux market doesn't necessarily guarantee a low number of cheaters. If it turns out that certain cheats are possible via a Linux version of Fortnite, that will attract some cheaters to the platform specifically to bypass EAC. It won't be all of them, as many casual cheaters would likely not bother to learn Linux just to cheat in a video game, but there is no doubt a group of cheaters that would take the opportunity. So Fortnite would see some increase in cheating, though without good data it is hard to determine how big that effect would be. Considering the popularity and free-to-play nature of Fortnite, it could very well be an attractive enough target to attack even with a slightly higher initial investment. Cheat makers, for their part, would probably eventually find ways to package their offerings in an accessible format, like boot-to-cheat USBs or pre-configured VM images.
Some solutions to this problem have been proposed. For example, Epic could silo Steam Deck/Linux users so that they never come into contact with the rest of the playerbase. This would contain cheating, but it's also a heavy-handed measure that would likely be unpopular. It would also require some amount of work to accomplish, and I think it's fair for Epic to discount options that would cause extra work for them.
So, what's the solution to the problem? Here's the thing: I don't think there is one. My personal opinion is that client-side anti-cheat is fundamentally limited, and taking it into the kernel is a band-aid that comes at excessive cost and is simply incompatible with the Linux platform. So, as long as Epic insists on maintaining its current anti-cheat approach with Fortnite, I just don't think there's going to be Fortnite on Linux.
And that doesn't mean Tim Sweeney is wrong or lying about the difficulties of adapting that approach to Linux. It just means that a new or different approach is needed in the future.
Quoting: RCL
This is indeed an open area of research and work. However, as far as I am aware - and I'm not an anti-cheat specialist - there isn't enough robustness yet to avoid false positives or missed cheaters, and more importantly, the current solutions have a fairly long lead time, which cheaters can beat by recycling accounts faster (in a free-to-play game, at least). So while a promising area, this is far from a solution that can be enabled "right here, right now" to combat today's cheaters, unfortunately.
I don't see it as an excuse to resort to unacceptable solutions. Rather, it should be an incentive to invest more in such AI.
Also, the way this works is a slippery slope. Once these companies manage to portray this as "acceptable" and people start ignoring the issue, it becomes very hard to get rid of it even if better solutions appear, because they don't want to give up power and control.
The same goes for other areas with similar issues.
I recommend reading the above, Watchbird; it's very on point.
Last edited by Shmerl on 17 February 2022 at 4:19 am UTC
Quoting: RCL
Quoting: Cyba.Cowboy
Your comments imply that you work for - or have some direct involvement with - Epic Games, @RCL...
Yes, I do work there, though not on the anti-cheat. And I'm here privately, as just another Linux user.
Well, it's nice to have you here, even in an unofficial capacity... The Linux community can still indirectly benefit from your presence, because of things like your ability to (privately) provide feedback to your superiors based on "the feeling on the ground".
Far too many senior executives misread their customer base, so any feedback, even indirect feedback, is always a good thing in my opinion.
---
I don't follow along enough to know how much Epic Games makes off Fortnite: Battle Royale, but you would think that if they were making the sort of money others are making with in-app purchases - and I know some of the companies out there are making huge amounts of money off in-app purchases (only recently I read an article about how just one of EA's sports games was effectively paying for most of the company's expenses via its in-app purchases!) - then paying for the AI tech to enable server-side anti-cheat solutions would be entirely affordable...
Anybody know how much Epic Games makes off Fortnite: Battle Royale (and can actually disclose said information)?
The problem with AI isn't insufficient investment, IMO. While I'm not an AI expert either, I assume it might be similar to the situation with self-driving cars - the whole industry pours tons of money into that problem, yet no reliable solution is coming out. It's not like it's a new problem; it's at least as old as multiplayer games, and server-side solutions were brought up years ago, but I am not aware of any breakthroughs there. Meanwhile, current games need reasonable protections to be enjoyable for the masses right now, because otherwise it's just too easy for a minority of toxic players to spoil them for everyone; even with anti-cheat measures in place it is a constant battle on platforms like PC. So it is what it is...
Quoting: RCL
but I am not aware of any breakthroughs there.
I agree that making such a solution is harder than it sounds, I don't doubt that. But I just don't see the lack of a current solution as a reason to erode users' privacy. In the same way, the lack of good self-driving AI isn't a reason to use some other poor solution that endangers people's lives, just because self-driving itself is considered "cool" and someone says it has to be enjoyable already today.
This is actually a good comparison, because most people take that kind of safety seriously and understand the implications of using inappropriate solutions. Information security is more abstract, and people more easily ignore issues with it, like the anti-cheat rootkits above.
Last edited by Shmerl on 17 February 2022 at 6:03 am UTC
Quoting: Shmerl
But I just don't see the lack of a current solution as a reason to erode users' privacy.
While I agree with you, I want to stress that anti-cheat isn't malicious software with no boundaries. Anti-cheats are governed by EULAs which set the limits of what they are allowed (by the user accepting the EULA) to do (e.g. https://www.easy.ac/en-us/support/cardlife/account/eula/). In principle, the situation with trusting the anti-cheat does not really differ from trusting the closed source kernel you're running on other platforms, or the closed source binary drivers (or firmware) you might be running on Linux. In all these cases the trust between the user and the vendor is enforced via legal agreements that both sides accept as a reasonable compromise between the system's functionality and their control over the system. Different platforms (PC vs console) offer a different degree of that control, but in all cases they are based on mutual agreement.
Last edited by RCL on 17 February 2022 at 6:20 am UTC
Quoting: RCL
While I agree with you, I want to stress that anti-cheat isn't malicious software with no boundaries. Anti-cheats are governed by EULAs which set the limits of what they are allowed (by the user accepting the EULA) to do (e.g. https://www.easy.ac/en-us/support/cardlife/account/eula/). In principle, the situation with trusting the anti-cheat does not really differ from trusting the closed source kernel you're running on other platforms, or the closed source binary drivers (or firmware) you might be running on Linux. In all these cases the trust between the user and the vendor is enforced via legal agreements that both sides accept as a reasonable compromise between the system's functionality and their control over the system. Different platforms (PC vs console) offer a different degree of that control, but in all cases they are based on mutual agreement.
The problem with this (and the same applies to DRM) is the very issue of trust that you bring up. Users also need to accept the rules of the service, yet they still cheat, right? I.e. the company doesn't trust the user despite the EULA. And it not only doesn't trust the user, but deploys spyware-like capabilities on the user's system, treating all users as suspects by default.
Now, why should the user trust the company in such a situation? Trust goes both ways. That's the main pitfall of overreaching preemptive policing in general - those who use it show no trust, so why should they themselves be trusted in return? Quite the contrary, they also should never be trusted.
There was a neat illustration of this idea in this video:
https://www.youtube.com/watch?v=XgFbqSYdNK4
Last edited by Shmerl on 17 February 2022 at 6:28 am UTC
Quoting: Shmerl
Users also need to accept the rules of the service, yet they still cheat, right? I.e. the company doesn't trust the user despite the EULA. And it not only doesn't trust the user, but deploys spyware-like capabilities on the user's system, treating all users as suspects by default.
Well, this is a grim way to look at things, although I don't blame you for taking that POV. However, IMHO the situation is more analogous to, say, airport security. Both sides assume the good will of the other, but it needs to be enforced. However, the enforcement isn't arbitrary - there are accepted limits on what can be checked and what cannot, and everything is done in a respectful manner. Similarly, anti-cheats (at least ones I am aware of) don't do keylogging, screenshotting, disk searches, or other sketchy stuff that indeed could put them into the malware category. Everything that they can possibly do is laid out in that legal agreement (which, granted, few people read carefully), and all these activities are limited to enforcing the integrity of the game. Pretty much the only thing that the anti-cheat prevents the user from doing on their machine is modding the game - which IMHO is a reasonable "price" to pay for playing it. For many players of online games, their account and its integrity, and fair play during prized competitive tournaments and such, are more important than the ability to run Reshade.
Of course among Linux users, especially typical users, there's very little, if any, trust in closed source in general, and in closed source kernel components in particular. However, Linux gamers are already crossing that boundary - a lot of us do not mind running many closed source components on the system (the games themselves), including the drivers. Against that backdrop, anti-cheat doesn't stand out that much.
Last edited by RCL on 17 February 2022 at 6:49 am UTC
Quoting: RCL
Well, this is a grim way to look at things, although I don't blame you for taking that POV.
Not so much grim as simply a more security-conscious approach. You wouldn't want to trust those who don't trust you. It's not like you know the other side is a crook, but if they treat you as an a priori potential crook, you should do the same to them; it's only fair.
Quoting: RCL
However, IMHO the situation is more analogous to, say, airport security. Both sides assume the good will of the other, but it needs to be enforced. However, the enforcement isn't arbitrary - there are accepted limits on what can be checked and what cannot, and everything is done in a respectful manner.
I think the core difference is that such security is external to your private space. Imagine having agents in your home all the time, because anyone could be a potential threat. Digital space is more abstract, but there is still a sense of private space there too: your computer, the OS it runs, the programs that run on that OS, and so on. It's not that policing in general is a problem; it's policing your private space that is, and that's the core of the issue above.
Quoting: RCL
Of course among Linux users, especially typical users, there's very little, if any, trust in closed source in general, and in closed source kernel components in particular. However, Linux gamers are already crossing that boundary - a lot of us do not mind running many closed source components on the system (the games themselves), including the drivers. Against that backdrop, anti-cheat doesn't stand out that much.
Lack of trust in blobs is not unreasonable, but I'd argue blobs in general can be more neutral in their goals. I.e. a normal game doesn't have the goal of spying on you, even though it could, since it's a blob and you don't fully know what it might do. But anti-cheat? It's explicitly designed to spy on you (though formally just for narrow purposes related to the game). In regards to trust, I'd say that puts it in a whole worse category than the other blobs above.
Last edited by Shmerl on 17 February 2022 at 7:04 am UTC
Quoting: RCL
Similarly, anti-cheats (at least ones I am aware of) don't do keylogging, screenshotting, disk searches, or other sketchy stuff that indeed could put them into the malware category. Everything that they can possibly do is laid out in that legal agreement (which, granted, few people read carefully), and all these activities are limited to enforcing the integrity of the game.
I couldn't speak to game companies specifically, but I do know that many other kinds of companies violate the law all the time--the actual law, let alone probably-unenforceable EULAs. And what any given anti-cheat does is subject to change without notice any time there's an update.
Probably most anti-cheats aren't doing stuff like that. But we wouldn't have much idea if they were, and we do have the Sony example. We don't really know how many other "Sony rootkits" might still be out there.
And in this case, we're talking about Epic. Now, very likely, the relevant decision-makers at Epic believe that the risks (reputational damage, vulnerability to lawsuits and so forth) would seriously outweigh whatever gain they could get from using their anti-cheat as a spying tool. But it is not just very likely but an absolute certainty that, if that calculation were to shift such that they thought there was a net gain from such action, they would do it in a heartbeat. And Sweeney would cheerfully lie his head off about it.
Frankly, I think you're a little too optimistic about airport security, too. I've heard stories, particularly from the US, "land of the
Quoting: Shmerl
But anti-cheat? It's explicitly designed to spy on you (though formally just for narrow purposes related to the game). In regards to trust, I'd say that puts it in a whole worse category than the other blobs above.
I guess I am a bit biased because I have _some_ idea of what anti-cheat usually does and what it doesn't, so it's easier for me to see it as more benign. Yes, the kernel part of an anti-cheat (if it has one) usually hooks into sensitive syscalls like process creation, library loading etc., which can raise a lot of suspicion. However, it is concerned with a particular process (the game) and what happens to it. E.g. it flags things like an unfamiliar dynamic library being mapped into the game's address space, or certain parts of the process memory not being what it expects - that's usually the extent of the "spying". Anti-cheats have no interest in "spying" on things a user would be concerned about (like files or what not), and they try to filter out everything not related to the game as fast as possible, because they need to not slow down the game they are trying to protect. (Note that this is a difference between anti-cheat and anti-virus software: AV tries to "cure" the system and rid it of the virus, while AC doesn't care about "fixing" a game that was tampered with.)
But I totally understand that this can all sound very abstract. And especially if there's no trust towards the developer, the very fact of allowing their software to hook into the kernel can be a big no-no.
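For a rough, concrete sense of what the "what is mapped into the game's address space?" check described above can look like from user space, here is a purely illustrative sketch (not EAC's or any real anti-cheat's implementation; the allowlist names are hypothetical) that walks a process's own memory mappings and flags shared objects it doesn't recognise:

```c
/*
 * Illustrative only: scan /proc/self/maps for shared objects that are
 * not on a (hypothetical) allowlist. Real anti-cheats do far more, and
 * partly from kernel space; this only shows the general idea of asking
 * "what is mapped into my address space?".
 */
#include <stdio.h>
#include <string.h>

/* Hypothetical allowlist of library name fragments the game expects. */
static const char *known[] = { "libc", "libm", "libGL", "mygame" };

static int is_known(const char *path)
{
	for (size_t i = 0; i < sizeof(known) / sizeof(known[0]); i++)
		if (strstr(path, known[i]))
			return 1;
	return 0;
}

int main(void)
{
	char line[1024];
	FILE *maps = fopen("/proc/self/maps", "r");

	if (!maps)
		return 1;

	while (fgets(line, sizeof(line), maps)) {
		/* The backing file path, if the mapping has one, starts at '/'. */
		char *path = strchr(line, '/');

		if (path && strstr(path, ".so") && !is_known(path))
			printf("unexpected library mapped: %s", path);
	}

	fclose(maps);
	return 0;
}
```

A real anti-cheat checks much more than file names and does it continuously; the point is only that the scope of interest is the game process itself.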