DISCLAIMER
This is exclusively the opinions of the article's author - it does not reflect the opinions of other game developers, end users, Linux users, the owners and writers for GamingOnLinux, and so on unless explicitly stated otherwise by the individual person(s).
About the Author
Hi there! I'm flibit. I work on a lot of games... and I mean a LOT:
- Over 70 PC and console ports shipped in 10 years as a one-man team
- Active maintainer of over 150 SKUs across said 70 titles (as in I still update them today)
- Lead developer of a middleware with 9 platforms actively supported, including consoles, with more on the way
- Lead developer of FAudio, which is why Proton has working audio for a huge percentage of the catalog
- Co-maintainer of Linux/Switch/Xbox/Stadia versions of SDL, which has WAY more platforms that all have to be able to pass cert at any moment (you might recognize me from the recent Wayland work in SDL)
In short: I'm responsible for making sure that new versions of games are performant and reliable, and this means passing many of these versions through certification programs. A new Steam Machine directly from Valve was an exciting prospect, but it hasn't come without its problems. Today we're going to talk about just one of them:
Introduction
I've been infamously critical of the Steam Deck from day one. From just about every angle imaginable you could be forgiven for thinking I entirely disagree with the way the Deck's software has been developed, advertised, presented to developers, and so on. I could go on for quite a while about all of these things.
Much of it is about the software and there are a number of reasons why, the main two being:
- The software side of things is where I'm most comfortable giving my opinions
- A lot about the Deck is genuinely good from a software guy's perspective (the hardware is great, the price makes sense, and with people so desperate for new hardware I get why they were so eager to get it out the door at any and all costs)
However, another reason is that we as developers didn't have a great look at how our software would interact with the Deck ecosystem until the device was pretty much already out. Because of this, it wasn't a huge surprise to see that there were issues regarding the Deck Verified program, across many different kinds of titles on Steam.
While not surprising, it is still disappointing. With the Deck becoming so influential so quickly, this has had a profoundly negative impact on some developers, particularly small developers, who in many cases put in exceptional amounts of work to prepare for the inevitable second generation of Steam Machines, only to be slapped in the face at the 11th hour with results that anyone with a cursory knowledge of the game would know are invalid. These include, but are not limited to, native games inexplicably labelled as Proton-by-default, native games marked completely unsupported because only the Windows binaries were tested without even trying the native ones, and a whole mess of things in between depending on what the cert process was like that day. This has caused problems at the other end of the spectrum as well, with reviewers regularly running into popular AAA games that were mysteriously labelled as Verified when they didn't actually work in reality. This might sound really harsh, but there's something important to consider here:
While not intentionally malicious, it's a good example of how one centralized superpower's decision making can affect an entire industry, whether they were part of the intended equation or not. You might recognize this narrative from the release of the original Windows Store! Given that WinRT and Windows Store were the catalysts for basically everything you're seeing on this website today this is not the comparison we want to be making, so how do we correct this?
Before we dive in we must address a possible conflict of interest, so let me make one thing absolutely clear: I do not in any way expect my titles to be given priority just because they're native, but I fully expect my work to be given the chance it deserves, and any organization that intends to insert itself between myself and the customers should expect to put as much care into the product as the authors have. As with games in general, a select few AAA titles do not speak for an entire library. I and many other developers are doing our best to create an out-of-the-box catalog that directly targets the Deck platform and operating system with as few layers as possible, producing performant, energy-efficient software whose design is not dictated by a totally unrelated megacorporation, which is allegedly the long-term plan for this in the first place.
Developers, and independent developers in particular, are just as interested in expanding the PC beyond Windows as Valve is, and as early investors (some investing a decade or more in advance!) the Steam Deck's launch has not met our expectations. It is in everyone's interest to avoid another Windows 8 situation, and that means trying to fix this quickly and effectively for all developers, not just the highest-profile titles. Since launch was *checks notes* less than a month ago, there's still a chance to make this work! Even better, despite all the issues, Valve have continued to develop the platform in a largely transparent way, at least after the hardware launched: In addition to having their developer documentation be available to the public, they've also continued publishing source code for the Deck specifically, including the recent release of the devkit tools. This was a significant factor in choosing to make this critique public; Steam Deck is a rare opportunity for developers to be transparent about how these processes work with end users. In addition to building a positive relationship with customers in a way that can't be replicated outside of the PC, we can also make end users far more informed of the reality of game development on Steam, and the Steam Deck in particular. Informed customers are happy customers, even if the information isn't always pleasant.
With that in mind, let's dig into the Deck Verified program, how it relates to traditional certification programs, and what can be done to clean things up and get everyone back on the same page. If you want a good starting point as an end user, Valve's documentation for this is public and you should definitely read it.
What is Console Certification?
For those unaware, console certification (known usually as "cert", sometimes other names like "lotcheck") is when you submit a (hopefully) final build to the console vendor so that they can run a series of extremely strict checks to make sure your game is stable and has a fully working user experience. With devices getting more sophisticated and operating systems getting simpler for developers this has gotten slightly easier, but by no means is it easy period: Ask any Nintendo licensee what it's like to make controller support fully robust and you'll most likely get a look of utter defeat and bottomless trauma. Look to the Nintendo catalog, however, and you'll see why these procedures are in place.
Going through cert is a lot like paying your taxes. You know you have to do it, nobody really likes doing it, we can complain about how it might get convoluted at times, but at the end of the day it's an essential part of what makes everything work even if it doesn't feel like it half the time. And you sure as hell would not want the review process for this paperwork to be rushed and later revealed to be part of some temporary experiment. Oh, and before anyone makes the joke, yes rich people casually bypass it constantly.
In contrast, the games business likes everything to be fast, and I mean fast. Customers, investors, and industry insiders alike are often guilty of (perhaps unknowingly) pressuring studios to get the product out now and then finish it later. Cert happens to be the exact one thing that people acknowledge is a big pile of red tape and bureaucracy, and it is a tremendous waste to not capitalize on this! You literally have full carte blanche to take your time and be very thoughtful and meticulous about how you want things to work, and it is extremely important that you take advantage of this.
There had to have been a lot of pressure to showcase big games working out of the box, but there's a big difference between a recommended showcase and building a trustworthy bureaucratic process that people can rely on in the long term. Mix these two up and you're just building up a huge amount of debt that, if not paid back very quickly, will leave players and developers wondering why you even have a cert process at all, which is an attitude that's anything but trivial to turn around. It's hard to build a reliable Trust System(TM), but it's much harder to build one right after making a system that turned out not to be trustworthy. Correcting this quickly will be really important as the rest of the platform begins to stabilize and more people have a Deck in their hands (author's note: there is absolutely no way I'm not wording it like this for all eternity).
I've seen a lot of platforms and a lot of cert submissions from various companies, from the oldest in the industry to companies that only became major platforms within the last few years. Some vendors do things better, some do things worse, but they all have a lot in common, and in general these would be great things to copy for your own program:
0: Each SKU gets its own review
Look, I know it seems obnoxious for me to have this in the article, but when you are testing two different SKUs, each one should be tested separately. When you look at how things are executed now, you could be forgiven for alleging that Valve tried to get by with Wine-only, only to fail horribly and get caught immediately, but regardless of the real story we can at least be sure that native is part of the process today, which is great!
Problem is, from what I and other developers have observed, the results are still a mishmash with no clarity as to which notes apply to which build. Specific devs working on a game might know that the Windows build has patented codecs in it while the native one does not, but a manager looking at these results probably won't know what this means. Two tables for two SKUs is not a huge ask, especially if you aren't going to let developers tell you what you should test and deploy. (As an aside: People who phoned in their Linux versions will choose Proton from a drop-down themselves, saving you a lot of time coming up with wacky schemes to try and autodetect... even those who did a pretty good job will still do this work for you, so if there's not going to be any effort to improve the native versions at all, why not let them make that choice directly?)
1: Actual notes with test results
Currently when you get test results, you get a series of items with PASS/FAIL on them, with some automated text when there's a FAIL. This is very typical for cert processes, however the automated text is not helpful at all, and that's assuming there are any notes to begin with. To emphasize my point, I can even disclose a private test result because there's nothing private about it:
Test failed due to the title not providing external gamepad support for the primary player. We highly encourage providing support for externally connected gamepad to offer the player with the option of playing using the gamepad of their choice. Partial controller support is available but the following actions are not available via the default controller configuration: [Space, Escape]
You can even see where the autogenerated part starts and ends! In this entire paragraph, the game-specific notes are two words long. End users are held to higher standards than this, so why not the cert program?
This is a pretty severe contrast with competitors' processes, where we get reasonably good notes including screenshots, diagnostic information, and sometimes even video! This information is not difficult to produce if you're a human testing a game, especially if these are things you can get from the OS out of the box. (So okay, maybe video is out of range for Steam right now... having played on an Xbox recently I would love for this to change though.)
With good enough notes, you would also be able to find common patterns and make the official test suite more robust, and eventually build an entire conformance test suite for developers to reference before they've even submitted. Relying solely on the existing framework is a one-way ticket to stagnation, which is the opposite of what anyone wants for their product.
This isn't just about being thorough, it's also about clarifying things that can be interpreted in many ways: When a bunch of native games started to get labeled as Proton titles, you can imagine the panic at my office when I had to explain to my own clients that "yes I actually tested on real hardware, yes the Linux version is optimal, no I don't know why Valve is blocking my build, no you should not delete the port" over and over again basically since all results became public. It would have been nice if the experimental test data had been marked as such, even a "this is temporary, we plan to migrate verified titles to native closer to release" would have saved me a lot of trouble and, to put it bluntly, a whole lot of money I'll never see again. Those opportunities and resources could have been used to make more games, and sadly some of these effects are likely irreversible. Preventing these outcomes is important for the future of the platform.
Finally, making the notes public would be useful as well - while some of the technical notes might include things like NDA'd symbols in a stack trace, in all my years of doing console builds I have never at any point run into high-level notes that were sensitive or required to be private, with the exception of spoilers which are trivial to mask (as Steam already does elsewhere). If a game is only unsupported because of codecs, for example, that is useful information for everyone!
2: Running cert behind our backs is the last option, not the first
Again, I get it, Steam's strength is in its back catalog (I even gave a whole presentation at MAGFest about this), but even the older games that are popular still have someone keeping an eye on them, so it should be no problem to let them submit to cert themselves. This one is hard to compare to other vendors because they aren't releasing games before they get certified, but a big part of the cert process is communication between the console vendor and the app vendor. This back-and-forth process is essential in getting good data, for a number of reasons including but not remotely limited to:
- App vendors can get clarification on strange test results
- Console vendors can get clarification on unusual (but not broken) program behavior
- In rare instances, a developer can file a waiver stating that a cert rule may not be strictly relevant (for example, if the control scheme is deliberately unusual and has clear disclaimers on the store page)
But that's basic, basic stuff. The one that I know Valve cares about:
- App vendors can provide feedback on the certification program
You have this entirely new platform, with a slightly stricter ruleset than your existing one, and the data set is unspeakably huge with variety that no program in the world is going to be able to automate. You're on record, multiple times, across multiple decades, as saying that human feedback matters a great deal, and these statements were recorded during a time when everything seems to be getting automated (whether it even makes sense or not). You have a very large subset of games with active developers, who actively target roughly the same platform (natively even!) and would be very proactive if given the chance to be a part of the program, from submitting builds to providing feedback on your program before it's even out. Some of them even had the hardware over half a year before it was available to the public.
Why, why, why was the answer to withhold the submission process to this very day and secretly shotgun blast the catalog with experimental procedures a month before launch? I'm sorry to be so harsh here, but the results have been horrific: It completely ignores the human element of cert (arguably the best part), blocks proactive developers from contributing to the platform, annoys people who are actively watching random, clearly abandoned games get pushed to the front of the queue, and wastes the effort of people who had the working program in their hands from the start and, to put it bluntly again, deserved the spotlight at launch a whole lot more. As a reminder, this isn't about me, there's a whole group of early investors who feel cheated by the current system and these will likely be the experts you will need to have readily available for the alleged long-term plans.
Looking at the cert results, the dates of the tests, and the runtimes being used, I can only assume the timing had something to do with the release of Wine 7, which would be fine for Windows games but completely ignores the breadth of native games that could have been a part of early cert prototypes. I can only hope I'm wrong here, but I can't help but look at the timeline and wonder why people had the hardware for all that time when the paperwork came so late.
Right now a lot of discussion is about the OS and bugs, but my guess is when people call the Deck "rushed" years from now this is what they will be talking about. Sure, the OS bugs are rampant, but they will probably be fixed eventually. As of now I'm not as confident about the cert results and I'm not alone here.
3: Cert results should not be silent and on an unspecified timer
The Deck Verified program's results have the potential to be hugely influential in customers' purchasing decisions, whether they have the hardware or not. The data that comes out of this should be given a lot of care, and more importantly a lot of attention from everyone involved in maintaining the game.
Out of the five products that have been given ratings on Steam that I have partner access for, I have received a single notification for one of them. Developers with fewer than 70 games under their belt have similar ratios as far as I can tell. (So yes, for the past couple of weeks I have had to go game-by-game to manually search for test results for dozens of games, just in case results were sent without a notification. No, the recent per-organization view didn't help, because I'm also in dozens of separate organizations.)
This would be less of a problem if 100% of the testing I'm able to see wasn't being done in secrecy - I have dozens of titles I could submit and actively wait for results on, but I'm also not sure if I'd even get the notes back in time. If you want something to shotgun-blast, this is the place to do it. Send the message to everyone in the organization, and be explicit about who the data is for. I'll even write the multiplat one for you:
These test results were based on both your Linux and Windows versions. We strongly recommend forwarding this information to the appropriate engineers immediately so that the appropriate changes can be made. The sooner the results are verified by your team, the sooner they can be published!
Send that every 12 hours to every account in the org until you get a response or the results are published, then you'll have a legitimate reason for the timer. To get back to the taxes comparison: It sounds obnoxious, but having constant reminders for something this important is trivial to justify, no matter how much you might hate it at the time.
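To make the suggestion concrete, here's a minimal sketch of that reminder loop. Everything in it is hypothetical, none of these names correspond to any real Steamworks API; it just illustrates the "message everyone in the org every 12 hours until any response or publication" behavior described above.

```python
from dataclasses import dataclass
from typing import Callable, List

REMINDER_INTERVAL_HOURS = 12  # one "tick" of the reminder cycle

@dataclass
class CertResult:
    org_accounts: List[str]     # every account in the partner organization
    acknowledged: bool = False  # has anyone on the team responded?
    published: bool = False     # have the results gone public?

def run_reminder_cycle(result: CertResult,
                       send: Callable[[str, str], None]) -> bool:
    """One 12-hour tick: message every account unless the cycle is over.

    Returns True if reminders were sent this tick, False if the cycle
    should stop (someone responded, or the results were published).
    Any response at all permanently ends the nagging, which is what
    gives the eventual publish timer its legitimacy.
    """
    if result.acknowledged or result.published:
        return False
    for account in result.org_accounts:
        send(account, "Cert results are awaiting your team's review.")
    return True

# Simulated run: two ticks of silence, then a developer finally replies.
if __name__ == "__main__":
    sent = []
    result = CertResult(org_accounts=["lead@studio", "qa@studio"])
    for tick in range(4):
        if tick == 2:
            result.acknowledged = True  # a human responded
        if not run_reminder_cycle(result, lambda acct, msg: sent.append(acct)):
            break
    print(len(sent))  # 2 ticks * 2 accounts = 4 messages
```

In a real system each tick would be a scheduled job rather than a loop, but the stopping condition is the important part: publication or any human response kills the reminders.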
This would mitigate the other problem which is that nobody seems to know how long we actually have until the results get published. Is it a week? Two weeks? Are we even supposed to know about it in advance? When I ask around, the answer is always different, and the docs aren't much help with "approximately a week" as the official timeline. Until looking it up for this post I honestly thought it was twice as long for manually submitted results (and for secret results it seems to just be zero), because in my experience a cert back-and-forth being a week is unrealistic even for the biggest console vendors who have invested far more in this area.
For end users that might see "a week" and think that's a lot, it actually is even less time than I'm making it out to be, because the timer doesn't stop if a back-and-forth starts! That's right, if you get marked "unsupported" and the timer hits zero before you've had a chance to figure out the problem, up on the store page it goes, regardless of the test result's accuracy. This is a direct contradiction to the same sentence that says how long this process is (emphasis mine):
If you take no action, after approximately a week your review results will automatically be published and show up on your game detail page as the "results of Valve's testing" (see Steam Store on Deck section above).
This isn't without reason, it's just not the best reason: From what I can tell, "action" is defined as one of two things:
- Resubmitting to cert
- Publishing the results
You can optionally file a ticket to get clarification on the results, but this appears to be completely disconnected from the process itself (in fact, it's just using the end user support system, not a direct contact form, so there's no guarantee the person you get in touch with has anything to do with your test results). I filed tickets for two titles and, after getting no response, I bit the bullet and put both games back on the queue with zero notes to work from. This is a massive waste of everyone's time: As a developer I have no idea if I did anything wrong and therefore can't spend my time making the game better, and Valve now has to go through the entire process for literally the same build only because I'm crossing my fingers and hoping the cert process is nicer to me this time. For a catalog as big as Steam's (or even mine, honestly), this time adds up very quickly!
If you want to point to exactly one scenario where taking advantage of "the bureaucracy" would prevent a whole lot of headaches, this is the one to cite. I know putting hard deadlines on anything sucks, but because cert is the inverse of the rest of the industry the solution is surprisingly simple: Just pick a stupidly huge number that's impossible to object to. Forget "approximately a week", make it 30 days from the timestamp of the first e-mail and then permanently kill all timers the moment you get a response of any kind. Nobody in their right mind is going to ignore 60 copies of the same e-mail, particularly for something that actually matters, unless they're long gone. At that point you have a pretty solid excuse to publish behind their back.
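The deadline policy proposed above fits in a few lines. This is purely illustrative (the names and the 30-day figure are just the numbers proposed in this article, not anything Valve has committed to), but it shows how simple the rule is: never auto-publish while the developer has responded, and otherwise only after the deadline has elapsed.

```python
from datetime import datetime, timedelta
from typing import Optional

# Proposed in this article: a deliberately generous hard deadline.
PUBLISH_DEADLINE = timedelta(days=30)

def may_auto_publish(first_email: datetime,
                     last_response: Optional[datetime],
                     now: datetime) -> bool:
    """Auto-publishing is only allowed if the developer has never
    responded AND the deadline has passed since the first notification.
    A response of any kind permanently kills the timer."""
    if last_response is not None:
        return False
    return now - first_email >= PUBLISH_DEADLINE
```

The point of picking a "stupidly huge" number is exactly that this function almost never returns True for an actively maintained game, so when it does, publishing behind the developer's back is defensible.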
Lastly, having direct contact with the cert tester is really important because they will understand the notes better than anyone else, and it avoids getting lost in the monolithic beast that is public Steam support.
4: Probably more?
The Steam Deck is very new which means it's extremely volatile, whether it be the technical details or the boring, bureaucratic stuff. Looking at its current state, the team involved is probably going to be in crunch for pretty much the rest of the year, which is awful to think about. While I can't act like I can make decisions for anyone but myself, I would like to suggest that if there's one thing they should slow down on, it's the Deck Verified program. To carry out even a subset of the changes I've suggested will take a lot of work, but if Crunch Mode continues unchecked, the delta between what's in production and what needs to be in place is going to get bigger, which means it's going to be more work and therefore harder to pull off. I would very happily accept a delay in getting my catalog certified if the results are something that my customers can trust without having to sift through user reviews and databases like ProtonDB, all of which are far more likely to have errors or omissions in their entries. Not that they do this on purpose or anything, it's just the difference between end users and trained QA.
Even an "open" platform like Steam deserves to have the latter, and I will continue to do my best to support a program that demonstrates great care in its quality assurance, because it's absolutely possible to have such a service available to PC gamers and nothing would be more satisfying than having GNU/Linux be at the center of it.
Quoting: Guest
Quoting: d10sfan
Very interesting article, and I appreciate all the ports and work you have done in the past!
Not sure how it applies to your catalog of games, but many native games work better in proton, so in that case it would make sense to pick it over native on a case by case basis, since we are trying to go for the best user experience possible, at least in my opinion. The games that you've ported I've tried in the past have worked great though.
I know there have been a few times where ports were delayed, with the Windows version being updated first, which can be an issue, but I'm not sure how that is taken into account for certification. For example, Superliminal added their multiplayer Dec 17th, and it wasn't until Feb 21 that it was out on Linux (according to the developer's articles). Streets of Rage 4 took some time to get the Linux version on the store as well.
I'm not sure how those types of delays are taken into account with the certification. We've seen a lot of ports being abandoned, such as with Aspyr, Feral, and Virtual Programming.
What regularly broken games are you referring to, where many reviewers were seeing broken results from verified games? I haven't seen anything like that mentioned, but I haven't been following specific games too much, so was curious.
It does sound like in general Valve has a lot of areas to improve on with their cert process, especially in communicating it with the developers. I hope that they pick the native version where it's the best, as I still try to buy native games over Proton where possible.
GNU+Linux native games don't use or require Proton, it's purely to make Windows games run on GNU+Linux. I have no idea what gives you the impression Proton is required for non-Windows games.
While I strongly prefer native games, I just had a case today where I bought a game with native Linux support and switched to Proton.
After installing the title I found out that my controller wasn't detected by the game. I tried all the different Steam Input settings. Then I went to the community forum, where I found reports of broken controller support in the Linux version from years ago. Finally I forced Proton on the game and my controller worked without any further configuration needed.
Sadly, native ports are clearly often of lesser quality than the Windows build. I have many native games in my Steam library where using Proton results in better performance and compatibility.
I hope Valve takes this to heart, and gets to work.
(Though I'll say even for the same depth of analysis, being more succinct could probably make the main points clearer - especially for people that haven't been following everything closely)
I also assume all of this is particularly concerning for indies (in addition to, obviously, porting studios/devs). For games that don't have a massive marketing budget, however Steam chooses to showcase their game is a huge deal, and adding an endorsement or disavowing the game for the Deck can mean a lot. The opaqueness of the process is concerning, because it both erodes trust in the "fairness" of the platform and stops devs from taking direct meaningful action, putting them at the mercy of the "algorithm gods."
With regards to preferring proton builds over native ones, aside from bugs, there's also a matter of feature-parity. For example, I was very much a fan of the Linux ports of the Tomb Raider franchise, but the fact that saves aren't compatible between platforms always irked me. Didn't care much for the first version, but for Rise of the Tomb Raider I started on my PC (Windows) rather than on the HTPC (I have with Linux + Steam BP) and I just defaulted to the proton version.
I didn't even bother to try the native version of Shadow of the Tomb Raider. I'm ashamed to admit it, but it is what it is.
That's but one example. Between delayed updates, platform-specific bugs and lack of feature parity, many native ports feel like second class citizens even when compared to proton versions.
More than native ports, I want companies to pay attention to supporting Linux even if that's only using proton as a middleware. And to be honest, it's not like many (most?) native ports didn't rely on proton-like middleware anyway. I think Valve probably thinks similarly.
Quoting: lugaidster
many native ports feel like second class citizens even when compared to proton versions
Creative Assembly (SEGA) even started to skip creating native Linux builds with Total War: Troy, saying Proton makes it pointless. However, with their newest entry Warhammer III there will be a native Linux version again (Feral Interactive).
https://www.gamingonlinux.com/2021/07/feral-no-longer-porting-a-total-war-saga-troy-to-linux-citing-less-demand-since-proton/
Quoting: audiopathik
Quoting: lugaidster
many native ports feel like second class citizens even when compared to proton versions
Creative Assembly (SEGA) even started to skip creating native Linux builds with Total War: Troy, saying Proton makes it pointless. However, with their newest entry Warhammer III there will be a native Linux version again (Feral Interactive).
https://www.gamingonlinux.com/2021/07/feral-no-longer-porting-a-total-war-saga-troy-to-linux-citing-less-demand-since-proton/
I think the real reason was that it was Free on Epic and that it was feasible to run this using Wine/Proton. They probably did not expect to have much of a market left a year later after the exclusivity period ended.
I haven't read the article yet, but before reading it this is what my opinion is:
The big steps in verification should be:
1. The developer tests the game internally.
2. The developer submits the game for review.
3. Valve tests the game.
4. Valve either certifies the game or sends the developer back to step 1, with notes on why the game failed.
Steps 1 and 2 are missing from Valve's process.
Quoting: Lestibournes
I haven't read the article yet, but before reading it this is what my opinion is:
The big steps in verification should be:
1. The developer tests the game internally.
2. The developer submits the game for review.
3. Valve tests the game.
4. Valve either certifies the game or sends the developer back to step 1, with notes on why the game failed.
Steps 1 and 2 are missing from Valve's process.
To be fair, the majority of the games on Steam are too old for the developer to want to do anything about it, or for the developer to even still exist.
Your idea only makes sense for games released in the last O(months) time.