
Google announce ‘Stadia’, their new cloud gaming service built on Linux and Vulkan

Last updated: 19 Oct 2020 at 8:25 am UTC

Google have now finally unveiled their new cloud gaming service, named Stadia, offering instant access to games in Google Chrome.

What they joked was the worst-kept secret in the industry (no kidding) sounds like quite an interesting service, certainly one that could eventually end up redefining what gaming is. A little hyperbolic maybe? I'm not so sure, considering how easy it should be to jump into a game. On top of that, they very clearly talked about how it's built on Linux (Debian specifically) and Vulkan, with custom GPUs from AMD.

Something they showed off was how you could be watching a game trailer with a button to play it on Stadia and (supposedly within a few seconds) you would jump right into it. That's quite an exciting idea, one that I've no doubt would easily pull in quite a lot of people.

As for resolution, they said it will support 1080p and 4K at around 60FPS at release, with 8K being worked on as well, although that sounds further out, if anyone even cares about 8K right now.
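For a sense of scale, the raw video those resolutions imply is enormous, which is why heavy compression is a given for a service like this. A quick sketch (60FPS and 24 bits per pixel are assumed, purely for illustration):

# Uncompressed bitrate = frame rate x pixels per frame x bits per pixel.
def raw_gbps(width, height, fps=60, bpp=24):
    return width * height * bpp * fps / 1e9

resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name} @ 60FPS: {raw_gbps(w, h):.1f} Gbit/s uncompressed")
# 1080p ~3.0, 4K ~11.9, 8K ~47.8 Gbit/s -- compression has to squeeze that
# down to the tens of Mbit/s a home connection can actually carry.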

They also showed off their new controller, which has a dedicated Google Assistant button and a button to instantly capture video for YouTube.


While Google are making their own dedicated gamepad, they did say it will be compatible with other devices too.

They also announced partnerships with both Unity and Unreal Engine, and said Stadia will "embrace full cross-platform play" including "game saves and progression". They also had id Software talk about how it didn't take long to bring the new Doom Eternal to Stadia, thanks to how they made the previous Doom game with Vulkan.

This means that development for Linux is suddenly going to become a priority for a lot more developers and publishers. I don't want to overstate how important that is, but it's a very exciting prospect. This doesn't suddenly mean we're going to see a lot more Linux games on the desktop, but it's entirely possible after they go through all the work to get their games working on Linux with Vulkan for Stadia.

Stream Connect is another service they talked about. They mentioned how developers have pushed the boundaries of gaming, but local co-op is often left out, as running multiple instances of a top-end game can require really beefy hardware. With Stadia, each instance would be powered by their servers, so it wouldn't be such an issue. They also talked about how, if you're playing some sort of squad-based game, you could bring up your squad mates' screens to see what they're doing, which sounds very cool.

Google also announced the formation of their own game studio, Stadia Games and Entertainment, to work on exclusive games for their new service.

As for support from external game developers, they mentioned that they've shipped "development hardware" to over 100 developers. From what they said, it should be open to smaller developers as well as the usual AAA bunch.

Stadia is confirmed to be launching this year, first in the US, Canada, UK and "most of Europe". One thing wasn't mentioned at all: price. They said more details will be available in the summer. The official site is now up at stadia.com and developers have their own website to look over.

Google also posted up some extra information on their developer blog:

Google believes that open source is good for everyone. It enables and encourages collaboration and the development of technology, solving real-world problems. This is especially true on Stadia, as we believe the game development community has a strong history of collaboration, innovation and shared gains as techniques and technology continually improve. We’re investing in open-source technology to create the best platform for developers, in partnership with the people that use it. This starts with our platform foundations of Linux and Vulkan and shows in our selection of GPUs that have open-source drivers and tools. We’re integrating LLVM and DirectX Shader Compiler to ensure you get great features and performance from our compilers and debuggers. State-of-the-art graphics tools are critical to game developers, and we’re excited to leverage and contribute to RenderDoc, GAPID and Radeon GPU Profiler — best of breed open-source graphics debugging and profiling tools that are continually improving.

There's probably plenty I missed; you can see their video on YouTube here.

As exciting and flashy as it sounds, it's obviously not Linux "desktop" gaming, which is what the majority of our audience is likely interested in. However, things change, and if it does become a huge hit we will cover it more often if readers request it. Linux gaming can mean all sorts of things, from native games to emulators, Wine and Steam Play, and now perhaps some cloud gaming, so I don't want to rule it out. However, I can't see this replacing Steam, Humble, GOG, itch.io and so on for me personally.

Obviously there are still a lot of drawbacks to such a service, especially since you will likely have zero ownership of the actual games, so they could be taken away at any time when licensing vanishes. At least with stores like Steam, you still get to access those games because you purchased them. Although this does depend on what kind of licensing Google do with developers and publishers, and it might not be an issue at all, it's still a concern of mine. Latency and input lag are two other major concerns, but given Google's power and their vast networks, it might not be so bad.
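For context on why latency is the worry: at 60FPS a new frame only arrives every ~16.7ms, and with streaming every input has to make a full network round trip on top of the usual processing. A rough sketch of the budget involved (every number here is an illustrative assumption, not a measurement):

# Illustrative input-to-photon budget for a streamed game at 60FPS.
FRAME_MS = 1000 / 60            # ~16.7 ms between frames
budget_ms = {
    "capture input + encode frame": 10,   # assumed
    "network round trip": 30,             # assumed decent connection
    "decode + display": 10,               # assumed
}
total = FRAME_MS + sum(budget_ms.values())
print(f"~{total:.0f} ms from button press to visible result")  # ~67 ms

Whether something in that region feels fine or sluggish depends heavily on the game, which is presumably why fast-paced titles will be the acid test for services like this.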

Also, good luck monitoring your bandwidth use with this, it's likely going to eat up a lot of it, if not all of it. YouTube and Netflix use up quite a bit just for watching a 30-minute episode of something in good quality, so how about a few hours per day gaming across Stadia? Ouch.
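To put rough numbers on it (a back-of-the-envelope sketch; the 25 Mbit/s stream rate and the hours played are assumptions purely for illustration):

# Rough data usage for game streaming; both inputs are assumptions.
STREAM_MBIT_PER_S = 25
HOURS_PER_DAY = 3
DAYS_PER_MONTH = 30

gb_per_hour = STREAM_MBIT_PER_S / 8 * 3600 / 1000   # Mbit/s -> GB per hour
gb_per_month = gb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH
print(f"{gb_per_hour:.1f} GB per hour")    # ~11.2 GB
print(f"{gb_per_month:.0f} GB per month")  # ~1012 GB, about 1 TB

Against the 1TB data cap some ISPs impose, a few hours a day would swallow essentially all of it.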

That doesn't even address the real elephant in the room: you're going to be giving Google even more of your data if you use this service, a lot more. This is the company that failed to promptly disclose a pretty huge data leak in Google+, after all. I don't want to be some sort of scaremongering crazy-person, but it's something to think about.

As always, the comments are open for you to voice your opinion on it. Please remain respectful to those with a different opinion on the matter.


Purple Library Guy 21 Mar 2019
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.

Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.

The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M pixel screens 30 years ago to 6M pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should reduce as a cause of bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. But yeah, I guess if they're just dumping all the pixels it doesn't matter what the programs are doing with those pixels. Given the pauses I often experience with simple streamed video I can well imagine streamed games having some problems, but that is a separate issue from the backend power needed to run the games.

In terms of VR (XR?) I was thinking more that, as I understand it, for it to work without messing up people's heads you need really, really low latency. I can imagine streaming working well enough for ordinary games in some places with some ISPs and data plans. But well enough for VR not to feel wonky? I seriously doubt it outside maybe South Korea. Mind you, I'm quite unconvinced that the future of gaming is VR. But if it was, it would be damn tough to stream effectively.
Purple Library Guy 21 Mar 2019
Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?
I might imagine that in a game, (as etonbears points out, without buffering, everything happening in real time) by the time lost packets get re-sent they'd be irrelevant, so it would be better to just ignore them and leave a little fuzz in the picture than to, like, refuse to show the image until it's all complete. That might suggest this UDP thing. But I don't know anything about this, I'm just trying to do logic from too little data.
etonbears 21 Mar 2019
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's user-fleecing, anti-competitive trash. Limiting bandwidth when the network is overloaded, though, is a legitimate network management technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States. I mean it in the wider sense of equality/restriction for any purpose. There are good arguments for allowing different quality of service offerings based on what you pay ( providing it is transparent ) and good arguments for variable service for particular traffic types ( largely safety-critical ), for example.

As for the current US "debate", monopolist/unacceptable business behaviour is a general problem not unique to the operation of the internet, and would be best addressed in that light. Should the US really want a completely undifferentiated network backbone operated as a public utility, it would be better to pay for it from the public purse, either publicly managed or sub-contracted, rather than the current arrangement. However, I suspect that might be considered "un-American".
x_wing 21 Mar 2019
Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?
I might imagine that in a game, (as etonbears points out, without buffering, everything happening in real time) by the time lost packets get re-sent they'd be irrelevant, so it would be better to just ignore them and leave a little fuzz in the picture than to, like, refuse to show the image until it's all complete. That might suggest this UDP thing. But I don't know anything about this, I'm just trying to do logic from too little data.

Yeap, TCP has way too much overhead for an application that requires very low latency (the way this stream works is very similar to how TV broadcasting works). Also, in the Google backbone for this service they may be using other less-known optimizations in order to reduce the latency (Jumbo frames!)
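The difference is easy to sketch at the socket level. A toy UDP receive loop (the port number and the 4-byte sequence header are assumptions for the example): loss gets detected but never waited for, so a gap just means a skipped update, a little "fuzz" in the picture, where TCP would stall the whole stream until the retransmit arrived.

import socket

# Toy UDP receiver: note lost packets and carry on, never block for them.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5004))     # arbitrary port for the example

expected = 0
while True:
    data, _addr = sock.recvfrom(65535)
    seq = int.from_bytes(data[:4], "big")   # assumed 4-byte sequence header
    if seq > expected:
        print(f"lost {seq - expected} packet(s), carrying on")
    expected = seq + 1
    # hand data[4:] to the video decoder without waiting for resends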
Purple Library Guy 21 Mar 2019
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's user-fleecing, anti-competitive trash. Limiting bandwidth when the network is overloaded, though, is a legitimate network management technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States.
Debate and discussion of the term as used by Shmerl has been extremely widespread for a number of years. Even if it's used differently elsewhere, it's probably not used nearly as much in your sense overall, because your sense is more technical and less controversial in its implications, so probably just less talked about. So you shouldn't be surprised if Shmerl's is the sense people expect. And if you think it's going to stay limited to the US, well, maybe, but I've sure noticed that nasty practices often start in the US and are then exported to much of the rest of the world, through trade agreements and by the same interests elsewhere latching onto the American example to make their greed respectable.

I do think that public provision would be a good idea. The internet is infrastructure; infrastructure works well public.


Last edited by Purple Library Guy on 21 Mar 2019 at 7:03 pm UTC
etonbears 21 Mar 2019
For me, the interesting implication of Stadia is its ability to change the supply side. The Steam survey shows that the average PC gamer does not have particularly good hardware, and this actually limits developers in what they can do and still address a large enough purchase market.

If Stadia has nodes with Vega56 GPUs as a minimum, and allows arbitrary combining of nodes to produce output, then the complexity of what developers may produce for Stadia can scale very quickly to the point that you actually could NOT run it on any normally available desktop hardware, let alone the average rig, making traditional sales of such games redundant. That may be why the new Google game studio is suggesting their titles will be exclusive to Stadia.

Of course, however amazing their back-end might be, Google still need to get the right price model, overcome the possible network limitations and avoid their normal habit of turning everything into advertising revenue.
Interesting point. Mind you, for most games most of that power would be dedicated to graphics stuff, in which case wouldn't those extra-power-hungry games also be extra-bandwidth-hungry? You could end up trading one bottleneck for another.
Which in turn makes me wonder about two futures clashing. Imagine the future of gaming is this kind of streaming solution. Now imagine the future of gaming is VR. I don't think it can be both unless someone spends a bunch of billions on last-mile fibre optics.

The bandwidth required for graphics stream presentation has historically increased quite slowly. It is proportional to frame rate multiplied by pixels per frame multiplied by bits per pixel. Desired frame rate has remained at about 60 for decades, and bits per pixel for most people has been 24 for decades. That leaves pixel resolution as the main variant, which has risen from 1M pixel screens 30 years ago to 6M pixel screens now. Network bandwidth increase in those 30 years far exceeds the increased requirements of a graphics stream, so if both network and graphics bandwidth trends continue, the streaming itself should reduce as a cause of bottleneck. Even the bandwidth to support binocular XR presentation should not be an issue since the size of XR screens you can put in front of your eyes is physically limited, and the human eye's ability to resolve detail at close range tops out at around 1000 pixels per inch.

In contrast, the amount of additional processing power you can put into determining the content of the graphics stream is effectively unbounded, since almost every aspect of current real-time game production is subject to approximation, simplification and deception, in order to fit into the processing 'budget' available.
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. But yeah, I guess if they're just dumping all the pixels it doesn't matter what the programs are doing with those pixels. Given the pauses I often experience with simple streamed video I can well imagine streamed games having some problems, but that is a separate issue from the backend power needed to run the games.

In terms of VR (XR?) I was thinking more that, as I understand it, for it to work without messing up people's heads you need really, really low latency. I can imagine streaming working well enough for ordinary games in some places with some ISPs and data plans. But well enough for VR not to feel wonky? I seriously doubt it outside maybe South Korea. Mind you, I'm quite unconvinced that the future of gaming is VR. But if it was, it would be damn tough to stream effectively.

Yes, you're right, the data streams would most likely be compressed, but compression and decompression schemes have to be computationally simple to work in real-time, so they are less likely to have any interest in how detailed each frame is, and more likely to be interested in how much changes between one frame and the next. If it does not change, you don't need to send it. Compression is an active area of research for many uses, often with quite different characteristics/needs.
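To make that "send only what changed" idea concrete, a toy sketch (the 16x16 block size and raw-array frames are assumptions; real codecs use motion-compensated macroblocks and are vastly more sophisticated):

import numpy as np

BLOCK = 16  # assumed tile size

def changed_blocks(prev, curr):
    """Yield (y, x, tile) for each tile that differs between two frames."""
    h, w = curr.shape[:2]
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            if not np.array_equal(prev[y:y+BLOCK, x:x+BLOCK],
                                  curr[y:y+BLOCK, x:x+BLOCK]):
                yield y, x, curr[y:y+BLOCK, x:x+BLOCK]

# A mostly static scene yields almost nothing to send; a frame full of
# animated noise changes every tile, so the "delta" becomes the whole frame.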

In terms of VR latency-induced sickness, yes, that's a problem even locally and directly connected, and it's largely related to how an individual mentally tolerates the disjuncture of being in an immersive environment that does not behave in an immersive manner.

I, personally, will probably never get on with VR as it exists today, and while a lot of people do think it's great, a surprisingly large number still have quite limited time tolerance. Network latency will only make this worse, and is best reduced through short internet paths, since the latency is primarily in the switches, not the cables/wires. But you're right, in the end it may come down to choosing one or the other.

P.S. XR is eXtended Reality, a convenient lumping together of Virtual Reality, Augmented Reality, Augmented Virtuality, and any other marketing buzzwords that come along.
Mohandevir 21 Mar 2019
Wow! Thank you all for your great explanations. You are awesome!
Klaas 21 Mar 2019
Huh. Somehow I was under the impression that video streams were compressed, and so just how detailed the actual picture was (as opposed to the number of pixels) might be relevant to how compressible it was. (…)

That's definitely the case if the frames contain lots of moving noise, e.g. falling snow. An easy way to see the problem is a classic Doom stream on a map that contains the texture FIREBLU (e.g. E3M6 Mt. Erebus): the image quality drops while the bitrate explodes.
etonbears 21 Mar 2019
Is it TCP/IP, UDP or something I haven't heard of? Isn't UDP faster but prone to packet loss thus reducing the quality of the stream?

TCP includes the control mechanism to deal with packet loss (detection and resending). For UDP it is up to the application to decide whether to detect it and what to do if something is lost.

And still be faster than TCP? Or is it better to go with TCP, in that case?
I might imagine that in a game, (as etonbears points out, without buffering, everything happening in real time) by the time lost packets get re-sent they'd be irrelevant, so it would be better to just ignore them and leave a little fuzz in the picture than to, like, refuse to show the image until it's all complete. That might suggest this UDP thing. But I don't know anything about this, I'm just trying to do logic from too little data.

Yeap, TCP has way too much overhead for an application that requires very low latency (the way this stream works is very similar to how TV broadcasting works). Also, in the Google backbone for this service they may be using other less-known optimizations in order to reduce the latency (Jumbo frames!)

I don't see any direct reference to their stream formats, but would guess that Google are using HTTP3 for the Stadia data protocol, as it is only currently implemented in Chrome and runs over UDP rather than TCP. Using UDP brings much lower latency because it does not block on errors or waste time acknowledging packets, and it allows the seamless connection migration between different devices shown in their demo. It implements TLS security by default and uses Forward Error Correction to ensure the receiver has enough redundant information to resolve errors from occasional missed packets, so HTTP3, or a custom version of it seems a good candidate.

Not quite sure where Jumbo Frames would fit in though. I only know it as an Ethernet optimisation that is rarely worth enabling.
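The Forward Error Correction idea itself is simple to illustrate. A toy XOR-parity scheme (an assumption for illustration only; real schemes use far stronger codes such as Reed-Solomon): send one parity packet per group, and the receiver can rebuild any single lost packet without a round trip.

# Toy XOR-based FEC over equal-sized packets.
def xor_parity(packets):
    p = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            p[i] ^= b
    return bytes(p)

group = [b"pkt0data", b"pkt1data", b"pkt2data"]
fec = xor_parity(group)

# Suppose pkt1 is lost in transit; XOR of the survivors recovers it.
assert xor_parity([group[0], group[2], fec]) == group[1]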
ShabbyX 21 Mar 2019
Any recommendation on who to send such a proposal to? Google isn't exactly known to be very open to external communication. I don't mind sending a suggestion, as long as it won't go to the usual stone wall of support.

Google is huge, and so is the amount of feedback they get. Depends on the team, but they usually go through everything, even if they can't literally reply to everyone. Probably the first step would be to wait for launch, then submit feedback through whatever interface they have. Mind you, they could disagree with the suggestion or have it as low priority, but they won't be able to engage with you personally due to the massive amount of feedback they get.
ShabbyX 21 Mar 2019
I wouldn't want to suffer a whole game playthrough over a fallible network, but I do prefer streaming 25 Mbit of data every second for a few hours to just try a game, rather than downloading the whole game before I can get a taste of it.

Did you see the presentation? The demo guy literally just left one device and picked up the game on another. I'm sure connection drop is not at all an issue as you should be able to just pick up where you left off.
ShabbyX 21 Mar 2019
Any recommendation on who to send such a proposal to? Google isn't exactly known to be very open to external communication. I don't mind sending a suggestion, as long as it won't go to the usual stone wall of support.

Google is huge, and so is the amount of feedback they get. Depends on the team, but they usually go through everything, even if they can't literally reply to everyone. Probably the first step would be to wait for launch, then submit feedback through whatever interface they have. Mind you, they could disagree with the suggestion or have it as low priority, but they won't be able to engage with you personally due to the massive amount of feedback they get.

Actually, thinking about it more, does it even make sense? If you have a game you bought but want to play on Stadia (having paid the subscription fee):

- If the game runs on Stadia, it's likely already in Stadia's catalogue, which means you can play it regardless of whether you own it.
- If the game doesn't run on Stadia, well, it doesn't; you can't ask Stadia to run it.
Shmerl 21 Mar 2019
You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States. I mean it in the wider sense of equality/restriction for any purpose.

You probably meant anti-monopolist tool, which net neutrality in part is. However I don't see it being very widely differently defined, given that the author of the term is Tim Wu, a law professor at Columbia University. I'm pretty much using his original definition.


Last edited by Shmerl on 21 Mar 2019 at 9:49 pm UTC
Shmerl 21 Mar 2019
Actually, thinking about it more, does it even make sense? If you have a game you bought but want to play on Stadia (having paid the subscription fee):

- If the game runs on Stadia, it's likely already in Stadia's catalogue, which means you can play it regardless of whether you own it.
- If the game doesn't run on Stadia, well, it doesn't; you can't ask Stadia to run it.

That's the whole point of offering a DRM-free option. Currently, games on Stadia are DRMed. To make any of them DRM-free means offering a downloadable version (I suppose that would mean adjustments like you said, to provide the ability to run it on regular desktop Linux). Technical changes aside, it would simply make it possible to back up the game and run it without the service.

I already brought up examples of other stores that do that, like Bandcamp for music. You could ask the same question there: why offer downloads, if you could allow only streaming (Spotify-like)?

If your question is what value there is in having the ability to back things up, I think it's quite self-explanatory: the value ranges from games preservation to the ability to play without an Internet connection, and avoiding the risk of losing your whole catalogue if the service shuts down, or your account is simply closed, for instance.

And if the question is what's the value in the streaming use case, that's obvious too. You can play while not having your high-end rig with you, for example. I.e. both use cases have value, and the store can just as well offer both. It's only when both are provided that the user isn't losing out.


Last edited by Shmerl on 21 Mar 2019 at 10:02 pm UTC
etonbears 21 Mar 2019
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's user-fleecing, anti-competitive trash. Limiting bandwidth when the network is overloaded, though, is a legitimate network management technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States.
Debate and discussion of the term as used by Shmerl has been extremely widespread for a number of years. Even if it's used differently elsewhere, it's probably not used nearly as much in your sense overall, because your sense is more technical and less controversial in its implications, so probably just less talked about. So you shouldn't be surprised if Shmerl's is the sense people expect. And if you think it's going to stay limited to the US, well, maybe, but I've sure noticed that nasty practices often start in the US and are then exported to much of the rest of the world, through trade agreements and by the same interests elsewhere latching onto the American example to make their greed respectable.

I do think that public provision would be a good idea. The internet is infrastructure; infrastructure works well public.

Perhaps, but it seems more like Political ( big P ) footballs that are being kicked around in the wrong Stadia ( OK, I'll stop that :) ).

It's unlikely that countries in the European Union would be "infected" by bad network practices of the sort some Americans fear, as the Competition Commission has a good record of acting on complaints concerning poor behaviour. Most countries here also have regulatory frameworks that work, more or less, in the general interest of everyone.

Even the UK, which is much closer than any other EU country to American ideals, really isn't all that similar. We have our own brands of loonies trying to impose their ill-informed worldviews on us, like most countries, but they don't really resonate with US groups.

I'd have to say that in Canada, neither BC nor Quebec ( the provinces I have visited ) seemed much aligned with US values, and your current administration doesn't exactly seem to idolise its US counterpart; but I suppose Canada might be the most likely domino because of co-location and economic pressure, so I can see your concerns might be more aligned with US sentiments.
Shmerl 22 Mar 2019
Google is huge, and so is the amount of feedback they get. Depends on the team, but they usually go through everything, even if they can't literally reply to everyone. Probably the first step would be to wait for launch, then submit feedback through whatever interface they have. Mind you, they could disagree with the suggestion or have it as low priority, but they won't be able to engage with you personally due to the massive amount of feedback they get.

Would it make sense to write straight to Stadia chief Phil Harrison about it? I doubt this kind of decision can be made without his involvement.

I recently saw an interesting case where one Linux user wrote directly to Intel CEO Robert Swan about frustrations with Bluetooth on the Lenovo E485 laptop when using an Intel wireless chip (in the end it was Lenovo's UEFI at fault, which they still haven't fixed). In response, Intel engineers thoroughly investigated the issue and wrote him a detailed response, which he shared. I was pleasantly surprised by how well Intel handled that. I guess sometimes CEOs don't ignore direct feedback.


Last edited by Shmerl on 22 Mar 2019 at 12:54 am UTC
Shmerl 22 Mar 2019
Hm, I've just seen this article from The Verge, which mentions that Phil Harrison is sympathetic to games preservation concerns (which are part of what DRM-free is about):

https://www.theverge.com/2019/3/21/18275806/google-stadia-phil-harrison-interview-cloud-gaming-streaming-service-gdc-2019

Though his response doesn't sound like he is considering a real DRM-free option at the moment:

Another issue some critics of cloud gaming have raised in the wake of the Stadia announcement is around game preservation. Moving game software to the cloud means not only will it be harder for players to retain ownership of a product over time now that it’s both no longer on a disc and no longer even on your hard drive, but it could also make it much harder for games built just for the cloud to exist years or decades from now when the service has been upgraded or potentially shut down. Harrison says he’s sympathetic to that view:

I completely understand that concern. And I think it’s frankly no different than how games are on mobile today, and not really different to how users trust us today with the most precious thing in their life, which are their memories, with Google Photos. I think we would apply the same standards of care to our data going forward as we would to something like Google Photos. This is a moment in the game industry where technology opens up a whole new set of new capabilities for gamers and I would obviously focus on what those incredible advantages are. And that’s going to be our point of view of the future of games.

The article also mentions his take on Linux porting by the way.

At least if he is sympathetic, he might not dismiss the idea outright.


Last edited by Shmerl on 22 Mar 2019 at 12:57 am UTC
Shmerl 22 Mar 2019
Maybe Liam can get an interview with him, asking more from the perspective of Linux gaming. That would be pretty cool.
etonbears 22 Mar 2019
You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States. I mean it in the wider sense of equality/restriction for any purpose.

You probably meant anti-monopolist tool, which net neutrality in part is. However I don't see it being very widely differently defined, given that the author of the term is Tim Wu, a law professor at Columbia University. I'm pretty much using his original definition.

Quite right, I did mean "anti-monopolist" in there. And of course quite right that Tim Wu initiated the network neutrality debate. But like most debates and discussions, particularly online, the original theme expands and morphs to encompass different contexts, especially when considered by those in different societies operating under different jurisdictions and regulatory conditions.

While I spend quite a few weeks every year in the US, I do not live there, and I do not usually consider it my context for discussion unless I say so. The discussion in the US, concerning how the Internet is constructed, managed and used, seems to have become intensely partisan and thus narrow, because the participants simply assault each other with their best dogmatic assertions and withdraw.

You may not yourself be a US resident, but the tone with which you offered your "correction" to my "misunderstanding" certainly suggests that narrow partisan mindset.
Purple Library Guy 22 Mar 2019
Which is one argument against net neutrality - you can't guarantee the quality of service you think you are paying for.

Network congestion due to load is not an argument against net neutrality. Net neutrality is about preventing deliberate traffic discrimination (such as for anti-competitive purposes). Managing the network due to congestion is fine according to the concept of net neutrality. Mind you, something like data caps is not a network management tool, it's user-fleecing, anti-competitive trash. Limiting bandwidth when the network is overloaded, though, is a legitimate network management technique.

You are adopting the narrow view of "network neutrality as monopolist tool" popular in the United States.
Debate and discussion of the term as used by Shmerl has been extremely widespread for a number of years. Even if it's used differently elsewhere, it's probably not used nearly as much in your sense overall, because your sense is more technical and less controversial in its implications, so probably just less talked about. So you shouldn't be surprised if Shmerl's is the sense people expect. And if you think it's going to stay limited to the US, well, maybe, but I've sure noticed that nasty practices often start in the US and are then exported to much of the rest of the world, through trade agreements and by the same interests elsewhere latching onto the American example to make their greed respectable.

I do think that public provision would be a good idea. The internet is infrastructure; infrastructure works well public.

Perhaps, but it seems more like Political ( big P ) footballs that are being kicked around in the wrong Stadia ( OK, I'll stop that :) ).

It's unlikely that countries in the European Union would be "infected" by bad network practices of the sort some Americans fear, as the Competition Commission has a good record of acting on complaints concerning poor behaviour. Most countries here also have regulatory frameworks that work, more or less, in the general interest of everyone.

Even the UK, which is much closer than any other EU country to American ideals, really isn't all that similar. We have our own brands of loonies trying to impose their ill-informed worldviews on us, like most countries, but they don't really resonate with US groups.

I'd have to say that in Canada, neither BC nor Quebec ( the provinces I have visited ) seemed much aligned with US values, and your current administration doesn't exactly seem to idolise its US counterpart; but I suppose Canada might be the most likely domino because of co-location and economic pressure, so I can see your concerns might be more aligned with US sentiments.
That's quite reassuring. Although yeah, Canada, no matter the rhetoric of the Prime Minister of the day, has a tendency to end up doing a lot of things the American way. BC and Quebec aren't much aligned with US values, but Alberta totally is, a distressing amount of Saskatchewan seems to be, and the big enchilada, Ontario, is in a culture war between US-conservative and Canadian "liberal" values which local Trump-wannabes win all too often. And anyway, what Canadians think doesn't always matter. Although so far we do have significantly better copyright law, despite a lot of pressure to go full DMCA.