There have been many failed cloud gaming initiatives over the years, and the discourse around them never seems to grasp the interplay between technology and market economics that dooms every effort before it begins.
The first and most obvious question about cloud gaming is: why? There are plenty of legitimate consumer-focused answers, but streaming media has never been driven by consumer benefits; its primary motivation is control, and the concept of rent seeking. This is why Spotify, Netflix and YouTube don’t (by default) cache more than a few seconds at a time, even though they could cache whole movies locally, and even speculatively cache media, making their own services both better and cheaper to run.
In today’s world, individuals control immense amounts of computing power, and the need for it is driven almost entirely by the games industry. Corporations would like control of those devices; they would like the consumer to depend on them for the right to access that power, and cloud computing generally, and cloud gaming specifically, is how they intend to achieve that.
The concept of rent seeking is old; it essentially means finding a way to turn a product you sell into a service you rent. The End User License Agreement has allowed corporations to transition almost any product, especially one that runs software, from a product into a service. Throughout history rent seeking has been aggressively legislated against, because the sale of goods is the cornerstone of a capitalist market. When you sell a product, you are legally required to transfer all rights over that product to the purchaser. With a service model you don’t have to transfer any rights: you can lay out exactly how the end user is allowed to use their purchase, and you can collect money from them on whatever schedule the service market will bear. It has the potential to kill market economics and capitalism by transferring all consumer rights to corporations.
The popular criticism of cloud gaming revolves around latency: there are physical limits to how fast data can move from server to client, and this does hold true, meaning a certain category of game is likely never going to be competitive with current-gen consoles and PC games on latency. There have been many excellent technical solutions to work around the problem, but most of them have also been adopted by consoles, and were never a problem on PC to begin with. So the streaming services got their end-to-end latency down to match the ~100ms of last-gen consoles just as this generation improved its technology to match the PC’s sub-50ms latency, and at that point the streaming services have nowhere left to go, because the only latency left is network latency, which is physically impossible to eliminate. This criticism does ignore the fact, however, that most games ran at those latencies for years and no one complained.
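To make the argument concrete, here is a rough latency-budget sketch. Every figure is an illustrative assumption, not a measurement; the point is simply that only the network legs are bounded by physics, while everything else has already been optimised by consoles and PCs alike.

```python
# Rough end-to-end latency budget for a cloud gaming session, in
# milliseconds. All figures are illustrative assumptions.
budget = {
    "input_capture": 2,        # controller -> client device
    "uplink_network": 15,      # client -> datacentre (one way)
    "game_simulation": 16,     # one frame of game logic at 60 fps
    "render_and_encode": 10,   # GPU render plus hardware video encode
    "downlink_network": 15,    # datacentre -> client (one way)
    "decode_and_display": 10,  # client decode plus display scan-out
}

total = sum(budget.values())
print(f"total: {total} ms")

# Only the two network legs are irreducible; they are set by distance
# and the speed of light, not by engineering effort.
network_floor = budget["uplink_network"] + budget["downlink_network"]
print(f"irreducible network share: {network_floor} ms")
```

Under these assumed numbers the streaming service lands at roughly the latency of a last-gen console, with a ~30ms floor it can never engineer away, while a local machine simply has no network legs at all.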
The thing that really dooms cloud gaming, however, is thermodynamics. The cutting edge of datacentre research is all about keeping the servers cool; cooling is the vast majority of the cost of running a datacentre, and a datacentre full of storage is comparatively cold next to one full of compute. Stick a gaming-class CPU and GPU in there and you need a lot more cooling. Compare that with a games console in someone’s home: in most homes it helps to heat the place, so rather than being a problem that needs to be dealt with, it’s a tiny benefit. Even in countries that rely on air conditioning, the console doesn’t contribute significantly to the individual’s cooling bill. This adds up to the fact that running cloud gaming infrastructure is, on balance, more expensive. The only way to provide it competitively is for the corporation to subsidise it, on the understanding that it will confer benefits elsewhere, primarily control of consumer compute and the power to dictate what runs on it. So far no company, including Google, has been willing to eat the costs in exchange for the power. The companies still in the market (Microsoft and Amazon) have, however, historically proven capable of making these kinds of long-term power plays.
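A back-of-envelope comparison makes the asymmetry visible. The wattage, electricity price, and PUE (Power Usage Effectiveness, the ratio of total facility power to compute power) below are all assumed, illustrative values; the structural point is that the datacentre pays for every watt twice over, once to run the silicon and again to remove the heat, while the home console’s heat is at worst neutral and at best free heating.

```python
# Back-of-envelope energy cost of one hour of play, datacentre vs
# home console. All numbers are illustrative assumptions.
console_watts = 200   # assumed draw of a gaming-class CPU + GPU under load
pue = 1.5             # assumed datacentre PUE: each watt of compute
                      # drags ~0.5 W of cooling and overhead with it
kwh_price = 0.15      # assumed electricity price, $/kWh

home_cost = console_watts / 1000 * kwh_price
dc_cost = console_watts * pue / 1000 * kwh_price

print(f"home:       ${home_cost:.3f}/hour")  # waste heat warms the room
print(f"datacentre: ${dc_cost:.3f}/hour")    # waste heat must be removed
```

Whatever the real figures, the datacentre’s cost per hour scales with its PUE, and that multiplier is pure overhead the home player never pays.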
Another possibility for rendering it sustainable is a hybrid model: you start a game via streaming while a download runs in the background, and the local copy takes over when it finishes. The cost of the datacentres could then potentially be offset by the impulse purchases this facilitates, while actual usage of the streaming service is reduced to where it is genuinely beneficial. This wouldn’t support a subscription model, however, only a traditional digital distribution model.
The heat problem should be enough to render the idea dead on arrival for all but the most committed corporation, but on top of that there is reliability to contend with. With streaming video or music it’s infuriating when the connection degrades and playback stalls, but those media are infinitely more tolerant than games. It depends on the game, but in some it takes only a couple of dropped packets to undo hours of work. Compound on top of this the encoding artefacts degrading the visuals, internet outages, and the multitude of other reliability issues that come with making the global internet an integral part of your gaming experience, and it’s a very difficult problem to solve. The touted solution is moving the compute closer to the player, sometimes even as close as their local exchange. This is obviously not a solution: if the service provider has any hope of offsetting the significant costs of cooling the hardware, they need to timeshare the same hardware between multiple people, and the closer the machine is to the end user, the fewer end users can timeshare it. It would be cheaper and less complex to rent them a console in their own home, an approach Microsoft has already rolled out on account of its comparative simplicity.
Compound on top of all the issues above the fact that the bandwidth requirements have always been extremely optimistic. When Stadia touted its recommended internet connection numbers, it assumed one person playing games was the only thing happening on the line; any home with multiple people gaming, plus everything else going on, would need to multiply those requirements. For some people with gigabit-plus connections, bandwidth is ceasing to be a limiting factor, but it’s worth bearing in mind that those links currently cost 50–100% extra, and when calculating the value to the end user those costs will need to be accounted for in the value proposition.
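The household arithmetic can be sketched directly. The per-stream figure below reflects the roughly 35 Mbps Stadia recommended for its top tier; the household mix (a second player, background video, other traffic) and the headroom factor are assumptions for the sake of the sketch.

```python
# Illustrative household bandwidth check for cloud gaming.
# The 35 Mbps per-stream figure is roughly what Stadia recommended
# for its top tier; the rest are assumptions for this sketch.
stream_mbps = 35     # one 4K game stream
players = 2          # assumed: two people gaming simultaneously
video_mbps = 25      # assumed: someone else watching 4K video
other_mbps = 10      # assumed: calls, downloads, background traffic

needed = players * stream_mbps + video_mbps + other_mbps
print(f"peak demand: {needed} Mbps")

# Games tolerate stalls far worse than video does, so the line needs
# headroom for jitter and contention, not just average throughput.
with_headroom = needed * 1.5
print(f"with 50% headroom: {with_headroom:.0f} Mbps")
```

Under these assumptions a perfectly ordinary household already needs a connection well beyond what the published per-player numbers suggest, which is the sense in which those numbers were optimistic.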
All these things combine: some are solvable with great investment in infrastructure and technology, some will need to be addressed via market positioning, but all of them make the net cost of the technology higher than the comparatively simple and better alternatives already available. That makes for an incredibly steep hill to climb for any would-be cloud gaming provider, which goes some way to explaining why GeForce Now (probably the most palatable cloud gaming service available) recently had to increase its prices, and why it will ultimately, most likely, fail.