Cyberpunk 2077 or The Problem That Is the AAA Industry

A look at Cyberpunk 2077's release, and how the game highlights a problem facing the AAA industry regarding hardware trends and gamers' expectations.

Marten Jonsson, Blogger

December 17, 2020

9 Min Read

So, at the time of writing, Cyberpunk 2077 has been out for about a week, to much accolade, dismay, joy, hate, and pretty much every emotion that exists. People have opinions, to put it mildly. I had been looking forward to this game for quite some time: I love the aesthetic, CD Projekt Red has a good reputation as a developer, and everything pointed to this being a smash hit. Then it finally came out, and I bought it on my PS4, since my computer isn't really powerful enough to run it properly and I have no interest in the next-gen consoles. And unless you've been living under a rock, you know the deal with the game on PS4 and Xbox One. It's… problematic, to say the least. But I enjoyed the game regardless: the story is good, there are interesting, well-written characters, and despite it crashing a few times I kept returning to Night City. I'm not here to talk about the game itself, though, but rather about a somewhat disturbing trend that can be seen in the debacle surrounding it.

 

So, in response to the critique of the PS4 and Xbox One versions, people on the internet are saying: "Well, you shouldn't expect it to run on last-gen consoles." To that I say: why in the hell shouldn't I expect that? The PS5 and Xbox Series X have been out for a month, and Cyberpunk 2077 was announced in May 2012. As in, more than eight years ago. The PS4 and Xbox One were released in November 2013, meaning the game was announced before the now-last generation had even been released. Saying it wasn't developed for the PS4 and Xbox One, or at the very least didn't have them as part of the pipeline, is simply ridiculous.

 

Historically, games released at the end of a console's life have been the best and most technologically advanced, simply because the hardware has been around for a long time and developers have learned how to use it as efficiently as possible. Just look at games like Breath of the Wild on the Wii U, The Last of Us on the PS3, or Shadow of the Colossus on the PS2, to quickly name a few. Go even further back and there are games like Yoshi's Island on the SNES and Super Mario Bros. 3 on the NES, both of which pushed their respective hardware to the limit.

 

"Well, all of those games were exclusive to their console." True, and this brings us to a problem: the industry that is PC gaming. Traditionally, developing for consoles is in a lot of ways easier than developing for PC. You have your devkit, you test the game, and you can feel confident that you'll deliver the same experience to anyone on that platform. Easy enough, right? One set of hardware, one configuration. That is obviously not the case with PC gaming, where there are thousands of configurations across a wide range of performance. And obviously, a PC from 2020 is gonna run way, way better than a PC from 2012, and better than a PS4 or Xbox One. The console industry has tried to remedy this by releasing pro versions of consoles with upgraded hardware, and herein lies part of the problem, an emerging trend that bodes ill for the future.

I grew up in the '90s and '00s. Anyone who remembers Windows 95, 98, and the like will remember that they were not as… stable as a modern OS, or a modern computer in general. And this was way before Steam and the like: you bought a game on a physical disc and hoped it worked. That's why a lot of people liked consoles. You had a console, and if you bought a game for that console, you knew it was going to work, you knew what kind of experience you would receive. And that has largely been the case ever since. You buy a console at launch, and you know you're gonna have a steady stream of games optimized for that hardware for the coming seven or eight years. At least, that used to be the case. The recent "invention" of upgraded versions of consoles breaks this tradition and means that consoles are creeping ever closer to what a desktop is, in true capitalist fashion.

 

The point is that the problem is twofold: one, there is an exponential increase in the hardware required; two, there is an exponential increase in what is expected from AAA games. Think of it like this: a game like Cyberpunk 2077 is developed. It's going to be cross-platform, because business-wise that of course makes sense: more customers, more players. Now, a lot of people are sitting on a high-end PC and seem to expect an experience that'll run at 120 fps on a 4K monitor (a bit of an exaggeration, admittedly, but they'll expect a good, smooth experience, whatever that entails). And of course they do; they have paid for that privilege, invested in their gaming experience. And of course Cyberpunk 2077 wants to please that crowd: it's the biggest release of the moment, and it is a visually stunning game, or at the very least has the potential to be. On top of that, with every new game comes the expectation that it has to have more than the previous one: more content, more hours, more features. Consumers, gamers, are constantly expecting everything to be Harder, Better, Faster, Stronger. A game isn't worth it if it doesn't have an open world, crafting, a branching narrative, and 150 hours' worth of game time.

 

So, we have this new game. It has to be bigger, better, longer, and better looking than any game before it, and it has to run on a PS4 and Xbox One as well. Because, again, it was designed for them. It was announced during that cycle, has been in development with that hardware in mind, and has presumably always been intended for release on those platforms (disclaimer: I am assuming here, but unless CDPR could see into the future, or was somehow privy to technology ahead of anyone else, that had to be the case). Not to mention the financial implications of not releasing it on those platforms, of not making it available to the roughly 160 million consoles that have been sold. Starting to see the problem?

 

Some PC gamers are certainly going to say, "But why should we have to compromise for seven-year-old hardware?" Fair question, and the response is (and I admit freely that this is opinion): you are part of the problem, the idea that graphical fidelity is what determines technological development. You have graphics card manufacturers releasing new and updated hardware, everything can look 4K, 8K, 12K, and whatnot: higher resolution, better graphics. As if that were some sort of mark of quality in a game in itself? Some people are willing to spend hundreds or thousands of dollars over the years to keep their desktop machines at peak performance. Some are not; some are happy buying a console and not having to spend more money on hardware. And a game like Cyberpunk 2077 wants to please them both. Obviously, it's gonna have a hard time doing so.

 

Is a game automatically fun just because it is pretty? No, it isn't. Is a game good just because it's open-world? No, not necessarily. A game is like a puzzle: all the pieces have to connect with each other to form a satisfying whole. A game isn't good just because it has a lot of content. Maybe Cyberpunk 2077 shouldn't have been an open-world game. There is clearly a fairly straightforward narrative that wants to be told, with a protagonist that is already defined (character customization that ultimately serves no purpose doesn't count), in a beautiful world that could have been more streamlined had it not needed to be so big. Maybe, but then it wouldn't have been able to boast 150 hours of content, and then gamers would complain about some made-up dollars-per-hour ratio of what constitutes a good game. But I digress…

 

What's the solution then? There isn't an easy one, if there is one at all. Maybe games shouldn't be cross-platform? Most games that are console-exclusive do tend to run really well on that particular hardware, and games made only for PC are more adept at utilizing the platform. But then again, that'll likely never happen. People want options, companies want revenue. Should consoles come with upgradeable hardware? Oh god no. If that were to happen, they would have doomed themselves, proven that there is no longer a point in buying a console. And there would be strictly fewer people playing games, because a lot of people can't, or don't want to, spend money on a good gaming rig. Should people stop focusing so much on better graphics and the number of hours of content? Yes, probably. People have been complaining for years about the game industry's business strategy of loot boxes, paid DLC, additional content, and whatnot. And that's fair, but gamers also have to realize they are part of the problem, because they are constantly setting a bar that just keeps getting higher and higher, with the added development cost that comes with it, and that is a fact.

 

Now, as I said, I grew up in the '90s with the NES and SNES, and of course that affects my opinion on games, on what constitutes good graphics, and on the value of games. I'm also still an avid retro-gamer, as well as a fan of indie games. That, too, affects my sensibilities. Still, I would like to argue that the quality of the graphics doesn't determine whether a game is good or not. Of course it's a focus, and of course they should be good; I'm not saying game development shouldn't evolve. I'm saying it should evolve equally in all directions. If the aforementioned puzzle is just a pretty box, it'll feel incomplete. And honestly, I just don't want to wake up in a world where I have to spend thousands of dollars every few years just to be able to play new games. Games should be fun to play, NOT just have the best graphics ever. To quote a certain anarchist rocker: "Come on. Really think they give a rat's dick how you look?"
