Those who do not know their history...

The games industry is now roughly 30 years old: how much of that experience is being fed back into modern practices?

Jamie Mann, Blogger

October 15, 2010

The recent release of Sonic 4 threw up some interesting score variations. The majority of players seem to love it: with 2,000 votes racked up, it has a rating of 4.5/5 on the Xbox 360. Reviewers seem to like it too: on Metacritic, it's managed an average score of 80%, placing it firmly in the "good, but not great" category. However, the average "gamer" review score is just 60%, with the physics engine and level design coming under heavy criticism.

(Interestingly, the scores vary significantly between the three platforms. But that's something to ponder for another day...)

So, why are we getting these distinctly banded responses? I think the answer's fairly simple: Sonic 4 effectively comes 15 years after Sonic 3. As a result, anyone under twenty is unlikely to have ever played any of the original Mega Drive Sonic games. They may have played one of the modern 3D sequels or a 2D port, but the former offer significantly different gameplay, and the latter have tended to appear on handheld consoles, with compromises in screen size and resolution.

In other words, the majority of mainstream gamers don't have anything to directly compare Sonic 4 to.  Still, shouldn't the reviewers be better equipped?  Or to put it another way: is the games industry aware of its heritage?

Sadly, I suspect not.

To keep things simple, let's say the industry got properly underway when the Apple II and Atari 2600 were launched in 1977, thirty-three years ago. On the consumer front, that means there have been two, maybe three generations of gamers: a teenager who received Pong as a Christmas present back in 1978 is probably buying DS or PS3 games for their grandchildren now.

However, on the industrial front, there have probably been six or seven generations: with a high risk of title cancellations, relatively low pay and a heavy emphasis on crunch time, game development has traditionally been an industry for younger people willing to take a chance and burn the midnight oil. I'd be surprised if more than 10% of the developers from the 1980s are still involved in the games industry. And I'd be even more surprised if more than 1% of game reviewers from the 1980s are still involved in the business.

(As an aside: how much opportunity is there for career development in the games industry, or as a games journalist? I get the impression that there isn't much, but I'd love to find out whether that's actually true!)

Simply put, there aren't many people who can pass their knowledge on - and formal educational courses tend to focus on the production of "modern" games, with minimal reference to the design and implementation of past games. Worse, as far as I can see, there are no videogame journalism courses: each journalist stands or falls on their own personal experience.

Thankfully, it's actually gotten easier to re-experience the past: where previously you had to dig out a PC emulator and some dodgy ROMs, compilations such as Capcom's Classics and services like the Wii's Virtual Console and Xbox Live Arcade have made many of the classic games available at a (mostly) reasonable price. There's also no shortage of "greatest game" lists for people to peruse and review.

Still, at the moment it's all down to whether or not an individual chooses to look into the history of games. And until that changes, we're going to continue to get games which have forgotten their heritage, together with wide variations in how they're received...
