[In this reprinted #altdevblogaday-opinion piece, SCEA senior software engineer Drew Thaler analyzes RAM fill time for consoles, which he says will give you a ballpark idea of how fast your game should be able to load.]
Today I want to talk about the baseline load times for games on consoles. I'm going to dive right in with the equations here, so be prepared.
One of the useful metrics for understanding game load times is RAM fill time. It's more-or-less true that most console games tend to fill the available RAM entirely with very little left over. There are exceptions, of course, but it's not bad as a general rule.
Similarly, we can more-or-less assume that a game can't really start until RAM is basically full. There are plenty of exceptions here too which I won't get into, but again – not a bad first order approximation.
To The Math-Mobile!
So: For a given hardware platform, what is the minimum time possible to fill RAM? The answer:
t_fill = s_RAM / v_fill

t_fill = minimum time to fill RAM
s_RAM = size of RAM to fill
v_fill = data load speed
In plain English, the time it takes to fill a given amount of RAM is equal to the size of the RAM divided by the speed at which you can fill it.
You may have noticed that I subtly weaseled a bit there. Why did I use v_fill instead of, say, v_disk? Because of data compression! Many games keep their data compressed on disk and then decompress it into RAM. So the effective rate at which you can load data depends quite a lot on your compression ratio.
v_fill = v_disk / r_compression

v_disk = speed of the disk
r_compression = compression ratio = s_compressed / s_uncompressed
s_compressed = size of data after compression
s_uncompressed = size of data before compression
In plain English, the speed at which you can load data is the speed of the disk divided by the compression ratio of the data.
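For instance, here's a quick sketch of that arithmetic, using a hypothetical 9 MB/s optical drive and a 0.60 compression ratio (both figures come up again later in the article):

```python
# Effective fill rate: compressed bits come off the disk at v_disk,
# but expand by a factor of 1/r_compression as they land in RAM.
v_disk = 9.0          # MB/s off the drive (hypothetical optical disc)
r_compression = 0.60  # compressed size / uncompressed size

v_fill = v_disk / r_compression
print(v_fill)  # 15.0 MB of uncompressed data landing in RAM per second
```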
By the way, I'm also presuming here that your data-loading subsystem is optimized for fast loading: basically that you've chosen a data compression algorithm which is not CPU-bound, and you overlap reading with decompression.
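To illustrate that overlap, here's a minimal sketch. It uses Python's zlib as a stand-in for whatever codec a real engine would use, and in-memory chunks as a stand-in for the disc: an I/O thread streams compressed chunks ahead through a bounded queue while the main thread decompresses.

```python
import queue
import threading
import zlib

# Stand-in for "the disc": a list of independently compressed chunks.
chunks = [zlib.compress(bytes([i]) * 65536) for i in range(8)]

def reader(out_q):
    # I/O thread: streams compressed chunks off the "disc".
    for chunk in chunks:
        out_q.put(chunk)
    out_q.put(None)  # end-of-stream sentinel

q = queue.Queue(maxsize=2)  # small maxsize = bounded read-ahead
threading.Thread(target=reader, args=(q,), daemon=True).start()

# Main thread decompresses while the reader keeps fetching ahead.
total = 0
while (chunk := q.get()) is not None:
    total += len(zlib.decompress(chunk))

print(total)  # 8 * 65536 = 524288 bytes landed in "RAM"
```

The bounded queue is the important bit: reads never stall waiting for decompression, and decompression never outruns the reads by more than the queue depth.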
Putting Values To Some Of The Unknowns
There are only three real unknowns here: r_compression, v_disk, and s_RAM.
Let's start with compression. In general, I've found that game data of this generation tends to have a compression ratio of about 50% to 60%. There isn't a recent published corpus of data on this that I'm aware of, so you'll just have to take my word on it. We'll be conservative and look at two compression ratios: 60% and 100% (uncompressed).
r_compression = 0.60 or 1.0
The other two parameters vary depending on the console, and the drive on which the data resides, and often even the location on the disk.
s_RAM is fairly straightforward: it's known to be 448 MiB on the PS3 and 480 MiB on the Xbox 360. Thanks, Wikipedia!
v_disk requires a bit of research. For simplicity's sake we'll look at the original hardware for each unit, since those are typically the minimum-spec devices that set the standard. Both the first Xbox 360s and the first PS3s used Seagate LD25.1 hard disks (source), and public benchmarks are available for those which place the min/max throughput at about 20-40 MB/s. The PS3 uses a 2x CLV Blu-ray, which is defined to be a constant 72 Mbps = 9 MB/s. The Xbox 360 uses a 12x CAV DVD, which ranges from 5x to 12x DVD speed across the disc.
(Know your units! Since we're talking about bandwidth, in this article I'm using "MB" to mean 1 million bytes, and "MiB" to mean 1024*1024 = 1,048,576 bytes.)
Putting It All Together
Let's take all of that and build a table of minimum fill times:

Console   Source          v_disk          t_fill (r = 1.0)   t_fill (r = 0.60)
PS3       Blu-ray (2x)    9 MB/s          ~52 s              ~31 s
PS3       HDD             20-40 MB/s      ~12-23 s           ~7-14 s
Xbox 360  DVD (5x-12x)    6.9-16.6 MB/s   ~30-73 s           ~18-44 s
Xbox 360  HDD             20-40 MB/s      ~13-25 s           ~8-15 s

Or, here's the same data graphically. Shorter bars are better:
The above numbers are for perfect loads, i.e. a linear load of completely sequential data. In practice, game loads are rarely that perfect… so this is really only a first-order approximation of load times.
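As a sanity check, these minimum fill times can be recomputed from the figures above. This sketch just plugs the RAM sizes and drive speeds quoted in this article into t_fill = s_RAM * r_compression / v_disk (the DVD speeds assume 1x DVD = 1.385 MB/s):

```python
# Minimum RAM fill time: t_fill = s_RAM * r_compression / v_disk.
# RAM sizes are in MiB (1 MiB = 1,048,576 bytes); speeds in MB/s (1 MB = 1e6 bytes).

MIB = 1.048576  # MB per MiB

def fill_time(ram_mib, disk_mb_s, ratio):
    """Seconds to fill RAM at a given disk speed and compression ratio."""
    return ram_mib * MIB * ratio / disk_mb_s

cases = [
    # (console, source, RAM in MiB, (min, max) disk speed in MB/s)
    ("PS3",      "Blu-ray 2x", 448, (9.0, 9.0)),     # CLV: constant speed
    ("PS3",      "HDD",        448, (20.0, 40.0)),
    ("Xbox 360", "DVD 12x",    480, (6.93, 16.62)),  # CAV: 5x-12x across the disc
    ("Xbox 360", "HDD",        480, (20.0, 40.0)),
]

for console, source, ram, (lo, hi) in cases:
    for ratio in (1.0, 0.60):
        worst, best = fill_time(ram, lo, ratio), fill_time(ram, hi, ratio)
        print(f"{console:8}  {source:10}  r={ratio:4}  {best:5.1f} - {worst:5.1f} s")
```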
To get a more accurate approximation, you'd next want to estimate the average seek cost, and the average number of seeks per load. However, since that starts to creep out of the realm of publicly available data, I'm afraid I'll have to leave that as an exercise for the reader. :-)
What Good Is All This, Anyway?
So, yeah, that's a whole bunch of numbers and a bunch of data. What was the point?
Well, this type of analysis can be very useful whenever you're looking at a console for the first time. Maybe you're bringing a game over from PC, or porting a game from X360 to PS3, or maybe you're working on a launch title for a next-gen console of some kind. These kinds of analyses will give you a ballpark idea of how fast your game should be able to load.
It's also useful for understanding cross-platform console games: you need to know the strengths and weaknesses of each console, and what you can do to maximize your throughput on each one.
For example, you can see quite clearly in the data that naive loading of uncompressed data directly from optical disc is really quite hurtful on current consoles. On all consoles you should really be using compression. On Xbox 360 (where you can't rely on the existence of a HDD) you'll enjoy a big boost from organizing your DVD layout. On PS3, you should both compress your data, and prefetch it from optical disc to the HDD whenever possible.
Food For Thought
For the past ten years or so, RAM costs have dropped much faster than HDD/BD/DVD speeds have increased. As a result, the time to fill RAM on PCs has been creeping upward. However, now we've got solid state (SSD) drives which are finally bucking the trend.
But the big problem with SSD for game consoles so far is cost: anything but the most basic SSD upgrade probably costs more than your entire game console.
What do you think will happen in the next console generation?
[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]