Collaborative Game Editing

Individual game developers take responsibility for different parts of game development, sometimes leading to content mixups and bottlenecks where their work overlaps. In this in-depth article, originally published in Game Developer magazine, Mick West discusses how collaborative editing may be the future.

Mick West, Blogger

May 20, 2009


Game development teams are getting quite large. The largest teams can have more than a hundred developers working on a game at any one point in the process. Yet many developers still use work practices that revolve around individuals taking the entire responsibility for large chunks of the game.

For many things, such as the creation of the 3D models and textures for individual game objects, this is still quite reasonable. However, for the creation of large game levels and continuous game worlds, this can create problems and it is worth investigating methods of content creation that are more collaborative in nature.

This article takes a look at some of the issues involved, and discusses a few technical problems from the tool programmer's perspective.

Code Conflicts and Bottlenecks

When multiple people are working on overlapping aspects of a game, there are two problems that arise: edit conflicts and edit bottlenecks. These problems are two sides of the same coin. Fixing edit conflicts can create edit bottlenecks, and vice versa.

An edit conflict happens when two people edit the same thing at the same time and their changes create two conflicting versions. The nature of this problem depends on how fine-grained the "things" are in the game being developed.

Programmers have always had this problem when two of them work on the same area of code at the same time. If both programmers make changes to the same file, an edit conflict occurs, and it needs to be resolved.

Generally, programmers will not be working on the exact same line of code, but rather will work on separate tasks, which use code that overlaps in various files. When a programmer wants to make a change to a file, he or she will check it out from the version control system (VCS), edit it until the changes are working, and then check it back into the VCS.

The problem of code edit conflicts can be handled by allowing only one programmer to check out a particular file at one time (this is typically a setting in the VCS software). Since only one programmer can edit the code at a given time, there can never be an edit conflict. However, we still get bounced over to the other side of the problem -- the edit bottleneck.
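To make the trade-off concrete, here is a minimal Python sketch of the exclusive-checkout model (the class and method names are hypothetical, not any real VCS's API): conflicts become impossible, but the second developer is simply blocked.

```python
# Hypothetical sketch of exclusive checkout: the VCS grants the edit
# lock on a file to one developer at a time, so conflicts can never
# happen -- but a second developer is blocked (the edit bottleneck).

class ExclusiveLockVCS:
    def __init__(self):
        self.locks = {}  # file path -> developer currently holding the lock

    def checkout(self, path, developer):
        holder = self.locks.get(path)
        if holder is not None and holder != developer:
            raise RuntimeError(f"{path} is locked by {holder}")  # bottleneck
        self.locks[path] = developer

    def checkin(self, path, developer):
        if self.locks.get(path) == developer:
            del self.locks[path]

vcs = ExclusiveLockVCS()
vcs.checkout("player.cpp", "alice")
try:
    vcs.checkout("player.cpp", "bob")   # blocked until alice checks in
except RuntimeError as err:
    print(err)
vcs.checkin("player.cpp", "alice")
vcs.checkout("player.cpp", "bob")       # now bob gets the lock
```

Bob can do nothing with player.cpp until Alice finishes, which is exactly the bottleneck the article describes.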

An edit bottleneck occurs when two people want to edit the same thing at the same time, yet person B cannot edit it because person A is editing it. With code, this is particularly problematic when large sections of functionality are incorporated in key files (typically with rather overarching names, like "player.cpp" or "globals.cpp"), which affect many areas of the game and need many changes.

The common solution is to allow for multiple checkouts. Thus, more than one person is allowed to check out the same file at the same time. They make their changes, and then check the file back in. If the changes conflict, then they have to be merged. This step can usually be done automatically, but occasionally it requires a little manual intervention. Sometimes programmers end up editing the exact same piece of code and have to actually talk to each other to figure out a solution.
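The automatic merge step can be illustrated with a deliberately naive three-way merge sketch (assuming, purely for simplicity, that lines were only changed, never inserted or deleted): edits on separate lines combine cleanly, while edits to the same line are flagged as a conflict the programmers must resolve by hand.

```python
# Naive three-way merge sketch: compare each line of "ours" and
# "theirs" against the common base version. Changes on separate lines
# merge automatically; changes to the same line are a conflict.
# Assumes equal-length files (no insertions/deletions) for brevity.

def merge(base, ours, theirs):
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == t or t == b:
            merged.append(o)      # identical change, or only we changed it
        elif o == b:
            merged.append(t)      # only they changed this line
        else:
            merged.append(o)      # both changed it differently:
            conflicts.append(i)   # manual intervention required
    return merged, conflicts

base   = ["int hp = 100;", "int mp = 50;"]
ours   = ["int hp = 120;", "int mp = 50;"]   # we changed line 0
theirs = ["int hp = 100;", "int mp = 75;"]   # they changed line 1
merged, conflicts = merge(base, ours, theirs)
print(merged)      # ['int hp = 120;', 'int mp = 75;']
print(conflicts)   # [] -- both changes taken, no conflict
```

Real merge tools handle insertions and deletions too, but the principle is the same: well-separated changes merge themselves.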

Some programmers prefer to structure things in such a way that multiple checkouts are never required. Theoretically, we can achieve this solution by making individual code files as small as is practical, splitting up files that contain multiple functionality, and establishing programmer procedures for rapid iterations, minimizing the length of time files can be checked out.

In practice a combination of efficient code division and multiple checkouts is commonly used. While edit conflicts can occasionally create problems, these can generally be mitigated by keeping a reasonable amount of functional separation in your code organization, and ensuring programmers check in (or merge) their code reasonably often.

The problem of edit bottlenecks can stop programmers in their tracks, or at least seriously cramp their programming options, and the occasional problem caused by an edit conflict is well worth the extra flexibility that simultaneous editing can provide.

From Code to Data

Traditionally, level editing is done using some kind of standalone tool that is not part of the game engine. The level designer loads the level, makes changes to it, and exports it to see how it plays in the game.

This level-editing tool might be a commercial product, such as 3ds Max, or it might be an engine-specific tool that comes with a third-party engine, or it might even be a custom in-house tool.

Level editors can be integrated with the game engine, allowing the designer to view and play the level as it's being edited. This is ideal from the point of view of rapid feedback, but the problems of edit conflicts and bottlenecks still remain. If two level designers want to edit the same part of the level, they usually will have to check out the entire level from the VCS.

With code, multiple checkouts of the same file are less problematic because it's relatively easy to merge text files automatically, as the changes are usually on well separated lines. Conflicts that cannot be resolved automatically are usually taken care of very easily manually, as the nature of the conflicting changes is readily apparent to the programmer who makes the merge.

Unfortunately, this solution does not work with level data, which is often stored in a binary format that's impossible to merge, especially if the level editing tool is something like 3ds Max.

Even if the level is stored in a text-based format, such as XML, it's much harder to merge: internal dependencies and automatically generated data spread each edit across a broader mesh of changes that are difficult for a human to read. The result is more conflicts, which are then much harder to resolve without breaking something.

Splitting Data

The difficulty in merging changes in level or world files would inevitably lead to bottlenecks if no steps were taken to mitigate it. Various solutions have arisen to handle the problem. The simplest is to break the level down into the smallest chunks possible, so that individual level designers can check out only the sections of the level they need.

Using small chunks improves matters by reducing edit bottlenecks, but adds complication in how the level is split up and subsequently pieced together. Implementing high-level changes to large sections of a level also becomes difficult.

Though bottlenecks are reduced, they're not eliminated, as there will still be merge work needed at the borders of the areas assigned to individual level designers. Edits here must still be manually coordinated. The large number of individual parts that make up the divided level now places an undue organizational burden on the level designer.

Sometimes more ad-hoc solutions are used. A level might still be stored in one large file, but when two people need to work on it at the same time, it is manually split into two sections. The work is done (with some discussion to avoid conflicts), and then the two sections are visually merged back together.

Sometimes, special merge tools are written just for this purpose. This technique is fraught with problems, as the merging process is rarely simple, and can take some time.

Another problem is that using the VCS is inherently a manual process. When a designer is working on a level, she'll check it out, perform her work, and eventually check it back in. During that time, no one else can actually work on it. But the designer who has the level checked out is unlikely to be working on it every minute she has it checked out.

There might be long periods of time when she's working on something related. If the designer has to keep track of all the pieces of the level she might want to edit, then she might be tempted to check out large chunks, just to make things easier. In other words, designers will start checking out sections they are not in fact editing.

If instead the check-in and check-out procedures are handled automatically, then in theory you could set up your tools so that as soon as a level designer starts to edit a particular piece of the level, that section is checked out from the VCS; and as soon as the designer stops editing (for example, when the file is saved or successfully viewed in the game), that piece is checked back in.

Various problems obviously arise from using this automated method. Most notably, you're going to end up with people checking in things that are broken, so perhaps it's better to still rely on a manual commit at some point.

Still, checking out sections automatically can work well if the level is fine-grained enough that conflicts are minimized. The level designer need not specifically be aware of the parts of the level that he has checked out -- simply that he's made a set of changes and needs to check them in. The tools will keep track of what has been changed, allowing him to simply check in everything with a single click.
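A rough sketch of how such a tool might work (FakeVCS and every name in it are hypothetical, standing in for whatever VCS the studio uses): the editor checks a section out the moment the designer first touches it, keeps a dirty set, and submits everything with one call.

```python
# Hypothetical editor-side session: first edit of a section triggers a
# transparent checkout; submit_all() is the "single click" check-in.

class FakeVCS:
    """Stand-in for a real VCS back end."""
    def __init__(self):
        self.checked_out = set()
        self.submitted = []

    def checkout(self, section_id):
        self.checked_out.add(section_id)

    def checkin(self, section_id):
        self.checked_out.discard(section_id)
        self.submitted.append(section_id)

class AutoCheckoutSession:
    def __init__(self, vcs):
        self.vcs = vcs
        self.dirty = set()

    def edit(self, section_id):
        if section_id not in self.dirty:    # first touch: check out silently
            self.vcs.checkout(section_id)
            self.dirty.add(section_id)

    def submit_all(self):                   # the single-click check-in
        for section_id in sorted(self.dirty):
            self.vcs.checkin(section_id)
        self.dirty.clear()

vcs = FakeVCS()
session = AutoCheckoutSession(vcs)
session.edit("terrain/5_3")       # checked out automatically
session.edit("props/bridge_01")
session.edit("terrain/5_3")       # already dirty; no second checkout
session.submit_all()
print(vcs.submitted)              # ['props/bridge_01', 'terrain/5_3']
```

The designer never thinks about files, only about the edits she has made and not yet submitted.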

The Ideal Process

In designing any toolset, it's useful to imagine the most desirable possible end result, regardless of how practical it seems. Such blue-sky designing can uncover possibilities that might not be considered with a more traditional incremental approach to adding features to an existing tool or process.

With our level-editing problem, we have a two-sided problem: edit conflicts and edit bottlenecks. Fixing one problem can mean making the other problem worse, so we end up simply balancing them as best as possible, accepting the inevitable manual resolutions and underutilized resources that come from this.

But in an ideal world, there would be neither conflicts nor bottlenecks. Anyone would be able to edit in any area of the game at any time; nobody would have to wait to start working, and all changes would be merged automatically. In addition, all changes would be visible immediately in the game as they're made, with no delay and no "exporting" of files.

Sure, that would be great, but can we get there? If we can't, then how far can we get in that direction?

Second Life

In some respects, the online virtual world Second Life is already there. Players in the game exist in a large continuous world, which they can edit in real time.

Multiple people can edit in the same area of the world at the same time, and their changes are visible instantly, both to themselves and to others.

They can edit 3D models and scripts simultaneously, editing, testing, and playing all at the same time in a truly collaborative editing environment.

That does sound somewhat like our ideal, but if you've ever edited something in Second Life, you'll know it's not quite perfect. Editing operates on objects that live in a remote database, and working over the internet introduces some annoying delays. The tools themselves are rather primitive, which is to be expected.

But the most glaring omission from a game developer's perspective is the lack of version control. When you edit something, that's it. You've edited it. No rolling back to the version from yesterday (or five minutes ago) if you accidentally delete half your script or mess up your UV coordinates.

Version Uncontrol?

The problem of not having version control in an editing environment such as Second Life could obviously be addressed by incorporating some kind of VCS directly into the project. But this again raises the issues we had earlier of potential bottlenecks: how to divide things into files, and the manual burden of checking things out and back in again.

Take a step back and ask the sacrilegious question: Do we need version control?

The immediate and emphatic answer is, "Of course," especially if you remember the dark days of early game development when all code was manually merged on Fridays, nobody had the right version of anything, and programmers lost days of work if they accidentally overwrote a file. Version control is an invaluable part of the game development process that is impossible to discard.

But consider what you use version control for, especially with game assets. You use it to prevent edit conflicts, to distribute the latest versions, to back up files, and to create branched versions of the game.

Now on a daily basis, the vast majority of what a VCS is used for is the first two: preventing edit conflicts and distributing the latest version. Developers check out or "open for edit" some files, do their work, and then check in or "submit" those files.

They then "get" or "sync" the latest version of the project and tools. The backup aspect is a necessary part of version control but is used relatively infrequently. The branching is used even more infrequently and is a high-level task performed by very few team members.

Consider now how this works in Second Life. We don't need to ever get the latest version, as we always have the latest version. We are in the latest version. We don't need to check things in or out, as we simply start editing them, and they get locked. Then we stop editing them, and they are unlocked. We don't actually need the conflict prevention and version distribution of a VCS.

But what about backups? You'll note that backups are what you really miss about the lack of a VCS in Second Life -- and remember, the backup functionality is one of the least used functions in VCSs.

We still obviously need it, but we might now want to perform a little paradigm split and separate the VCS into two separate function groups, one being the file ownership and distribution system and the other the file backup and history system.

Since they have been so closely linked in the past, it has been natural to treat them as being the same thing. When you check in a version of a file, it becomes the file that gets backed up.

Yet the real utility of a backup system would be the ability to undo your changes to an arbitrary point. If you've been editing a file for a while without checking it in, then you might have made a lot of changes that you might want to undo. Designers often save multiple numbered iterations of a file locally before checking in.

Programmers who keep a file checked out for a long time frequently get nervous and make local backups, or even check in with changes (hopefully) stubbed out, just to have the security of that VCS backup. Here the act of checking in a file is a useful label in the backup process, but it should not be the sole method of backing up a file.

Edit conflicts, bottlenecks, file distribution, and backups are, to some degree, historical problems, forced upon us by the constraints of a simple file-based editing system. Files are large, which means you get edit conflicts or bottlenecks, and that it's expensive to back up every single change.

But if we transition to a game editing process more like the one found in Second Life, we are no longer editing files; we are editing much smaller objects within the world, stored in a shared database that needs no distribution.

Here it would be quite possible to back up every single change and every single edit to each object, essentially giving the developers both infinite "undo" capability and the confidence to make bold changes knowing they can safely return. Branching is also quite possible within such a scheme.
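A minimal sketch of such an append-only object store (hypothetical throughout, not the API of Second Life or any real engine): every edit to an object appends a new revision to its history, so rolling back any number of steps is trivial.

```python
# Hypothetical shared object database that records every edit as an
# append-only history, giving per-object "infinite undo": the
# backup-and-history half of the split described above.

import time

class ObjectStore:
    def __init__(self):
        self.history = {}   # object id -> list of (timestamp, state)

    def edit(self, obj_id, new_state):
        self.history.setdefault(obj_id, []).append((time.time(), new_state))

    def latest(self, obj_id):
        return self.history[obj_id][-1][1]

    def rollback(self, obj_id, steps):
        """Return the state as it was `steps` edits ago."""
        return self.history[obj_id][-1 - steps][1]

store = ObjectStore()
store.edit("tree_42", {"pos": (0, 0), "scale": 1.0})
store.edit("tree_42", {"pos": (5, 2), "scale": 1.0})
store.edit("tree_42", {"pos": (5, 2), "scale": 3.0})   # oops, too big
print(store.rollback("tree_42", 1))   # {'pos': (5, 2), 'scale': 1.0}
```

Because each object's history is small and independent, storing every revision is cheap in a way it never was for monolithic level files, and branching becomes a matter of forking a history list.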

I think it is inevitable that some aspects of game development will move to using this collaborative model. It will require a rethinking of the way version control is handled, and even of the way version control is viewed: a perhaps violent division of the functionality of file locking, distribution, and backups, one that will allow each separate area of functionality to develop unfettered, with new and powerful tools and processes.

[EDITOR'S NOTE: This article was independently published by Gamasutra's editors, since it was deemed of value to the community. Its publishing has been made possible by Intel, as a platform and vendor-agnostic part of Intel's Visual Computing microsite.]


About the Author(s)

Mick West

Blogger

Mick West was a co-founder of Neversoft Entertainment. He's been in the game industry for 17 years and currently works as a technical consultant. Email him at [email protected].
