
Why Should Someone Get a Degree in Video Games, Anyway?

A degree in video games can be an excellent first step toward joining the industry, but this new field of academia has an identity crisis that needs to be resolved. Here, I consider some of the different purposes of a game design education.

Joshua Kasten, Blogger

January 9, 2013


The School of Hard Knocks

Imagine a friend told you he was planning to enroll in a college curriculum designed around making video games. What do you imagine that program being like? If you work in the industry, you probably have some ideas about the kind of skills a student needs in order to succeed. You most likely gained those skills through practice, experimentation, and some painful-yet-valuable life lessons out in the field. A college that has decided to start a video game curriculum has the unenviable task of taking those experiences from the warzone of professional development and bringing them to boot camp. Unfortunately, the translation from work to curriculum is not always direct, and some critical things can be left behind, leaving the students to pick up the slack on their own.

It was only last December that I graduated from The Guildhall at SMU, one of the few graduate programs focused on video game development. To their credit, they were straightforward about what the program offered; it’s even in the name. A guildhall is a place where apprentices learn a trade from the previous generation of master artisans, and as such, my education there was practical, focused on learning the technical skills and programs needed to find employment within the industry.


The ultimate goal of any young student.

Two Schools of Thought

The focus on practical skills served me well during my time there, but as I approached graduation, several of my peers suggested that they both expected and desired a more academic approach from the school. Their complaints and suggestions made me wonder what the actual purpose of a video game curriculum is supposed to be. Is it a simplification of the difficult road experienced by most designers just starting out, or does it provide its own unique and irreplaceable service?

There are two fundamentally different approaches to game industry education, though neither one would entirely function without at least a little of the other. I want to use this post primarily to look at the pros and cons of each, and then address why neither of them solves the real problem of taking college skills into the professional environment.

A Practical Education

For level designers at The Guildhall, the core curriculum was straightforward: we learned a different game and engine in every class. For two months we would be dedicated to making levels in Skyrim; later we moved on to Unreal, making levels for Gears of War, and eventually to Half-Life 2. The core level design classes all had the same essential structure. We spent half the class learning the technical aspects of the engine and the design aspects of the game. The second half was dedicated to a personal project, making a level for that game, with a different milestone each week. During development of the personal project, students learned advanced skills as needed, and learned good design through playtesting and practice.

This primarily teaches students to teach themselves and be self-directed, while giving everyone the chance to produce 3-4 pieces of portfolio-quality work. It also ensures that everyone reaches the same fundamental skill level with the same tools. However, a common complaint is that a large portion of the information taught by teachers comes straight from online resources. Most of what I learned about Unreal, I learned from my peers, through practice, and by researching on UDN. I learned well in this environment, but the argument can be made that if most of the information the students use is freely available, why should they invest in a specialized college education, particularly when it comes at great personal cost?

"The Design of Everyday Things" The de-facto guide to what good design means, was a recommended text, but not required.

Teaching the Design

Early in the curriculum, our teachers taught us the basics of design theory. We learned that a thing called “Flow” existed, giving us a name for something we had already intuited from our previous experience with games. We learned that there were mistakes to be avoided, along with some general advice for making good levels rather than bad ones. That was the extent of the theory education: we received commonly shared archetypes of a concept that we could then proceed to fill with knowledge gained through experience. If, like some students, you thought the purpose of a video game master’s program was to learn high-level design theory, you were probably disappointed. There is a good reason it’s like this, however. The truth is that there has yet to be a unified idea of good design for teachers to quantify, teach, and make students practice.

The appeal of a curriculum geared towards theory is, perhaps, that it has a bit more gravitas. But high-level design theory is something that can really only be taught by someone who has already had a lifetime of design experience. In practically minded classes, that wealth of experience is generally limited to teaching best practices. And because we lack a unified theory of design to teach, teachers’ approaches may conflict with one another in ways that disrupt learning. For most of my education, I had three different level design teachers who were as different as could be in their approaches, teaching techniques, and grading practices. If all three of them were in a room discussing design theory, they would probably disagree on as much as they agreed on. Perhaps the goal of a game curriculum is simply to simulate the conditions that created the previous generation of designers: learning by doing.

The Third Estate

The truth is that neither an education completely devoid of practical knowledge nor one devoid of theory would succeed. The question is one of percentages: should it be a 50/50 split, or is one more critical than the other? The other, unexplored issue here is that the most important skill in being a good game developer has nothing to do with either of these two things. I may have the skills needed to create small projects on my own, but without the experience of working in a group and cooperating to create a major project, I could not honestly say I would feel qualified to be hired by any company. This, more than any other issue, is the problem facing video game college programs. The academic system rewards students for the work they do on their own, but no man is an island in the games industry. The work we do only succeeds when everyone communicates and works together.

I’ve only explored this issue from a level design perspective, but I know artists and programmers can have wildly different experiences, and I’d like to hear from them, as well as from people who have gone through similar programs. If you have gone to a video game school and had a different experience, post about it in the comments!
