
"On a AAA project of this scope, most of the smaller cut scenes are never seen by the animators. Instead, you create a system for designers to use and trust the designers to script everything properly."

Game Developer, Staff

March 24, 2017


This roundtable is reprinted with permission from AnimState.

With so much attention being paid in the last week to the animation issues seen in the previews leading up to BioWare's next big release, Mass Effect: Andromeda, we thought it would be interesting to get a few experienced animators together to discuss the challenges animators face when dealing with these types of projects. A special thanks also to Daniel Floyd for moderating this roundtable for us. So, let's get to the introductions!

Dan Floyd: I guess I'll start things off, since I'm mostly here to moderate and ask the smarter people questions. I'm Dan Floyd, I co-run a YouTube channel called Extra Credits and make a show on that channel about game animation called Extra Frames. I'm also an animator with film and game industry experience, but not nearly as much as my counterparts here.

Simon Unger: Hey! I'm Simon Unger! I'm an animator at Phoenix Labs working on a new game called Dauntless. Prior to that, I was a lead animator at Electronic Arts, Square Enix, and Robotoki. I've also taught at schools such as Gnomon, iAnimate, and Vancouver Film School.

Gwen Frey: I'm Gwen Frey. I've been in the industry for a decade now. Most recently I formed a small indie studio in Boston and shipped The Flame in The Flood as the sole animator and FX artist! Prior to that I worked in technical animation on Bioshock Infinite and its various DLCs, and before that I worked on MMOs such as Marvel Heroes Online. I speak publicly about game development, host a podcast (The Dialog Box), and run a YouTube channel (GwenFreyTheTA).

Tim Borrelli: Hello! I'm Tim Borrelli. I've been doing some form of something with animation in games since 1998, working on more failed projects than not. I've been at Volition (from Freespace 2 to the beginnings of Saints Row 3), 5TH Cell (where I learned more about being a good leader than anything), and am now Animation & VFX Director at First Strike Games working on the secret sauce.

Dan: So I guess let's start broad: what kinds of challenges are inherent to working in this medium? If you are an animator working in video games, what sort of limitations are you going to run into most often?

Simon: There are so many little technical things we run into on a day-to-day basis, and often they're specific to the project, SKU, or engine. I think the main constant throughout my whole career has been the high-level goal of maintaining "believability" throughout a character's performance. We have to deal with so many variables and external inputs (read: a user wiggling the control stick back and forth or skipping a cut scene) that it makes it an interesting challenge to handle these in a way that doesn't pull the viewer out of the experience. Sometimes the slightest pop or blend will take you out of the moment.

"Animation in games doesn't end when you are done animating. In many ways, that's barely halfway there."

Tim: The number one challenge is remembering that animation in games doesn't end when you are done animating. In many ways, that's barely halfway there. Once you've exported your animations, there are any number of systems that will touch them and affect how they are played back, how they look, and how they feel: blend times as animations move from one to the next, IK systems that change where your feet are planted and what your root offset is from the ground, additive systems that layer a pose onto an animation or group of animations and change their appearance, and even procedural systems that need your animations to be perfectly tuned to how the system works or they can look just plain wrong.
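To make Tim's point concrete, here is a minimal, hypothetical sketch (not any specific engine's API; the names and single-channel "pose" are invented for illustration) of what can happen to a clip after export: the runtime cross-blends between clips and layers an additive pose on top, so the pose a player actually sees is never just the raw exported data.

```cpp
// Sketch only: a runtime blending exported clips and stacking an additive layer.
#include <cstdio>

struct Pose {
    float hipHeight;   // one channel stands in for a full skeleton here
    float spineBend;
};

// Linear cross-blend between the outgoing and incoming clips (t in [0, 1]).
Pose CrossBlend(const Pose& from, const Pose& to, float t) {
    t = t < 0.f ? 0.f : (t > 1.f ? 1.f : t);
    return { from.hipHeight + (to.hipHeight - from.hipHeight) * t,
             from.spineBend + (to.spineBend - from.spineBend) * t };
}

// Additive layer: an offset pose (e.g. "injured lean") applied on top of the
// base blend, scaled by a weight.
Pose ApplyAdditive(const Pose& base, const Pose& offset, float weight) {
    return { base.hipHeight + offset.hipHeight * weight,
             base.spineBend + offset.spineBend * weight };
}

int main() {
    Pose idle    = { 0.95f, 0.00f };
    Pose run     = { 0.90f, 0.10f };
    Pose injured = { -0.05f, 0.25f };  // additive offset authored against a reference pose

    for (float t = 0.f; t <= 1.001f; t += 0.25f) {
        Pose blended = CrossBlend(idle, run, t);
        Pose result  = ApplyAdditive(blended, injured, 0.5f);
        std::printf("t=%.2f hip=%.3f spine=%.3f\n", t, result.hipHeight, result.spineBend);
    }
}
```

Every one of those downstream stages is a place where a clip that looked fine in Maya can come out looking wrong in game.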

Dan: So we've seen some footage coming from the early release of Mass Effect Andromeda, and there appear to be a fair number of animation bugs, especially with the navigation and conversation systems. Before we get into particulars, would one of you like to explain how conversation systems work? What makes animating a Mass Effect or Witcher dialog system different from animating story scenes in a game like Uncharted or Kingdom Hearts or what have you.

Tim: I've had experience in much smaller-scope systems, and even those were a nightmare of organization and best guesses. Instead of creating custom animations per line of dialogue and acting choice of the VO talent, you are likely creating phoneme shapes (Aa, Eee, Oh, Mmm, etc.), emotion shapes (angry, sad, happy, etc.) for upper and lower face sections, and body language animations (happy, frustrated, sad, excited, etc.) at various intensity levels (VERY angry, kind of angry, about-to-boil-over angry, etc.). I'll keep it to this for now to simplify explaining the process, but there is usually a very small animation team working on this: in my experience, one person.

Layered on top of that data are procedural systems, either in-game or in the editor. These systems control eye and head look-at targets (where the head and eyes track something of interest), potentially auto-generate the lipsync for the dialog, and more, but we'll leave it there for now.

Depending on the project and team size, this data is then used in a number of ways:

  1. Stitched together by the team responsible for setting up all conversations in the game. For example, a line could be "And then he stole the pig! HAHAHA!" as a punchline to a joke. The team would find the body language animations that best fit the line and the facial emotion that matches. The lipsync would already be generated with a tool that reads in the line and spits out the animation.

  2. Procedurally stitched together by a system that reads in "tags" on the data. These "tags" are set by an animator or designer, and are defined in two places: the body language and emotion shapes that denote what they are (for example, "angry") and the VO line for what emotion the VO talent is currently displaying. The system would then take a random variant of "angry" and play it at the appropriate time (a minimal version of this tag lookup is sketched just after this list). The lipsync would already be generated with a tool that reads in the line and spits out the animation.

  3. Finally, the data could be used as a base for animators to tweak and polish in their 3D app (like Maya). They would pick and choose what motions to use, stitch them together, and then hand-tweak them to look as good as possible.
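As a rough illustration of option 2, here is a hypothetical sketch of a tag-driven stitcher. The class, clip names, and fallback behavior are invented for this example and don't reflect any shipped system; the point is simply that playback quality depends on tags being set correctly and on a variant existing for every tag a line can carry.

```cpp
// Sketch of tag-based selection of a body-language clip for a tagged VO line.
#include <cstdio>
#include <map>
#include <random>
#include <string>
#include <vector>

struct TaggedLine {
    std::string audioFile;
    std::string emotionTag;   // set by a designer or pulled from VO metadata
};

class ConversationStitcher {
public:
    void RegisterClip(const std::string& tag, const std::string& clipName) {
        clipsByTag_[tag].push_back(clipName);
    }
    // Returns a random body-language clip matching the line's tag, falling back
    // to a neutral clip if nothing was authored for that emotion.
    std::string PickBodyLanguage(const TaggedLine& line) {
        auto it = clipsByTag_.find(line.emotionTag);
        if (it == clipsByTag_.end() || it->second.empty())
            return "bodylang_neutral_01";
        std::uniform_int_distribution<size_t> pick(0, it->second.size() - 1);
        return it->second[pick(rng_)];
    }
private:
    std::map<std::string, std::vector<std::string>> clipsByTag_;
    std::mt19937 rng_{1234};
};

int main() {
    ConversationStitcher stitcher;
    stitcher.RegisterClip("angry", "bodylang_angry_01");
    stitcher.RegisterClip("angry", "bodylang_angry_02");

    TaggedLine line{"vo_pig_joke.wav", "angry"};
    std::printf("Play %s with %s\n", line.audioFile.c_str(),
                stitcher.PickBodyLanguage(line).c_str());
    // Lipsync would be generated separately from the audio, as described above.
}
```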

During all of this, various points of interest can be defined for a character to look at. They should be contained to the conversation at hand, but sometimes checkboxes aren't flipped and suddenly a high-priority NPC pathfinds right past you and there go CRAZY EYES.

All of these solutions come at a cost, which I am sure we will outline below. But given the complexity of even the most simple of versions that I laid out above, it's easy to see how daunting of a task it is to create a believable conversation system.

To compare it to Uncharted: those scenes are all custom-made by a giant team of animators, cutscene artists, camera operators, etc. They are done alongside the VO talent, reshot to look as good as possible, and are always done with the emotional arc of the game in mind.

Dan: On any big project like this, you are inevitably going to have to face the realities of the Production Schedule. Time and money are limited resources, and the team must plan how to spend them to get the best results, which involves making a lot of predictions. For an animation team, how do those choices get made? What goes into planning the animation schedule of a AAA video game?

"Production planning is such an intricate dance at larger studios."

Gwen: Hahaha, production planning is such an intricate dance at larger studios. After pre-production is complete, a producer will ask the animation team for a bunch of estimates: "How many hours of animation work is required to animate locomotion for a new bipedal NPC? How many hours of an animator's time is needed for a new 2-handed weapon for both the player hands and all NPCs?" and so forth. The animators answer these questions with whatever estimates they can, and then the production team gets to work. The production team collects this data from every discipline, and then presents the directors with data based on these estimates: "We can either add 1 new gun to the game, or add 3 more NPCs" and so forth.

At this point, scheduling is no longer within your control. At a large studio there is way too much to do, and many tasks rely on someone else finishing their piece before you can get started. When everyone is a specialist who does a very specific task, a lot of cross-discipline dependencies crop up. So a team of producers tells you what the most important task is and how long you have to complete it. They do this for a living, and you have to do what they say (because doing what they say is what you do for a living).

Obviously, for the good of the game, priorities change throughout production. Major player powers and scenes are scrapped, dialog changes very late, etc. Sometimes major deadlines completely change which moves around everyone's tasks wildly. It's a fun ride :)

Simon: Yeah, Gwen explains it well. If a team has done pre-pro well enough (by that I mean, investigated or prototyped any unknowns, proved out the core gameplay loops, etc.), the estimates will be closer to the production reality. When an animation team is given a vague design or feature to scope out, that's where they can run into problems. Experience helps, but many times a game has something in it that nobody has ever done before, and it's really difficult to estimate to the hour (or even the day) how long that will take to implement. You have to put your features and tasks in priority order, try to align that with the other disciplines' schedules, and start with the most important thing. Anything that falls off the edge of the deadline either gets cut or put in an update or DLC.

Dan: One thing I also wanted to touch on is that, when you see animation in a game (good or bad), that's not the work of just the animation team. I'm sure you can all speak to this from experience, but game animation depends on many disciplines working in coordination, from programming to design. This is a true team effort. What are some of the ways that other disciplines within the team impact your work?

"On a AAA project of this scope, most of the smaller cut scenes are never seen by the animators. Instead you create a system for designers to use and trust the designers to script everything properly."

Gwen: I thought it was funny that people are piling on animators for these glitches, when in reality the animators are probably blameless. On a AAA project of this scope, most of the smaller cut scenes are never seen by the animators. Instead you create a system for designers to use and trust the designers to script everything properly. For instance, you will create a "mood" system so that a character can be happy, sad, or angry. Then you trust the designers to set the mood of the NPC during that scene. You give the designers a stable of placeable background NPC characters that have basic idle behaviors, and you trust the designers to place those characters in logical places. As an animator you spend most of your time either on hero scenes (scenes that are entirely animated by hand) or working with animation programmers to make sure that the gameplay systems are airtight. You make sure that when a character is running up a slope they lean forward, and you work on the blending between different locomotion animations, that sort of thing. All the glitches I've seen online look like technical bugs and/or implementation bugs. I'm not saying the animators are entirely blameless: everyone is responsible for making the game AAA quality and everyone is responsible for beating the drum when things are not shaping up in game. However, as a technical animator, I can't help but look at these bugs and think they look like IK problems and scripting errors.
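As an illustration of the kind of designer-facing hook Gwen is describing, here is a minimal, hypothetical sketch of a "mood" system. All names are invented for this example: the animation team ships the system, a scene script sets the mood, and a missed call is a scripting bug rather than bad animation data.

```cpp
// Sketch: a designer-facing mood hook driving which gesture set an NPC uses.
#include <cstdio>

enum class Mood { Neutral, Happy, Sad, Angry };

struct NpcAnimState {
    Mood mood = Mood::Neutral;
    // In a real system this would drive facial pose layers, gesture sets, posture, etc.
};

const char* GestureSetFor(Mood m) {
    switch (m) {
        case Mood::Happy: return "gestures_happy";
        case Mood::Sad:   return "gestures_sad";
        case Mood::Angry: return "gestures_angry";
        default:          return "gestures_neutral";
    }
}

// What a scene script might call. If the scripter forgets this call, the NPC
// plays the whole scene in the default mood, and the scene reads as "off".
void ScriptSetMood(NpcAnimState& npc, Mood m) { npc.mood = m; }

int main() {
    NpcAnimState merchant;
    ScriptSetMood(merchant, Mood::Angry);   // set by the scene script, not an animator
    std::printf("Merchant uses %s\n", GestureSetFor(merchant.mood));
}
```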

Tim: To add to what Gwen said, there's also the aspect of character creation & customization that comes into play. The character and animation teams have to take into account the various facial features and proportions that they make available to the player during the character creation process, and any of these, when pushed to their boundaries, can result in less-than-ideal animation being played back on the faces. Just google "Ugliest character in Saints Row 2" to see what can happen. Often, larger games will employ this same character creation system to fill the world with NPCs, and while we try our best to police the quality of characters that are created with those systems, sometimes things either fall through the cracks or, in an effort to save time, are randomly generated.

Dan: I don't think it's possible for any of us to guess EXACTLY what is going wrong with the animation bugs in Mass Effect Andromeda (not without knowing BioWare's pipeline or the complexities of their systems), but when you look at some of the footage of the bugs, what are some of your best guesses as to the cause? What variety of things could go wrong in a conversation system that would result in animation bugs like these?

Simon: Before I speculate on what the cause of these animation issues might be, I think it's important for people to understand some of the numbers behind a game like this. I don't have exact figures for ME:A, but we do know that Mass Effect 3 had over 40,000 lines of dialogue and Dragon Age had about 60,000. If we split the difference at 50,000 and conservatively estimate that each line averages out to about three seconds, that puts us at around 41 and a half hours of dialogue. That's about 21 feature films' worth of just talking. Most major animated feature films have a team of 70+ animators working for two or more years to complete just one movie. A game like Mass Effect might have somewhere between five and ten animators focused on more than 20X the content in the same amount of time. On top of that, we also need to factor in localizing (translating) the game into at least 4-5 additional languages.
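For anyone who wants to check that back-of-the-envelope math, here it is spelled out (assuming roughly two hours per feature film, which is an assumption for the comparison rather than a figure from Simon):

```cpp
// Quick check of the dialogue-volume estimate above.
#include <cstdio>

int main() {
    const double lines = 50000.0;         // rough midpoint of ME3 and Dragon Age
    const double secondsPerLine = 3.0;    // conservative average per line
    const double hours = lines * secondsPerLine / 3600.0;   // ~41.7 hours
    const double featureFilms = hours / 2.0;                 // assuming ~2h per film -> ~21
    std::printf("%.1f hours of dialogue, roughly %.0f feature films\n", hours, featureFilms);
}
```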

"Many bugs look like FaceFX gone awry. I suspect that a lot of the implementation was not even done by an animator. Frequently you will have an intern or junior simply copy-paste the written script into FaceFX as a starting point."

Now, it's just not possible to keyframe that amount of content to any acceptable level of quality, so teams looking at that much scope try to find procedural solutions. I know in the past they've used an off-the-shelf solution called FaceFX, which analyzes the audio tracks and creates animation based on the waveforms, projection, etc. At a base level, it can read as a very robotic performance, and I suspect that is what we're seeing in some of the footage. You can work with the audio and the procedural tools to polish the performances in various ways, of course, but when you're staring down thousands of minutes of performance to clean up, your definition of "shippable" is a sliding bar that moves relative to team capacity and your content lock date. If it were my team and project, I would try to gather metrics on which scenes were the most watched based on playtests and use whatever polish time I had on those as a priority, letting the lesser-seen ones go with a default pass.

Gwen: I agree with Simon that many bugs look like FaceFX gone awry. I suspect that a lot of the implementation was not even done by an animator. Frequently you will have an intern or junior simply copy-paste the written script into FaceFX as a starting point. The system will automatically generate facial animation based on the letters/sounds from the text that is input. Generally this looks bad, and you need to spell out words phonetically rather than typing them in properly, but that takes a lot of time and resources are often limited.
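To illustrate why the phonetic respelling matters, here is a deliberately crude, hypothetical sketch (invented names, not FaceFX's actual API) of a naive letter-to-mouth-shape pass. Fed the raw spelling, it has no idea that "enough" ends on an F sound; fed a phonetic respelling, it lands on the right shape.

```cpp
// Sketch: why raw script text produces robotic lipsync compared to phonetic input.
#include <cctype>
#include <cstdio>
#include <string>

// Crude mapping from a letter to a mouth-shape (viseme) name.
const char* VisemeForChar(char c) {
    switch (std::tolower(static_cast<unsigned char>(c))) {
        case 'a': case 'e': case 'i': return "viseme_open";
        case 'o': case 'u':           return "viseme_round";
        case 'm': case 'b': case 'p': return "viseme_closed";
        case 'f': case 'v':           return "viseme_teeth_lip";
        default:                      return "viseme_rest";
    }
}

void PrintVisemes(const std::string& text) {
    std::printf("%s ->", text.c_str());
    for (char c : text)
        if (std::isalpha(static_cast<unsigned char>(c)))
            std::printf(" %s", VisemeForChar(c));
    std::printf("\n");
}

int main() {
    PrintVisemes("enough");    // the "gh" never produces the F-shaped mouth it should
    PrintVisemes("ee-nuhf");   // the phonetic respelling ends on the right shape
}
```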

Have you seen the "crab walk" glitches when a character is running quickly? There are several common causes for this. If the collision for the ground isn't set up properly, then the character will be clipping through the floor somewhat. However, the in-game IK system doesn't expect a character to be partially through the ground, so it would pull the feet back up to ground level. This would result in the knees being bent strangely. Another way this could happen is if the in-game IK system is detecting a slope that isn't there, or there is some math error that is pulling the pelvis down toward the ground. For instance, if you run up stairs quickly the system will detect that you are on a slope, and pull the pelvis toward the ground to compensate. If the system breaks and never gets told "hey, we aren't on a slope anymore," then the pelvis will stay permanently pulled down, which will result in some really funny "crabby" locomotion.
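Here is a minimal, hypothetical sketch of the stateful failure Gwen describes (the names and numbers are invented): the pelvis offset is only cleared when the system is told the slope has ended, so a missed "left the slope" event leaves the character permanently crouched.

```cpp
// Sketch: slope-compensation IK that sticks when the "slope left" event is missed.
#include <cstdio>

struct LegIkState {
    bool  onSlope = false;
    float pelvisOffset = 0.f;   // how far the pelvis is pulled toward the ground
};

void OnSlopeDetected(LegIkState& ik, float stepHeight) {
    ik.onSlope = true;
    ik.pelvisOffset = -stepHeight;   // crouch so the trailing foot can still reach
}

// If this event is never fired, the crouch from the stairs persists on flat ground.
void OnSlopeLeft(LegIkState& ik) {
    ik.onSlope = false;
    ik.pelvisOffset = 0.f;
}

int main() {
    LegIkState ik;
    OnSlopeDetected(ik, 0.4f);
    std::printf("On stairs: pelvis offset %.2f\n", ik.pelvisOffset);

    // Bug case: the character has left the stairs, but OnSlopeLeft was never called,
    // so the pelvis stays pulled down and the run turns into a "crab walk".
    std::printf("On flat ground (event missed): pelvis offset %.2f\n", ik.pelvisOffset);
}
```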

There are many possible causes for the eye flickering as well. There is definitely a dynamic eye-tracking system in ME that has NPCs looking at interesting objects that walk by. Perhaps some of the cutscenes never turn that off, and the procedural eye tracking is fighting with the authored eye animation? Or perhaps an object is dithering between being in range (and should be eye-tracked) and being just out of range (and should not be eye-tracked)? Or perhaps there are two things that an NPC is trying to track at the same time, and the system is constantly getting targeting information from two different targets?
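The dithering case in particular has a classic fix: hysteresis. Here is a hypothetical sketch (invented names, not BioWare's system) of acquiring a look-at target inside one radius and only releasing it outside a larger one, so a target hovering right at the boundary doesn't toggle tracking on and off every frame.

```cpp
// Sketch: hysteresis on look-at acquisition to stop eye-tracking flicker.
#include <cstdio>

struct EyeTracker {
    bool  tracking = false;
    float acquireRadius = 4.0f;
    float releaseRadius = 5.0f;   // larger than acquireRadius to avoid flicker

    void Update(float distanceToTarget) {
        if (!tracking && distanceToTarget < acquireRadius)
            tracking = true;
        else if (tracking && distanceToTarget > releaseRadius)
            tracking = false;
    }
};

int main() {
    EyeTracker eyes;
    // A target hovering right around 4 meters: with a single radius this would
    // toggle every frame; with the release margin the state stays stable.
    const float distances[] = {4.2f, 3.9f, 4.1f, 3.95f, 4.3f, 5.2f};
    for (float d : distances) {
        eyes.Update(d);
        std::printf("dist %.2f -> %s\n", d, eyes.tracking ? "tracking" : "idle");
    }
}
```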

"Everyone involved wants to make the best choice possible for the game, but the larger the game, the more chance for miscommunication there is."

Tim: I think the thing to keep in mind is that everyone involved wants to make the best choice possible for the game, but that the larger the game, the more chance for miscommunication there is. Animator A could have the best laid plans for how their conversation animations should be used based on a design doc they got from Scripter A, but Scripter B in another city (or even across the building) might need one of those animations for a purpose for which it was never intended. Sure, they might run a request up the flagpole to get a specific one-off motion, but Animator A might never see that request as deadlines approach and triage of feature requests takes place.

To lay blame on either of those two people would be wrong. Criticize their work, that's fine: as creators we often welcome critique as a way to better ourselves and the product we provide to gamers. But with an endeavor as large as a conversation system, for example, it's important to know that it takes a large effort to execute, and time to execute it, and if it falls short of our expectations as gamers, we need to be OK with that being the beginning of a learning experience for the developers that will ultimately provide better results over time.

And we need to let that time happen.

Dan: I want to just close by giving a salute to the Mass Effect Andromeda team. No matter how this game is ultimately received, a lot of people clearly put their all into making it, and that deserves respect. Congratulations for getting this game to the finish line, y'all! I can't wait to play it.

