Sony's San Diego studio is perhaps best known for sports games such as NBA '07 for PlayStation 3, but, as this exclusive multi-interview Gamasutra tour reveals, the studio also has major audio, mocap, and even downloadable PSN game plans.

Brandon Sheffield, Contributor

August 22, 2007

Sony Computer Entertainment's San Diego studio is perhaps best known for creating the company's platform-exclusive sports games such as NBA '07 for PlayStation 3, but it is also a major audio and motion capture/cinematics hub for Sony at large. The office handles audio and production work for titles as varied as God of War and Home, and has built massive new facilities to accommodate its next-gen needs.

From the re-done audio building to the 120 x 120 x 35 foot motion capture facility, SCEA San Diego is rather well funded and well equipped. But far from being just a big-budget house of sports and mocap, new teams have surfaced to work on original PSN titles, as we discovered when invited to check out the office and conduct multiple interviews.

Sony San Diego, Now And Future

Earlier this year we had a chance to tour the studio and speak with SCEA San Diego's Director of Product Development, Jim Molinets, who filled us in on the direction the studio is headed. We were then offered a chance to inspect the facilities -- from the audio production to the mocap -- and learn more about what's going on here and why.

What are the properties that are worked on here?

Jim Molinets: Our external producer is in charge of the ATV franchise, MotorStorm, F1, and other PSN external titles. Internally, we have shifted our focus from doing large-scope PS2 and PS3 titles to doing significantly more PSN titles. We've got three smaller groups now, and they're all doing very differentiated products with different requirements.

Is there a PSN team leader?

JM: Our maximum team size is ten people, so we really utilize the strengths of all the different people within the studio. Each individual product has a technical lead, a design lead, and an art lead.

Are these people who used to work on the larger-level titles?

JM: Yes.

Do you find that it's going to ultimately be worthwhile to pay people their normal salaries to be making these smaller games? The traditional model has been for indie guys to make stuff on their own dime.

JM: I think there's room for both, actually. We're privy to things from a developer standpoint that indie developers can't pay for. We have the ability to utilize our motion capture, and the ability to use our QA directly as a focus test group. We also have expertise that goes years beyond any indie developer on a PlayStation platform. Our group here has several people who have been with SCEA for over 15 years.

Does that make a longer tail in terms of the game itself becoming profitable, or is it more about building the experience that there's good stuff at this stage?

JM: From a business standpoint, of course we want everything to be fantastic and profitable. But you deal with the exact same experience if you're dealing with a PSN title, a PSP title, a PS2 title, or a PS3 title. We want to focus on what our innovation is, how we're going to expand the core experience for the user, and what we're bringing to the table from an experience level from our team members. The internal teams also have an ability to strategically leverage relationships with our partners that we've worked with before, with the music and movie industries.

It almost sounds about halfway between a smaller indie title and a full production.

JM: That's a safe assumption. I come from a shareware background, and it's really intriguing for me because it's kind of like that model. One of the exciting things about PSN titles to me is the direct correlation between product development and the end user. The community based around PSN is going to tell us if they like or don't like something, or suggest features. Because the team is smaller, the dev cycle is smaller, and because our expertise level is high, we may be able to address things like that very quickly, as opposed to having to wait for a full two-year product cycle.

Were you thinking of being able to patch games?

JM: Patches are the lowest common denominator, I'd say. Patching insinuates you're fixing something that's broken. With PSN titles, what we're focusing on is expanding the user experience.

At Microsoft, they're mandating that extra downloadables like that be paid for. Is that similar at Sony?

JM: It's a little bit of both. For the most part, it's a paid service, because a lot of it is already free. As far as expanded content goes, we're hoping that the lower price point of PSN titles will drive people to purchase expanded content.

How did you evolve from the 989 name?

JM: With a large talent pool like this, we wanted to have the ability to utilize that creativity and talent on a larger basis, so we decided to merge the studios. 989 was associated with sports, and we were studio two -- we never really had a name. Since we're all together, we didn't feel that 989 did it justice anymore, so we switched it over to a single title. We also wanted to be consistent with Santa Monica.

The 989 brand was always Sony internal, right?

JM: It has gone through many different iterations. It was internal, then external, then internal.

Do you have any concerns in the sports arena about competition with Electronic Arts, given that they've been snapping up long-term licenses?

JM: Like anything else, competition is healthy. We don't do a football game anymore, but our baseball and basketball franchises are getting progressively stronger year by year. Last year, baseball was a massive hit, and I see basketball headed in that same direction. You just have to overcome the barriers associated with people saying, "This was my favorite game before." Once you start to beat that expectation year after year, people will gravitate towards the better product.

What are the benefits of San Diego, as a studio?

JM: The locale is fantastic, first of all, as an incentive for getting talent here. San Diego is within two hours' driving distance of many things that are interesting to people. From an SCEA perspective, we have on-site QA, our IT group, our mocap and cinematics studio, access to marketing, access to promotions, access to legal, and our sound studio is here as well. From a product development standpoint, it's a one-stop shop for us.

Multiple disciplines are spread across multiple buildings here. Is that beneficial, or do you find that people wind up not communicating as much?

JM: It hasn't hindered us at all. Part of communication is knowing who you're supposed to talk to and when, and in product development, you have milestones when you know you have to get the audio group involved, for example. Everyone here is great about getting up and going to talk to [whoever is needed], so it's not an issue. Intermixing everyone would be fantastic, but you'd have to have an airplane hangar to do that. If we had some cinematics people over here and some over there, that segregation is going to hurt their ability to do their job and share their expertise with their team members.

Some people talk about more vertically sliced development, where you make a chunk of a game at a time, so you have artists and designers working in tandem.

JM: Absolutely. We do that here. We have our art side and our design side, and there's separation, but never a day goes by where there aren't meetings about that stuff. We've got a bunch of different communication channels we use. We use an asset control system called AlienBrain. We use Instant Messenger, LotusNotes, and phone and verbal communication to a large degree. Having a small team like this, there isn't an issue of not being able to talk to someone if there is a problem.

How many people are here at the studio?

JM: There are 32 people right now, and 28 of those are in product development. The others are external, an admin, and myself.

Is that across all of these buildings?

JM: Just this building right here.

How about the entire San Diego studio?

JM: It's about 450 employees, including the service groups and administration. As far as product development goes, we've got about 200 people.

Are you finding that the E for All Expo is similar to E3, in terms of having to bust out a demo in time for that?

JM: We have a very structured stage-gate development system. Those gates have outcomes that are demos or videos or documentation, and we never really adhered to having to have an E3 demo.

You guys are doing some production for Home here, right?

JM: We're localizing the title here. We're trying to help them out as much as we can, but the main development is in Europe.

What specifically are you guys doing with the localization?

JM: We're doing QA support, getting the IT server set up here, and doing any kind of background help that they want. We don't need to do any kind of language support, or product development technology support.

What kind of reaction are you anticipating from Home?

JM: I hope it's a great reaction. I'm excited because I think it's going to be interesting to see, and it's going to be fantastic to integrate the PlayStation experience overall. It's totally different.

Do you have anything in particular to say to potential employees?

JM: I come from an independent developer background, and working for Sony is fantastic. People sometimes don't understand what the advantages of working for a company are. They'll say, "You're going to deal with red tape," and all that, but that happens everywhere, frankly. Believe me. Being an independent developer, I dealt with that from a publisher angle. I wouldn't worry about the "megacorp" as a main concern.

As a company, we're really showing a lot of progress and innovation on all of our product platforms, and that's what drives developers: the ability to make great games. Sony is definitely a place where I, as a developer, feel that they're saying, "You tell me what you want to make, and tell me why it's going to be fantastic and exciting and new, and we're going to support you." It really is your job to stand up and make that happen. That's unique and exciting to me.

Do you perceive a change in Sony over the last few years, in terms of the kind of stuff it's willing to put on its consoles? I'm speaking specifically of PSN-type games. A lot of smaller publishers used to have trouble pushing their stuff through at that level.

JM: I don't think there's a shift; I think that's always been consistent. That drive for quality and innovation has always been part of the development scheme. When you look at PSN, there's a lot more opportunity there. Would Blast Factor have been a boxed product? No, but it's fantastic for PSN. flOw is like that as well. Because we're more of the shareware model on PSN, that level of innovation can be done because our team sizes and budgets are smaller.

Is there a different kind of production tactic for digital downloadable stuff versus disc releases?

JM: No. We use the same exact measurements of preproduction and production processes that we use [for retail games]. The differentiating factor is the timeframe, the product style, and what the requirements of the title are.

Looking forward, how do you see digital compared to boxed releases? It seems like a lot of stuff is starting to go download-only.

JM: My personal opinion should be obvious, because all three teams at my studio are doing PSN titles. I personally am very excited about that, because what we're able to do with a title of this size is different. The ability for us to integrate into a consumer community and really listen to their voice and have our own voice heard at the same time is a lot higher than with a boxed product. Turnaround time is also quicker. It provides way different opportunities for me. I see digital distribution as a way to close the feedback loop quicker than I ever could before, and make a stronger, better product overall.

Do you envision full titles going digital at any time in the future?

JM: If it's right for the title. It depends on what you're trying to do and what the consumer wants. It may happen with some titles and not with others.

Do you think that having a boxed hard drive with the PS3 makes that more possible?

JM: Yes. Emphatically yes.

From an awareness standpoint, how are you having to change tactics in terms of getting people interested in these titles?

JM: It's a shift for us on the PlayStation side of things. We're getting out there with some awareness where we can, but it's a shift in terms of how we're distributing code to folks. It's certainly going to scale in terms of the kind of promotions that we want to do from a PR standpoint. We'll be applying different tactics for promoting as we go.

Will the digital age mark an end to preview code? There are downloadable demos and levels and such, which could arguably be a good basis for a preview.

JM: Again, it's decided on a per-product basis, in my opinion. Some things may have episodic or expanded lifecycles, and that particular situation may not be as large of a concern to the developer or PR. If it's something which is a one-shot product, you may not necessarily want to do that. I think that the titles that have a more unusual development style or goal may see more of that type of offering than something which is more traditional and easier to understand by looking at screenshots or reviews.

Do you envision episodic content ever becoming a part of consoles? It's largely been a PC realm.

JM: That's definitely an opportunity on PSN. Sam & Max is a great example of that. You've got great characters that can go through a bunch of crazy scenarios. If you have an IP or an idea that doesn't support that and you try to shoehorn it in there, it won't be right for the consumer. That just depends on what you want to do with the game and how you want to expose your user to it.

What is your background?

JM: I started in 1991 doing a Galaga-like PC shareware game called Galactics, working with three friends of mine from high school. It was a garage band thing. My role on that was to script in AI for the flights of enemies. We put it up on the Internet and got noticed by John Carmack of id Software, who told us they were working with this new technology on something called Wolfenstein 3D, and wondered if we'd be interested in working with it. We immediately said yes!

I wound up working with Apogee and doing a game called Raptor: Call of the Shadows. It was initially in Texas, after we moved from Chicago. I then started working with John and his group on an action-RPG called Strife, and that was the formation of our company, called Rogue Entertainment. Then we did two mission packs -- one for Quake and one for Quake II. We also helped out with the N64 port of Quake and then did American McGee's Alice for EA. After that, we closed the studio, and that's when I came here.

Is there anything else you want to get across about your department here?

JM: I think the strongest thing that I have to deal with from a managerial standpoint is this general feeling about Sony. You get all these lovers and haters, and from an employee standpoint, you get a lot of people saying, "I don't want to get involved in the megacorp! I don't want to work for EA, Sony, or Nintendo!" The only thing I have to say is that opportunity is what you make it. If you want to come in here and do a great job, you have the opportunity to do that. If you want to come in here and just be a worker bee, that's part of it, but we don't expect that. As a matter of fact, we discourage that openly. We'd rather have you try and help everyone do better. We all want to do great, and I see that as a consistent theme [at Sony].

On a personal note, I think you're going to see some really great things coming from Sony very soon on the PSN side. Innovation, differentiation -- it's going to be across the board. There's a lot of great things, and Phil's GDC keynote was just the tip of the iceberg on some of this stuff. I'd love to be able to go, "Check this out!" but we're not there yet.

A World of Sound

With that, things moved on to a discussion with SCEA's Director of Service Groups, Worldwide Studios America, David Murrant, who gave a tour of the studio's brand new editing facilities, designed to bring top-quality sound to Sony's games, across all platforms.

David Murrant: My group consists of four departments. I oversee cinematics, motion capture, sound and music, and multimedia. My last gig was managing the sound group, and as part of that we built these facilities here. We've got 22 edit rooms for sound designers to do their work. Each one is a THX 5.1 surround facility. We have two mix rooms which are 7.1 facilities. We have a foley room and a control room for that as well. That foley room is a multipurpose room. We also do a lot of recording and music, and we're starting to do more of that in-house.

This is our mix room. We do a ton of cinematics through this group each year. It's probably the equivalent of six full movies, or six hours of movies. We use John Roche from Warner Bros., who does a lot of foley for us. He's an absolute star when it comes to that. We use more and more foley work.

It's really good that you guys actually use that, because a lot of people don't seem to.

DM: The difference is huge. We never used it, and it was only in the last few years that we really [started], and we've seen the difference in quality. Plus, the music was recorded in London and Prague. We used real orchestras for this, and different composers. That's pretty much the standard we've set for a lot of our projects. That goes for the voice talent, too. We're trying to raise the bar all over the place. Everybody talks about chasing Hollywood, but we're different. As such, we're trying to make our own sound and our own direction.

Another thing that we use this room for is game mixes. We'll sit here and mix all of the elements in-game. We'll have a tester play through it, and we'll be tweaking stuff and making changes on the fly. What that does is give us a reality check at beta, where we can go, "Okay, now we've got to finally draw all of our threads together, and find all of the bugs and get them fixed."

At what point do you start implementing in-game audio?

DM: As soon as possible, really. We can see what comes up on the producers' radar, and as soon as we know that, we start to make connections with them. We have a great internal team, and we want to make sure that they're getting their audio in, both from a design and a concept [standpoint]. We then assign a person who's a lead for the lifetime of that project. Sometimes there's not a lot of work to do because there's not a lot of game assets to work with, but as soon as that starts to come online, anything we can do we'll start to pour data in. We'll use temp sounds, and then we'll move to temp dialogue and music.

Do you do much with truly interactive audio, where the player can affect it?

DM: Yeah, most obviously in music. We try to tie the music into the mood of the game, whether you're exploring or whether you're in battle. We try to make sure that the music transitions smoothly. Hopefully the player doesn't notice it, but feels it more than anything. If the scene changes, the sound should change for that scene, so that the player's perspective may be changing on the fly as well.
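
As a rough illustration of the kind of mood-driven transition Murrant describes, the sketch below blends between an "explore" stem and a "battle" stem a little each frame so the shift is felt rather than heard. It is a generic, hypothetical C++ example, not SCEA's actual music system; the class, state names, and four-second crossfade are assumptions.

```cpp
// Hypothetical sketch: a mood-driven music layer that crossfades between
// an "explore" stem and a "battle" stem so transitions are felt, not noticed.
#include <algorithm>

enum class MusicState { Explore, Battle };

class MoodMusic {
public:
    void setState(MusicState s) { target = s; }

    // Called once per frame; dt is the frame time in seconds.
    void update(float dt) {
        // 0.0 = fully explore stem, 1.0 = fully battle stem.
        float goal = (target == MusicState::Battle) ? 1.0f : 0.0f;
        float step = dt / crossfadeSeconds;
        blend += std::clamp(goal - blend, -step, step);

        // An engine would feed these gains to two looping music voices.
        exploreGain = 1.0f - blend;
        battleGain  = blend;
    }

    float exploreGain = 1.0f;
    float battleGain  = 0.0f;

private:
    MusicState target = MusicState::Explore;
    float blend = 0.0f;            // current position between the two stems
    float crossfadeSeconds = 4.0f; // slow enough that the player feels, not hears, the shift
};
```

A combat trigger would simply call setState(MusicState::Battle) and let update() ease the mix over; nothing cuts hard, which matches the "hopefully the player doesn't notice it" goal.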

It's good that it's not coming in at the last phase.

DM: We couldn't do it at the last phase, but we know that there's no such thing as true post-production in video games. Everybody wants to keep changing until the last day.

This is our foley room. We use all these great foley artists, but we also want to be able to do our own. This is an eight foot by eight foot pit that we have in here, so that we can fill it with water or do whatever we want with it. We also use this space to do music and ADR. The problem that we have here is that we have F-16s flying over all day. We wanted to have a room that was isolated from that noise.

Let me show you one of the pods. This is an edit room, essentially. Each of these rooms is a 5.1, THX-certified room. We wanted to set a standard that's used in the industry, so we could lay our mark in the sand with that. We standardized all of the equipment, so that everybody could move from room to room and know that what they're hearing is [the same]. It's a nice environment for the guys to work in.

And this all got built around a year and a half ago?

DM: It just got finished at Christmas. It's still all pretty fresh. It spent two years in design and development and building. I thought it would take six months and we'd be up and running!

Another element that I haven't talked much about is the dialogue side. We have a dialogue team that mostly works in Hollywood, with our external vendors. For us, we see them as an asset that we can use. They have connections to the actors that we may want to use. The difference is in having somebody focus on looking at and reviewing scripts in the beginning. Having somebody who knows the script, is at the session, and helps with the casting process shows in the quality of the dialogue in cinematics. They have ownership from beginning to end on the project.

And those guys are in-house here?

DM: They're actually in Foster City, but they're traveling down to LA all the time. We're looking for strong people all the time, and that's not always easy. We want people who are going to fit with the team and bring something extraordinary to the table. What these facilities give us is the opportunity to do better work, and hopefully that translates in what we do for the game at the end of the day.

Once we have these different groups, the structure is very flat. If we've got a project coming to the end and we need to dogpile it, we can, because we use our own proprietary tools and technologies on all of our projects. If there's a project happening in Foster City that needs 20 ambiences and the development team is based in Seattle, we're off-site already anyway. What we then do is find out who's available and who's ready, and people will just start working on it. That goes for in-game assets as well. Wherever we need, we can dogpile, and once that crazy crush is over, everybody goes back to what they were doing.

There's a lot of communication. We want to make sure everybody's in sync with each other. There's a lot of sharing of knowledge that's been gained through the process.

What tools do you use?

DM: We're using Pro Tools in all of the rooms. We're using Sound Forge on the PC side for doing stereo editing. For our proprietary tools, we use a tool called Scream, which I know is available to PlayStation developers. What it does is allow you to randomize the volume and pitch of sounds. The one thing that drives me mad is when you hear the same, repetitive thing over and over, so one of our goals on PS2 -- and it's getting easier on PS3 because we've got more memory -- is that we want to make sure that every time you hear something, it's evolving and changing.

Scream allows us to do that. It allows us to script things in many different ways. Plus we have tools for engine sounds, crossfades, and music as well, and for how the interactivity works between those. We've been pushing hard to make everything data-driven. For all of those, we just get events from the game. We're essentially trying to score the game -- not just the music, but also the sound effects and how all the interaction works.
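
To make the randomization idea concrete, here is a minimal, hypothetical sketch of event-driven playback with pitch and volume jitter. It is not Scream's actual API or data format, just an illustration of mapping game events to banks of variations so repeats never sound identical; the type names and jitter ranges are invented for the example.

```cpp
// Hypothetical sketch of data-driven, randomized playback: the game fires
// named events, and the sound data decides which variation plays and how
// much its pitch and volume are nudged. Not Scream's real interface.
#include <cstdlib>
#include <string>
#include <unordered_map>
#include <utility>
#include <vector>

struct SoundDef {
    std::vector<std::string> variations; // alternate takes of the same effect
    float pitchJitter  = 0.05f;          // +/- 5% pitch
    float volumeJitter = 0.10f;          // +/- 10% volume
};

// Returns a multiplier in [1 - amount, 1 + amount].
static float jitter(float amount) {
    return 1.0f + amount * (2.0f * (std::rand() / float(RAND_MAX)) - 1.0f);
}

class EventDrivenAudio {
public:
    void registerEvent(const std::string& event, SoundDef def) {
        banks[event] = std::move(def);
    }

    // The game just fires events; the data decides what actually plays.
    void onGameEvent(const std::string& event) {
        auto it = banks.find(event);
        if (it == banks.end() || it->second.variations.empty()) return;

        const SoundDef& def = it->second;
        const std::string& clip = def.variations[std::rand() % def.variations.size()];
        playClip(clip, jitter(def.pitchJitter), jitter(def.volumeJitter));
    }

private:
    void playClip(const std::string& clip, float pitch, float volume) {
        // Hand the clip, pitch, and volume off to a platform voice here;
        // omitted in this sketch.
        (void)clip; (void)pitch; (void)volume;
    }

    std::unordered_map<std::string, SoundDef> banks;
};
```

Because the mapping lives in data, a sound designer can add another variation or widen a pitch range without touching game code, which is the point of "scoring the game" with events.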

Do you have any kind of dedicated audio QA?

DM: We have the QA department, but frankly, we tend to do it mostly ourselves. The reason is because the leads should be playing the game pretty consistently, so they know the game inside and out. But if you send it to test, they might go, "You know, that sounds okay. I'm not sure if I like it or not, but it's playing a sound." It could be the completely wrong piece of music or ambience. The only true way is if we were able to send them files and say, "OK, play that one, and let us know if it's in the game." In reality, that's more work [than is reasonable].

One of the things that we have considered is having people working in here as part of QA, so we can just say, "Hey, play through this level and make sure that you know the material as much as we do."

That's one of the things our audio columnist was recommending. It was the idea of creating an audio map for a QA person who could basically become your dedicated audio QA and figure out if the implementations are proper.

DM: I think there's a lot of value to that, especially with the PS3. These games are getting huge. PS2 games could be big, but you could stay on top of it and play through. But now you have these huge, unwieldy worlds. You don't know what's going on in that corner of the world unless you go over to it, and these guys don't necessarily have the time to play across it. So I think there is going to become more value to the implementers and the QA, because we can't stay on top of it, especially when they start shifting stuff to some obscure corner.

[For example, there could be] a windmill, and then suddenly they move that windmill 50 feet but the sound is still attached to the old space. Chances are, you could miss it, and you don't want that. Part of that we can avoid by using tools that tell us what things have moved. Some of it we can catch that way, but the rest of it is just done by playing through.
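
A check like the windmill case can be automated with a very small tool pass. The sketch below is hypothetical (the types and the one-metre tolerance are assumptions, not SCEA's actual tooling): it compares each authored emitter position against the current position of the object it was attached to and flags anything that has drifted.

```cpp
// Hypothetical sketch: flag sound emitters whose attached object has moved
// since the emitter was authored, so audio can fix them without a full playthrough.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct EmitterBinding {
    const char* name;
    Vec3 emitterPos;   // where the sound was placed when it was authored
    Vec3 objectPos;    // where the attached object sits in the current build
};

static float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Report every emitter that no longer sits on its object.
void reportStaleEmitters(const std::vector<EmitterBinding>& bindings,
                         float toleranceMeters = 1.0f) {
    for (const EmitterBinding& b : bindings) {
        float drift = distance(b.emitterPos, b.objectPos);
        if (drift > toleranceMeters) {
            std::printf("STALE: '%s' is %.1fm from its object\n", b.name, drift);
        }
    }
}
```

Anything a report like this misses, such as an emitter deleted outright, still has to be caught by ear, which is why the playthrough remains part of the process.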

Capturing It All

Brian Rausch, manager of the motion capture and cinematics group, and Aaron McFarland, editor/compositor in the cinematic solutions group, work hand in hand to provide the high-quality visuals that accompany many of SCEA's games. With a huge motion capture studio, these guys are responsible for putting the players into SCEA's sports lineup, the dragon riders into Lair, and the soldiers into the upcoming PS3 sequel to Killzone.

What tools do you use [for motion capture and animation post-processing]?

Brian Rausch: To acquire on-stage, we use Vicon IQ, Vicon Diva, Motion Builder, and our end product 90% of the time is Maya.

Aaron McFarland: [regarding cinematics and video services] We're a Mac editing house. We are almost entirely on Final Cut Pro, and then Shake for our compositing. On the back wall we have high-end editing suites and compositing suites in each bay. It's kind of a separate little Macintosh pod in an otherwise PC world.

BR: This is our lighting and rendering area. This room's pretty malleable. When we staff up for production, we'll bloat that area. We're right at the tail end of Lair production, so our lighting team is the largest team right now, and they're in this room. When we start our next production, we'll move the animators back in.

BR: This is our online room.

AM: We do an offline edit, and then at the very tail end as we go up to HD, we have a PlayStation 3 development kit all wired through to every possible file format that a marketing guy, PR guy, or anybody could request. Any team within Sony who wants anything in any format can come to us and we can either capture it ourselves or take what they can give us. We cut a lot of the trailers you'll see.

BR: This is the control room for the machine room. We bring in objects and use these surface markers to realign textures and realign the model once we've acquired it with the rig. Our target texture size is always about 4k. Ultimately, this room was specifically designed so that we could bounce light off the walls, and make everything incredibly flat. If I were to bring on the other lights, what will happen is that there will be no shadowing on an object, because all the lights bounce back from all angles. Then we're able to control the lights and the darks inside of the game engine, versus having any prebaked shadowing, or worse, prebaked shadowing where we have to have an artist go in and pull that shadow out of the texture. We built this room to specifically avoid issues like that.

BR: This is our scanning room here.

What are the dimensions of this place?

BR: 120 feet by 120 feet by 35 feet high. The trusses in the ceiling are capable of holding over 5,000 pounds. We can connect them together if we need to go above that, so we could pretty much shock-lift an elephant off the ground if we ever need to. The floor was ripped out and laser-leveled so that it was about as flat as a human can make it. The point of that is to keep data drift out.

In our old, smaller space, we didn't do anything with the concrete, so you'd have waves in it and slight angles. We don't have that issue here. We're using Vicon MX-40s. We have 88 of them in the room. We don't normally use them all concurrently in the same zone. We can connect them together, but that's just a tremendous amount of overkill.

BR: We can talk about the difference between the two volumes. With the larger volume over here, we capture full body motion. It's gross hand motion but no finger motion, and gross head motion but no face motion. Over on this side, this is our integrated volume. We'll capture full body, face, and fingers simultaneously. We have sound-controlled the environment, and all of these over here will surround certain areas as well to control the audio even more.

BR: For our NBA game, we acquire audio as we're capturing, so we capture the entire production all at once. We don't do separate audio tracks or separate VO. It's all acquired simultaneously. There's something about when an actor acts, versus when he stands inside a control room and tries to deliver lines. There's something so much more natural and flowing when an actor's acting and we capture his voice at the same time.

Do you have to do a lot more takes that way, though?

BR: Yes, and the only reason we need to do more takes is because we live in San Diego and there's an Air Force base close by. This room is not soundproof; it's sound-controlled. So when we get jet fighters going over, it causes a little noise in here.

AM: Actually, it's the Marines. It's helicopters and F-18s.

BR: I'll take you to the control station here. This is the Sony Anycast system. Essentially, this is a real-time view of all the marker data and connections. We've got multiple reference video. All the cameras are controllable and zoomable from here. One of the interesting things about this is that I was over in the UK last week, and they were actually able to sit at their desk and direct a capture from the UK in real-time to our stage.

AM: We can direct it like a live television show. We can switch between up to six cameras, each remote-controlled so we can zoom in on the actual action for an individual shot. We can broadcast that over Macintosh's iChat. Everybody's on headsets, so they can talk to a director out in the space, who can come back over and have a face-to-face teleconference in the middle of the mo-cap session for them to put their two cents in. In the meantime, other folks on the team can sit back and watch a "making of" for the motion capture for their game, like a television show, live as it happens.

BR: By actually being able to control the camera views, people are able to stay focused quite a bit longer. It's something we're capable of doing for up to eight hours a day.

AM: All the video of all the cameras can be captured so that when we clean up data or whatever later, we can go back to the reference video and see if there was some noise that got in there. We can see what really happened, and we can do motion editing based on this reference video.

It seems like a pretty expensive setup.

BR: The motion capture cameras are about $20,000 apiece.

AM: This system is not nearly as expensive as you'd think. This is most often used by churches and such. They'll have multiple cameras set up and they'll webcast their services and videotape their services for later use.

BR: Welcome to the church of motion capture!

How many actors can you do simultaneously?

BR: We've done thirteen. We haven't hit the upper limit; we've just done what we've needed to do.

Do you service all of SCEA?

BR: Yes.

What about the worldwide studios?

BR: We're an asset that the worldwide studios can utilize as well. We're all starting to work together -- America, Europe, and Japan. This is the most significant motion capture resource we have on the planet, and we're trying to tie as many resources into it as we can.

AM: And the Anycast is something we're trying to leverage to make us more of a reasonable use for people who are far away, even if they're in the States, so they don't have to fly a team over.

BR: Normally when you're doing a large project, you'll be over for three weeks to acquire all of your data. For Rise to Honor we were twelve weeks. It's too many people to fly away from their families, so this is a way we can all do it, and it's not too big of a stressor on folks.

BR: Those are all the Killzone casts that we built, that we scanned in.

AM: We scanned them and retextured them, then put them in. The idea was to save time, but...

BR: We did the opposite of saving time! We had a really cool technology and a really cool pipeline, but since it was the first time the pipeline had ever been done, it didn't save a whole lot of time.

Might it save time in the future?

BR: Oh yeah.

AM: Yeah, once pipelines for any of these things get established, most everything that we try to do as service groups here allows us to save time. If people took a sports game and tried to keyframe every move, obviously that would take a huge amount of time. If we can do more to create models for people, we're going to try to leverage that to make everything look better and go quicker.

BR: There's a lot of detail that's really hard to model in. We can knock these out pretty quickly and put in a lot of that detail, then scan it in. Then it's a matter of texturing it correctly.

AM: The guy who made these has a background in miniatures for people who do dioramas and miniature war gaming and things like that. When we tapped him to do stuff for video games, it wasn't something he had ever thought about. He was really stoked.

BR: One of the things we need to do in the future is to build these modularly. We need to be able to pull chunks off and scan them in individually. Scanning all of this in would be super cool, but it takes a tremendous amount of time. If we could break this apart into modular pieces, scan them in, and put them back together, we'd be able to turn around much faster. As far as the exercise went, it was a successful exercise that we'll expand on when the need arises.


About the Author(s)

Brandon Sheffield

Contributor

Brandon Sheffield is creative director of Necrosoft Games, former editor of Game Developer magazine and gamasutra.com, and advisor for GDC, DICE, and other conferences. He frequently participates in game charity bundles and events.
