'AI and Games' is a crowdfunded YouTube series that explores research and applications of artificial intelligence in video games. You can support this work by visiting my Patreon page.
One of the most challenging and unforgiving genres for artificial intelligence in video games is stealth. Players are tasked with sneaking into locations and conducting their business, be it stealing items of interest or eliminating targets, all the while adapting to new threats and exploiting opportunities as they arise. There is an expectation that stealth games provide a real challenge - especially on the highest difficulty levels - while still leaving small windows of opportunity for players to exploit. This presents a lot of challenges for developers when creating enemy opponents. They need to be smart, react to the world and communicate their thought processes such that the player can compensate, but ultimately provide as fair and balanced an experience as possible. In this blog let's look at how to build balance in stealth games by exploring the inner workings of Splinter Cell: Blacklist.
The Challenge of Stealth AI
Blacklist, developed by Ubisoft Toronto and released in 2013, is the seventh entry in the long-running Tom Clancy's Splinter Cell franchise. The game is a marriage of previous entries in the series in that it allows players to take different approaches to completing each level, balancing the stealth puzzles of the series' roots alongside the more action-focused gameplay exhibited by 2010's Splinter Cell: Conviction. As you take charge of Sam Fisher in Blacklist, you can decide how bombastic or otherwise you want to be as you tackle each new locale.
This presents a massive challenge for level design, game pacing and the design and development of AI characters, since it effectively creates two games in one: the game's systems need to manage the transition from stealth into combat if the player makes a mess, but also switch back when things die down. Stealth AI is already a challenging problem to solve, given enemy non-player characters must satisfy several criteria.
First of all, the AI characters need to be consistent in their response: NPCs should recognise the world is changing around them and go into a heightened state of alertness. These responses need to be consistent across the gameplay experience such that you can rely on specific tactics to lure enemies into positions that compromise either the security of their established patrol routes or their personal safety.
Secondly, there is the issue of communication and feedback: if a character notes something is not right, or spots the player from far away, not only must they act in response, but they must effectively communicate what their intentions are. This includes their animations, talking aloud to themselves or communicating with other NPCs in proximity, given the player needs to know what they're thinking and doing in order to get themselves out of the current situation.
Lastly, there is the issue of predictability vs novelty. Ultimately you want the responses and feedback to be consistent to the point of being predictable. That said, predictability means the game can become repetitive and, in turn, stale after several hours of play - especially given players themselves are unpredictable and difficult to account for. Hence there is a need to maintain novelty, where NPCs might do something a little different the next time around, or new enemy archetypes are introduced that force players to approach problems differently.
All of this comes under the umbrella term of 'fairness': you want players to develop their own idea of how the game's systems operate such that they can exploit it to their advantage. The more you play the game, the stronger your understanding becomes as new concepts are introduced and prior knowledge is reinforced. Hence the first playthrough versus the second, or even the tenth, is a different experience, and higher difficulties put that knowledge to the test. That said, this isn't something that will work for everyone, as fairness is entirely subjective and may differ from player to player.
The Four Pillars of Stealth AI
When discussing how AI systems need to support fairness Martin Walsh - Blacklist's lead AI developer - had this to say at the 2014 Game Developers Conference:
"If your opponents feel dumb then you get no real satisfaction in beating them. But you know what's interesting is not being dumb doesn't necessarily mean being smart. And what it actually means is always being plausible."
"It's interesting to note that there's often a conflict that exists between fairness, consistency and intelligence and that's something that's important to be aware of." - Martin Walsh, "Modeling Perception and Awareness in Splinter Cell: Blacklist", Game Developers Conference (GDC) 2014.
These ideas are core to the design of the Splinter Cell franchise, not just Blacklist specifically, and as Walsh explained in his GDC talk, there are four key pillars of enemy AI design that need to be balanced in order for Splinter Cell - or any stealth game for that matter - to work effectively.
- The Visual Perception of Non-Player Characters: How do friendly and enemy AI see the world around them and spot the player if they're exposed?
- Environmental Awareness: Does the character have some understanding of the local geometry? Do they notice when doors or windows have been opened or closed? And when things go loud, do they know where good cover or chokepoints can be established?
- Auditory Perception: How do we model an NPC's ability to hear noises, or more critically, what the player thinks an NPC should hear?
- Social and Contextual Awareness: Recognising what's happening in the world around them as the player eliminates their comrades, and even just having realistic conversations between guards as they're waiting for the player to come in and do their business.
These principles hold not just across Splinter Cell and stealth games, but a variety of genres. In fact we recently explored the sensory systems of the xenomorph as we revisited the AI of Alien: Isolation, and as you'll see there is a lot of overlap between the systems in Creative Assembly's horror game and what I am about to cover. And this makes sense, given a lot of what I'm exploring in this episode isn't new, but has gone through years of iteration and refinement within game development. So let's take a look at how the four pillars are managed in Splinter Cell: Blacklist.
Visual Perception

First up, let's look at the visual perception of AI characters and how to balance it for stealth games. Vision systems are built to allow AI characters to recognise objects within defined fields of view, typically through the use of view cones: a cone (or a triangle if it's a 2D environment) of a fixed radius and height projected from the front of the non-player character. If an object intersects with the shape, it registers a signal that says the object is 'seen' by the AI. Typically, the longer an object stays within the cone, the stronger the signal becomes over time. This ensures that you're not spotted immediately upon intersecting the view cone, as an AI will typically wait until the signal is strong enough before recognising you.
Now view cones are a well-known and commonly used technique, but they're fundamentally flawed, because that's not how human eyesight works. A view cone is often a fixed angle and distance ahead of the character, whereas humans have close to 180 degrees of vision. One solution is to make the view cone wider, but this presents another inconsistency: our eyes are built to let us focus on specific areas, making what we see sharper and clearer, but in doing so everything else in our peripheral vision blurs.
Hence Blacklist, like many other games, adopts multiple view cones to represent different types of vision, and the detection thresholds and rates of signal increase vary depending on whether they're designed to model close or distant vision. But the thing Splinter Cell does to stand out is that the view volume isn't cone-shaped; it actually looks a lot more like a coffin. This ensures the NPC maintains clarity of vision in front but prevents them from having strong peripheral vision at longer distances, making it more realistic. In addition, the size and shape of each cone is tweaked for each difficulty setting, allowing for more blind spots on lower difficulties, but giving expert players a run for their money.
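To make the multi-cone idea concrete, here is a minimal sketch of a detection signal that accumulates while a target sits inside one of several view cones. All names and the specific angles, lengths and gain values are illustrative assumptions, not values from Blacklist, and the cones here are simple 2D wedges rather than the game's coffin-shaped volumes.

```python
import math
from dataclasses import dataclass

@dataclass
class ViewCone:
    half_angle_deg: float  # angular half-width of the cone
    length: float          # how far the cone reaches
    gain: float            # how quickly the detection signal rises per second

def inside_cone(cone, npc_pos, npc_facing_deg, target_pos):
    """2D check: is the target within the cone's range and angular width?"""
    dx = target_pos[0] - npc_pos[0]
    dy = target_pos[1] - npc_pos[1]
    if math.hypot(dx, dy) > cone.length:
        return False
    angle_to_target = math.degrees(math.atan2(dy, dx))
    # Wrap the angular difference into [-180, 180] before comparing.
    delta = abs((angle_to_target - npc_facing_deg + 180) % 360 - 180)
    return delta <= cone.half_angle_deg

# A narrow, long "focus" cone and a wide, short "peripheral" cone.
CONES = [
    ViewCone(half_angle_deg=15, length=30.0, gain=1.0),  # focused vision
    ViewCone(half_angle_deg=80, length=8.0, gain=0.4),   # peripheral vision
]
SEEN_THRESHOLD = 1.0

def update_detection(signal, npc_pos, npc_facing_deg, target_pos, dt):
    """Accumulate the detection signal over dt seconds; the target is 'seen'
    once the signal crosses the threshold. Decays when out of all cones."""
    best_gain = max((c.gain for c in CONES
                     if inside_cone(c, npc_pos, npc_facing_deg, target_pos)),
                    default=0.0)
    if best_gain > 0:
        signal = min(SEEN_THRESHOLD, signal + best_gain * dt)
    else:
        signal = max(0.0, signal - 0.5 * dt)
    return signal, signal >= SEEN_THRESHOLD
```

Running `update_detection` each frame gives the delayed-spotting behaviour described above: a target in the focused cone is confirmed quickly, while one in the peripheral cone takes longer.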
Plus, for spotting the player, there is an additional suite of sight checks, given the player is more or less exposed depending on their current stance. Sam Fisher can be in a variety of different poses, with some - like being in cover - designed to conceal your position. Hence in order for the vision code to detect the player as seen, they not only have to be within the cone itself, but the character also has to successfully run multiple sight tests on the player to see how much of their body is exposed. The NPC runs a raycast from its eyes to eight of the bones in Fisher's skeleton; the more of these bones that are visible with nothing in the way, the more exposed the character is. Depending on the stance the player is in, the enemy NPC needs to successfully see a minimum number of these bones before it will recognise your presence. Hence standing in the open versus crouching behind cover yields different rates of detection.
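A sketch of how such a bone-count check might be structured follows. The bone names and per-stance thresholds are hypothetical, and the raycasts themselves are represented by a plain dictionary of results, since the actual line-of-sight query belongs to the engine's physics system.

```python
# Hypothetical bone names; Blacklist uses eight bones but which eight is
# not specified, so these are illustrative.
BONES = ["head", "neck", "spine", "pelvis",
         "l_shoulder", "r_shoulder", "l_knee", "r_knee"]

# Illustrative per-stance minimums: concealed stances demand that more of
# the body be exposed before the NPC confirms the sighting.
MIN_VISIBLE_BONES = {
    "standing": 2,   # easy to spot in the open
    "crouched": 4,   # harder to confirm
    "in_cover": 6,   # needs most of the body exposed
}

def is_player_seen(stance, bone_visible):
    """bone_visible: dict of bone name -> bool, i.e. the result of a raycast
    from the NPC's eyes to each bone with nothing blocking the way."""
    visible = sum(1 for b in BONES if bone_visible.get(b, False))
    return visible >= MIN_VISIBLE_BONES[stance]
```

The stance-dependent threshold is what produces the different rates of detection mentioned above: a player peeking out of cover exposes only a couple of bones and stays hidden, while the same exposure while standing would be enough to be spotted.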
But despite this more nuanced system, there are still special cases where, irrespective of an AI character's vision, designers wanted more control over how exposed a player is in certain locations. Hence there is a custom cover point system that allows designers to tweak the player's visibility at specific cover points, allowing for more control over gameplay in each mission.
Environmental Awareness

Next up, let's look at environmental awareness, which encompasses all of the knowledge a given AI character can have about the world and the ways in which it can reason about that information once it has been gathered.
This is driven by what is known as TEAS: the tactical environment awareness system that - like many of the AI tools in Blacklist - was originally built for Splinter Cell: Conviction. It contains a variety of information about the topography of the local environment as well as what objects in the local area the NPC should be aware of. TEAS was expanded upon in Blacklist to provide a variety of useful features, not just for environmental awareness but also for auditory perception and contextual knowledge, as we'll see in a minute.
TEAS stores a series of positional nodes that are auto-generated from designer annotations within the map. This collection of nodes represents the areas within which AI and the player can go into cover, areas they can reach by moving around the environment, as well as chokepoints that appear within it. Hence guards can patrol and guard chokepoints more effectively and make it more difficult for the player to sneak past, given they have explicit knowledge of the importance of a particular doorway or similar entry point.
In addition to TEAS, objects in Blacklist that the player can interact with - such as doors, windows and lighting - maintain a history of their state. So when a light switch is turned on or off, we know how long it has been since the action occurred, plus a lifetime within which that action might be deemed suspicious. When a guard is out on patrol, it can query whether the state of the lights has changed and whether that is something it should be worried about. Hence if the lights are now off, but they were on when the guard looked at them 5 seconds ago, that could be suspicious and merit investigation. But conversely, they're not going to walk into a room and go into an alert phase immediately because someone turned the lights off 5 minutes ago and they've only just noticed.
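The timestamped-state idea described above can be sketched quite simply. The class name, the 30-second suspicion window and the method names here are all illustrative assumptions; the point is that suspicion depends on both what the guard last saw and how long ago the change happened.

```python
import time

class WorldObjectState:
    """Tracks when an interactive object (door, window, light) last changed
    state, and whether that change is still recent enough to be suspicious.
    The 30-second window is an illustrative value, not taken from the game."""
    SUSPICION_LIFETIME = 30.0  # seconds

    def __init__(self, state):
        self.state = state
        self.last_changed = None

    def set_state(self, new_state, now=None):
        if new_state != self.state:
            self.state = new_state
            self.last_changed = now if now is not None else time.time()

    def is_suspicious(self, last_known_state, now=None):
        """Is this object worth investigating, given the state the guard
        remembers from their last look at it?"""
        now = now if now is not None else time.time()
        if self.state == last_known_state or self.last_changed is None:
            return False  # nothing changed since the guard last looked
        return (now - self.last_changed) <= self.SUSPICION_LIFETIME
```

A light switched off 5 seconds ago reads as suspicious to a guard who remembers it being on, while the same change noticed 5 minutes later quietly passes.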
Auditory Perception

The third pillar is auditory perception, or put simply, sound: the ability for an AI to hear a noise and recognise whether it should be investigated. Typically in games, if we want an AI to hear something, we trigger an 'event'. Each event has a distance from the character that heard it as well as a priority. It's useful to have both, rather than prioritising sounds solely by distance; otherwise a guard might lose their mind and start opening fire at a teacup falling over while a gunfight 50 meters away is completely ignored. For stealth games it's important that sounds are captured and reported to non-player characters correctly, but also in a manner that seems fair to the player.
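A minimal sketch of such an event queue, where priority dominates and distance only breaks ties, might look like this. The function names and priority values are illustrative assumptions rather than Blacklist's actual event API.

```python
import heapq

def push_sound(queue, priority, distance, description):
    """Queue a sound event. heapq is a min-heap, so we negate the priority:
    higher-priority sounds pop first regardless of distance, and distance
    only breaks ties between sounds of equal priority."""
    heapq.heappush(queue, (-priority, distance, description))

def next_sound(queue):
    """Pop the sound the NPC should react to next, or None if the queue is empty."""
    if not queue:
        return None
    _, distance, description = heapq.heappop(queue)
    return description, distance

events = []
push_sound(events, priority=1, distance=2.0, description="teacup falling")
push_sound(events, priority=9, distance=50.0, description="gunfight")
```

With this ordering, the distant gunfight is handled before the nearby teacup, which is exactly the behaviour the distance-only approach gets wrong.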
So first up, how do you ensure a guard hears distant sounds they should be able to hear, while ensuring sounds they shouldn't be able to hear are ignored? A straight-line distance from the point of the sound to the guard's ear might work fine if the sound was made in the same room. But if the sound was in another room, whether the guard should still hear it is highly dependent on the layout of the environment. And sadly, measuring real-world acoustics is rather expensive to do, so why not fudge it to within some approximation?
This is where Blacklist is able to exploit the previously established TEAS system. Given it provides an abstract mark-up of the environment, including the chokepoints, the distance of a sound can be calculated as the sum of straight-line distances from the point of origin, moving through the chokepoints, back to the guard. Hence we can more accurately measure a sound that travels 50 meters to reach a guard, despite the object that made the sound being only 10 meters away.
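That sum-of-hops calculation is easy to sketch. Here the chokepoints are assumed to already be ordered along the sound's route (in the game that ordering would come from the TEAS mark-up); the coordinates below construct exactly the 10-meters-direct, 50-meters-propagated example from above.

```python
import math

def dist(a, b):
    """Straight-line distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def propagated_distance(source, guard, chokepoints):
    """Sum of straight-line hops from the sound source, through each
    chokepoint the sound must pass, to the guard."""
    path = [source] + list(chokepoints) + [guard]
    return sum(dist(path[i], path[i + 1]) for i in range(len(path) - 1))

# A guard 10m away as the crow flies, but the sound must detour through
# two doorways, travelling 50m in total: 20m + 10m + 20m.
source, guard = (0.0, 0.0), (10.0, 0.0)
doors = [(0.0, 20.0), (10.0, 20.0)]
```

Comparing `propagated_distance(source, guard, doors)` against the guard's hearing radius, instead of the raw `dist(source, guard)`, is what stops sounds from leaking through walls.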
But despite that, it can still feel unfair, especially as a player is learning an environment and the guards' behaviours. Hence to provide some balance for gameplay, the hearing of NPCs in Blacklist is weaker if they're not on screen. It doesn't make them deaf; they just don't hear quite as well when the camera isn't looking at them. And naturally their hearing becomes sharper on higher difficulties.
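One simple way to realise that balancing rule is to scale the hearing radius by whether the NPC is on camera, and again by difficulty. The 0.6 off-screen factor and the difficulty multiplier here are illustrative values, not figures from the game.

```python
def effective_hearing_radius(base_radius, on_screen, difficulty_scale=1.0):
    """Shrink an NPC's hearing radius when the camera isn't looking at them.
    The 0.6 off-screen factor is an assumed, illustrative value."""
    factor = 1.0 if on_screen else 0.6
    return base_radius * factor * difficulty_scale

def can_hear(sound_distance, base_radius, on_screen, difficulty_scale=1.0):
    """True if the (propagated) sound distance falls within the NPC's
    effective hearing radius."""
    return sound_distance <= effective_hearing_radius(
        base_radius, on_screen, difficulty_scale)
```

A sound 12 meters away is heard by an on-screen guard with a 15-meter radius, but missed by the same guard off screen; raising `difficulty_scale` above 1.0 sharpens hearing on higher difficulties.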
And of course, we need to consider how audio can be used as part of the feedback system to players. Characters should be communicating what they're doing as a result of having seen or heard something nearby.
Blacklist, like many other Splinter Cell games, has a bark system whereby AI characters run lines of dialogue when executing actions. This is a common tactic for action games, with titles as early as Call of Duty 2 pushing the concept. But stealth games require a lot more context and information. Hence the bark system has tiers that range from generic information, such as saying 'I saw something', to specifics like 'I saw the light go out over there'. It provides more direct feedback to the player, allowing them to better understand how their actions are recognised and what guard NPCs are doing in response.
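The tiered fallback can be sketched as a list of bark lines ordered from most to least specific, each gated on the context the NPC actually has. The context keys and lines here are illustrative, not Blacklist's real bark data.

```python
# Tiers ordered most-specific first; each entry pairs the context keys a
# bark requires with the line itself. All entries are illustrative.
BARKS = [
    ({"stimulus", "location", "detail"}, "I saw the light go out over there!"),
    ({"stimulus", "location"},           "Something moved over by the door!"),
    ({"stimulus"},                       "I saw something..."),
]

def pick_bark(context):
    """context: dict of what the NPC actually knows about the stimulus.
    Returns the most specific line whose requirements are met, falling back
    to generic lines when specifics are missing."""
    known = {key for key, value in context.items() if value is not None}
    for required, line in BARKS:
        if required <= known:  # all required context keys are known
            return line
    return None
```

A guard who knows the stimulus, where it happened and what it was delivers the specific tier; one with only a vague stimulus falls through to 'I saw something...'.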
Social & Contextual Knowledge
The last pillar of stealth is establishing social and contextual knowledge for AI characters. It is arguably the one that receives the least attention in stealth games, but also the one that really helps reinforce the gameplay and can elevate a stealth game from good to great. This pillar revolves around the ability of AI characters to act like they're in the moment and reinforce the performance theatre of the game. This includes showing an understanding of which characters they are interacting with, but also a richer understanding of how the world is changing around them. It is the last pillar because it leans heavily on the others, given the collection of sensory and knowledge systems is what helps a character understand the world.
First of all there's social context. Typically in games, managing multiple AI characters at the same time is difficult, especially if they're capable of acting both independently and together. The usual solution is to have a separate system manage any co-operative behaviour while the NPCs themselves don't know each other exists. This works well in shooters such as F.E.A.R. and Halo, but stealth games - where one of you going missing is kind of a big deal - need something more nuanced. In Blacklist, when groups of NPCs are hanging out and talking to each other, they're actively aware of one another in the space and on occasion will communicate. Hence if you listen in carefully, guards reinforce the audio pillar by having conversations that take the current context into greater consideration. If one hears a noise, the other might agree or dismiss it as something else. Plus, tying back into environmental awareness, while doors being opened can be a red flag, noticing your group of four is now three is arguably more important. Hence there is a stronger understanding of changes in the environment, and dialogue that matches it: either highlighting their understanding of what is happening or sharing useful information in combat, such as when the player is shooting at them from a position they can't reach on foot.
In addition, there is a stronger understanding of when things are going wrong. Should a character hear gunfire or another character yelling, or find a dead body, they will move into an alert state and either search or start combat if the player is exposed. But critically, they won't lower themselves from this raised awareness state once they have entered it. This is an important feature that many games miss, and when they do, it looks awkward: if a guard finds his buddy dead, naturally he'll go into a heightened state of awareness, maybe investigate the body or search the area, but he shouldn't drop back into an idle state afterwards, all the while ignoring his buddy still lying dead on the floor.
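One way to model that one-way escalation is an awareness level with a 'floor' set by major events, so the guard can calm down after a fruitless search but never drop below the state a dead body put them in. The state names and the floor mechanism are a sketch under my own assumptions, not Blacklist's actual code.

```python
from enum import IntEnum

class Awareness(IntEnum):
    IDLE = 0
    SUSPICIOUS = 1
    SEARCHING = 2
    COMBAT = 3

class GuardAwareness:
    """Awareness can rise freely, but permanent events (like finding a body)
    raise a floor it can never drop below - so guards don't return to idly
    patrolling past their dead comrades."""
    def __init__(self):
        self.level = Awareness.IDLE
        self.floor = Awareness.IDLE

    def raise_to(self, level, permanent=False):
        self.level = max(self.level, level)
        if permanent:
            self.floor = max(self.floor, level)

    def calm_down(self):
        # De-escalate one step (e.g. a search turned up nothing),
        # but never below the floor.
        self.level = max(self.floor, Awareness(max(self.level - 1, 0)))
```

A guard who finds a body (a permanent raise to SEARCHING) and then enters combat can calm back down to SEARCHING, but repeated calls to `calm_down` will never return them to IDLE.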
Providing a sense of fairness in gameplay is always a challenge. Depending on the type of game you're making and the systems within it, how you communicate those inner workings to players while maintaining balance and novelty is a new problem with each new project. And with stealth games you're arguably playing on a higher difficulty level, given the communication of those systems becomes a core dynamic. If players recognise their actions have consequences and the game's responses feel fair in that context, then you're on the right track to building balance for play.
- Walsh, 2014: "Modeling Perception and Awareness in Splinter Cell: Blacklist", Game Developers Conference (GDC), 2014.
- Walsh, 2015: "Modeling Perception and Awareness in Splinter Cell: Blacklist", Game AI Pro 2, Chapter 28, 2015.