Before beginning this admittedly long and dry essay, I feel I should explain my motivation for writing it, beyond simply providing “solutions” to crunch. As a student of game design, hearing about the intense working conditions of the industry fills me, to be frank, with anxiety, and I suspect I’m not alone in this sentiment. I do not doubt that I will pursue game development professionally, not in the slightest. But when I regularly hear about games that exacted incredible human costs, and about a general acceptance that crunch is simply a part of game development, I can’t help but feel like a stupid child sprinting straight into a minefield. So, I hope this paper helps the few developers who read it to make their working lives at least noticeably better, but I also hope it gives students like me some agency over their futures, which can be an incredible power to have during one’s formative years.
Video game development is seen, first and foremost, as a creative endeavor. Game developers are passionate people who were lucky enough to find work they enjoy. They create and play games in an ever-growing industry worth billions, and express themselves through the effort they put in. This is the mainstream understanding of game development, and its delusions are evident to anyone who dares to peer further into the industry. Yes, video game development is a passion-fueled industry, but it is not the “play” that many imagine it to be. Regardless of how enjoyable a developer’s work is, it is still work done for the money that pays their bills and puts food on the table. So, if their manager gives them two, ten, or twenty more tasks, then they have to finish them. Otherwise, nothing stops a new, supposedly more passionate applicant from taking their place; if someone is not getting their work done, then they must not be passionate, after all. What happens, then, if by the end of the day they have not finished all of their tasks? They keep working. They sleep in the office. They don’t stop until the job is done. Behind the romanticized depiction of game development lies a hard truth of deadlines, often toxic company culture, and overwork.
It is a common occurrence within the games industry for development teams to labor for extraordinary hours, under intense pressure, to complete a video game. Working unpaid overtime to reach looming deadlines is so frequent that it has spawned a shorthand term: crunch. Among many developers, crunch is considered a necessary part of development, or at least an unavoidable one. However, crunch has serious implications for the physical, mental, and social health of developers. It compromises team productivity, results in a hemorrhaging of talent and experience as burnt-out developers escape, and breeds a workplace environment that reinforces crunch as an expectation, referred to as “crunch culture”. In the past few years, games whose development was defined by intense and extended periods of crunch have come under scrutiny, leaving the industry with the lingering questions “why?” and “what do we do?”.
While some developers and gamers would argue that crunch is simply a part of game development, in this paper I approach it as a problem that must be solved for the well-being of developers and the longevity of the industry. I will explore the causes of crunch, splitting them between two classifications: sources, the circumstances and practices that motivate crunch, and methods, the cultural contexts and perspectives that instruct developers to crunch in response to those sources. I then compile solutions to address the causes outlined, separating them into three levels: team, project, and individual, each targeting its respective scale within a games studio. These classifications provide a useful system for the future appraisal of causes of, and solutions to, crunch that are not addressed in this paper. The goal of this paper is to arrive at a detailed set of suggestions such that, no matter the circumstance or position a developer may be in (from manager to contractor), they can immediately act to at least minimize crunch, and at most eliminate it.
Why Crunch Is A Problem
The unadulterated incentive to crunch for many teams is to accelerate progress on a project that is behind schedule, although the causes of crunch are significantly more complex than that, as the next section will show. On the surface, it seems reasonable to conclude that working longer and harder will complete more tasks. However, this has proven to be a fallacy. While crunch tends to improve productivity in the short term, sustained crunch, of about four weeks or more, actually decreases productivity compared to working regular 40-hour weeks (Take This, 2016, p. 6; Keith, 2020, 33:28). Edholm and Lidström (2016, p. 9) distinguish between types of crunch, pointing out that the only category that results in improved productivity is “mini crunch”: a period lasting at most two weeks.
It would be easy to say that shorter periods of crunch are worth pursuing, but a team’s situation is often more complex. Despite their boon to productivity, mini crunches are a minority compared to longer, or more intense, crunches (Edholm & Lidström, 2016, fig. 5). If it were as simple as choosing to crunch in short spurts, developers would already be doing so. Regardless of whether shorter periods benefit the project, the decision of how long to crunch is not made in a vacuum. Rather, it is informed by factors within the project to such a degree that it could hardly be considered a choice at all. This lack of control over what type of crunch a team will engage in, if pressured to do so, makes targeting mini crunch as the sole form used simply unrealistic.
Even if a team is given the freedom to choose mini crunch, the risks to their well-being outweigh any benefits it may have for the project. Across all types of crunch, Edholm and Lidström (2016) observed a consistent increase in stress and fatigue among developers. Developers under crunch tend to be sleep deprived, eat poorly, develop unhealthy habits, and are more likely to suffer from depression and anxiety (Take This, 2016, pp. 8-9). Crunch is also notorious for its effects on developers’ work-life balance. Being at work, rather than at home, tends to strain relationships with family and loved ones, and the adverse health effects spread stress and anxiety to those close to developers (Take This, 2016, p. 9). The incredibly damaging effects crunch has on health are the root cause of all other problems related to it. If even the shortest periods of crunch can result in such complications, there is no way to reasonably enforce crunch and expect developers not to experience these effects, along with all the other detriments that stem from the degradation of their health.
The most common solution individual developers find to avoid such a situation is simply to leave the industry. The effect this has on development teams is a severe lack of experienced personnel. Due to the constant stress endured during crunch, burnout is a regular occurrence. This, combined with older developers with families wanting greater work-life balance, can push those with the most experience to quit (Take This, 2016, p. 11). As Seth Coster (2020, 0:08) notes, those who have been in the games industry long enough are called “veterans”, not “experts”. The mark of long-time developers is not expertise and knowledge; it is survival. Coster points to crunch as the reason for this. As video games grow in scale, complexity, and cost, the need for a large and reliable body of experienced developers will grow as well. The lack of such a body warns of an extremely unsustainable industry environment: projects are at a higher risk of failure when they lack experienced production leads to successfully navigate development, and costs increase because onboarding new developers requires resources that would not otherwise be spent.
Punctuating all of this is crunch’s often self-perpetuating behavior. As developers become fatigued from crunch, they become more likely to make mistakes, leading to more work. This inadvertent increase in tasks then pushes developers to crunch even more (Edholm & Lidström, 2016, pp. 11-12; GDC & Coster, 2020, 23:35). This is the reason for the aforementioned drop in productivity, as well as a source of decreased quality in the final build of a game (Take This, 2016, p. 10). On top of this, those with experience who remain in the industry will likely hold one of three views on crunch: supporting its use, since that is what they expected of themselves; accepting it as a part of game development; or adamantly opposing it. The state of the industry suggests the first two are more common, which follows from the reasoning that those who oppose crunch are more likely to be among those who leave because of it. As a result, as experienced developers are promoted to managers, there is a greater likelihood that they will expect their teams to crunch just as they did, or be unfamiliar with effective ways to prevent it.
Crunch causes severe health risks for those it is imposed on, can compromise the quality of a game, results in a loss of experience and talent, and fails at achieving the very thing it is intended for. Encompassing all of this is its self-perpetuating nature, both through the problems it causes and through the toxic workplace environment it creates. Crunch has no consistent benefits, and even when it does contribute to a project, the negative effects are monumental by comparison. To quote Coster (2020), who refers to crunch as “heroics”:
Heroics are the signature thing that the games industry is known for…[and] are a signal that all of your systems have failed. This means that our industry is best known for being really bad at our jobs. And we’re stuck. (24:14)
Causes of Crunch
It is worth mentioning that there are causes of crunch beyond what will be detailed here. One could argue that the segment of the gaming community supporting crunch, labor laws allowing unpaid overtime, or the capitalist incentive to publish more games more quickly are all contributing factors. However, all of these problems exist at a societal scale over which development teams, or even the industry as a whole, have little to no control. The goal of this section is to highlight causes that exist at the local scale, within development teams and between projects, which can then be realistically and directly addressed. There is one exception to this, NDAs, the purpose of which will become apparent later on. The causes of crunch are deeply interconnected, so to better understand their relationships they will be split between sources, the circumstances and practices that motivate crunch, and methods, the cultural environment and perspectives that instruct developers to crunch in response to the sources.
Sources of Crunch
The most common source of crunch, according to developers, is deadlines (Edholm & Lidström, 2016, p. 9). Seeing as reaching deadlines is part of the core motive to crunch, the frequency with which blame is placed on them is to be expected. Among Edholm and Lidström’s (2016, fig. 5) classifications of crunch, “continuous crunch”, which lasts the entirety or a large portion of development, and “final crunch”, which occurs at the end of a project, are the most common types. This result supports the view that the final release date of a game is a team’s highest-priority deadline, that this deadline is treated as immovable, and that it regularly motivates crunch. However, lacking ample time to complete necessary tasks is only one half of the original incentive to crunch, the other being having enough tasks to warrant it. This imbalance can be viewed as the ultimate source of crunch, as all other sources act upstream of it, influencing the task-to-time ratio development teams are confined within. Each of these upstream sources can be sorted into the stage of development in which it resides (pre-production, production, or post-production), which further demonstrates the cascading effect they have on one another.
The goal of the first stage of development, pre-production, is to establish the scope (the features and assets implemented in the game) and timeline of development; failure to do so accurately sends ripples throughout the entire project. In other words, pre-production initially defines the task-to-time ratio. Among the most common issues that lead to crunch are a poorly defined or inaccurate timeline and scope (Edholm & Lidström, 2016, fig. 3). Leaving a project plan in such a state opens development up to a series of challenges and complications. A central theory in project management is adherence to the “iron triangle” of scope, time, cost, and quality (Caccamese & Bragantini, 2013, p. 4). The iron triangle posits that a project must remain within these constraints to succeed. The goal of a project manager is then to maintain a project’s scope and timeline so as not to exceed costs or sacrifice quality. Beginning a project without a properly defined iron triangle will inevitably place strain on development. Having a “realistic schedule is seen as a critical success factor” (Kuutila et al., 2020, p. 9) because of the difficulties that ensue without one.
During production, the effects of inaccurate estimations, combined with inefficient and ineffective management, produce the bulk of the sources of crunch. One of the most common is “feature creep” (Edholm & Lidström, 2016, fig. 3), where features are added mid-production to the point that the originally planned scope is no longer accurate. This is usually the result of irresponsible management allowing the unmitigated implementation of unplanned features (Edholm & Lidström, 2016, pp. 9-10). Remaining steadfast to the original plan can be difficult, especially if stakeholders are pressuring the team to add features, but it will be near impossible if the original plan is not properly defined to begin with. Feature creep populates the “chaos” that Keith (2020, 12:40) describes as characteristic of production, the final crunch classified by Edholm and Lidström (2016, p. 9), and the iteration (repeating production processes until a sufficient result is achieved) that Archontakis (2019, p. 52) points to as a key source of crunch. Chaos, final crunch, and iteration all stem from the lack of a defined plan to guide production. Since pre-production failed to identify the goals and limitations of the project, the team engages in chaos and iteration mid-production to “find the fun”. The time these consume severely upheaves the schedule, however poorly defined it may be, sending the team into final crunch. The severe, unresolved technical debt typically incurred by final crunch then pushes developers to crunch further. Coster (2020, 16:00) describes the ways resources are wasted during production, emphasizing that crunch perpetuates waste, and Edholm and Lidström (2016, fig. 3) indicate that the largest share of that waste is in bug fixing, as “technical issues” are among the most common causes of crunch.
In post-production, a final source of crunch can appear: a lack of publicized records. A significant part of why estimation may be inaccurate during pre-production is the lack of experience to draw from. One of the primary contributing factors to failed projects, not just in game development, is a lack of “historical data from similar projects” (Kuutila et al., 2020, p. 8). The games industry is particularly notorious for limiting access to knowledge and experience through NDAs (non-disclosure agreements), as O’Donnell (2014) argues:
Estimation in particular requires experience...If estimations are routinely ignored because they are either dramatically under- or overestimated, how does that affect project deadlines?...The (in)ability for game developers to learn and share information about game production practices severely limits the capacity for the industry to mature. (p. 42)
The reluctance to add records and postmortems to a public pool of knowledge undermines any purpose experience can serve. The cascading effect between sources of crunch is not isolated to single projects; it also influences future projects, increasing the likelihood of encountering the same challenges again. However, a team facing such problems is not enough to explain crunch’s prevalence in the industry. The methods, or the cultural context, by which teams reach the conclusion that crunch is their best solution must also be accounted for.
Methods to Crunch
The method at the forefront of the industry’s consciousness is crunch culture: a company or team culture in which crunch is expected of developers, and regular or constant crunching is viewed as good practice. Not all development teams have a crunch culture, per this definition. However, there still exists a general sentiment within the industry that crunch is an unavoidable part of development (Cote & Harris, 2020; Edholm & Lidström, 2016; Peticca-Harris et al., 2015), which is a passive manifestation of crunch culture. The factors that lead to the development of such an environment are deeply ingrained within the industry, and culminate in a relationship similar to that between the sources of crunch and the imbalance of tasks and time. In this way, crunch culture can be viewed as downstream of all other methods that inform developers to crunch. Those methods, namely employment pressures, the image of the passionate worker, the perceived unmanageability of development, and resistance to management processes, act in a multi-pronged fashion, both normalizing crunch and preventing resistance to it.
The first two methods, employment pressures and the image of the passionate worker, go hand in hand. Peticca-Harris et al. (2015) argue that the project-based nature of the industry, and thus the portfolio-based nature of employment, makes developers solely responsible for maintaining their own employability. In this environment, developers are pitted against each other in maintaining “their peer and portfolio-based reputations” for favorable employment and promotion opportunities (Peticca-Harris et al., 2015, p. 573). This drives developers to adamantly pursue the image of the ideal employee. Cote and Harris (2020) identify this image as one of a developer who is “passionate and dedicated to the industry ‘because they love the work’” (p. 170). Edholm and Lidström (2016) note that “colleagues staying late can be a big influence” (p. 10) on whether a developer stays late themselves, and that crunching is considered part of the industry’s “hard work ethic” (p. 12). The idealized image of a developer without the need for rest serves as a principal goal within the industry, prompting developers to behave accordingly in the pursuit of perfection, typically expressed through an “over-indulgence” in their work, which justifies crunch as an expression of their passion.
This should not distract from the honest passion individual developers have, though. Edholm and Lidström (2016, p. 10) clarify that it is not uncommon for developers to purposely push themselves for the betterment of the project of their own accord. Such a decision is still “constrained by the social context, institutional conditions, and social rights involved” (Peticca-Harris et al., 2015, p. 576), but that does not negate developers’ intrinsic motivation. This distinction between the expectations placed on developers and their own drive is important to make. While the two appear similar, if not identical, from the outside, their origins are fundamentally different, and they must be approached differently as a result. And, regardless of its origin, the individual passion of developers is still a method to crunch, making it a target of this paper.
The last two methods, the belief that development is unmanageable and resistance to management frameworks, serve to prevent developers from searching for alternatives to crunch. These two methods are observed by Cote and Harris (2020) in the rhetoric used among developers, and by O’Donnell (2014) as they physically manifest within the industry. The perspective that games are unmanageable is pinned to their being creative projects, not simply exercises in software development, and to the lack of control developers have over their own projects, particularly due to publisher influence (Cote & Harris, 2020, p. 167; O’Donnell, 2014, pp. 139-140). This works in tandem with what Cote and Harris (2020, p. 169) refer to as an “anti-corporate ethos”, a nostalgic preference for games’ indie history of garage development, to effectively dissuade developers from pursuing management frameworks and development practices that can mitigate crunch. Since development is treated as unmanageable, attempts to manage it can be expected to meet resistance, whether as a challenge to the status quo or out of perceived futility. However, O’Donnell (2014) further elaborates that this perception is informed by the circumstances developers find themselves in, particularly the “norms of secrecy” (p. 140) caused by NDAs, clarifying that rhetoric is not the only factor contributing to its construction.
This separation of causes between sources and methods is valuable, as it communicates that crunch is not the “natural” solution to challenges faced in development. The sources are inherent risks and challenges of project-based work. The decision to crunch, however, is not purely caused by them. Rather, the cultural context a project and its developers exist in informs which solutions they pursue. In a different context from the one dominant in the games industry, the typical response to such problems might be to immediately delay the project, or to cancel it outright. And while this classification makes for a conveniently simple narrative, in which developers encounter difficulties and are then steered toward crunch as the solution, it should not distract from the complex interconnections between the two. For example, NDAs create a “culture of secrecy” (O’Donnell, 2014) that both perpetuates the sources of crunch by restricting the spread of knowledge, and reinforces the methods of resistance against crunch mitigation by creating an environment that justifies them. The reality of game development is messy, and so too are the causes of crunch. However, unraveling and understanding them is key to finding their solutions.
Solutions to Crunch
Unlike the causes of crunch, the solutions will be divided into three levels: team, project, and individual. The departure from sources and methods is due to that classification being only a theoretical framework. While it is useful for identifying appropriate approaches to individual causes, the jump from theory to reality can often be inelegant, and defeat its own purpose. So, solutions will instead be organized by the scales at which they should be deployed. This way, they are directed with intention, and lead to action within a real context. This also provides “layers of defense” against crunch, rather than simply presenting a disorganized collection of practices. Depending on the environment developers find themselves in, it may be unrealistic to use certain solutions from certain levels. If, for example, solutions meant to be deployed at the team level are not possible because the team is brand new, or unfamiliar with each other, then the project and individual levels still remain. Of course, as levels are removed, the likelihood of crunch increases. If only the individual