The cruel optimism of “good crunch”: How game industry discourses perpetuate unsustainable labor practices

The video game industry’s labor practices have become an increasingly common topic of discussion throughout game studies and the gaming community, especially when it comes to “crunch” or periods of intense, extended overtime. Despite this attention, crunch persists, and the industry’s tendency to distinguish externally mandated or excessive crunch from self-directed or scheduled crunch continues to be problematic. This article considers the distinction between “good” and “bad” crunch as a form of cruel optimism, in which the idea of a tolerable crunch actually prevents the game industry from imagining how to produce games without any crunch. Drawing on a critical discourse analysis of industry trade press—specifically Game Developer magazine and Game Developers Conference presentations—this research demonstrates how viewing any form of crunch as acceptable quells potential innovations in video game production and locks developers into an unsustainable cycle. We encourage developers instead to rethink labor practices more expansively.

Throughout the video game industry, periods of "crunch" (extended stretches of drastic overtime) are commonplace (e.g. Schreier, 2016, 2018; Vanderhoef and Curtin, 2015). More importantly, many gameworkers and gaming communities around the world have normalized this practice, accepting the idea that it is nearly impossible to produce exciting, memorable games without at least some crunch. Take, for example, developer CD Projekt Red (CDPR), whose highly anticipated game Cyberpunk 2077 won several awards before it was even released (Sinclair, 2020). However, Cyberpunk was delayed multiple times; originally slated for publication in April 2020, its release was pushed back to September, then November, and again to December (Hall, 2020; Roberts, 2020). Games journalists such as Jason Schreier (2020a, 2020b, 2020c) reported that developers at CDPR had crunched at least periodically since 2018 (Good, 2020). In contrast, some CDPR employees have disputed Schreier's claims that they were subjected to mandatory crunch, instead arguing that the development team voted to move into 6-day weeks to avoid further slippage in the release date (Periwal, 2020). Fans have also derided Schreier's critiques, as Polish labor laws require CDPR developers to be paid overtime (Hernandez, 2020).1

This disagreement surrounding Cyberpunk's development is in many ways an exemplar of broader discourses around crunch in the video game industry. As we and others have argued (Cote and Harris, 2021a; O'Donnell, 2014; Peticca-Harris et al., 2015), forces such as games' production structures and culture of secrecy have worked to normalize crunch. This practice is often framed as a necessary evil for producing games (Clayton, 2019; Frank, 2018; Good, 2018, 2020). While all of these factors contribute to crunch's persistence, we suggest an additional reason why this grueling practice endures: game developers distinguish between "good" and "bad" forms of crunch.
This comes through clearly in the Cyberpunk example, where developers are not arguing over whether or not crunch occurred; rather, they are arguing over whether it was mandatory, and therefore bad, or voluntary, and therefore good. Using an inductive critical discourse analysis of various game development sources, we found that developers tended to label externally motivated crunch, pushed by publishers, managers, and console producers or technical issues, as "bad crunch" while simultaneously presenting autonomous, self-directed crunch, or crunch that results in an award-winning product, as "good crunch." This builds a relation of cruel optimism, whereby the idea that developers can achieve a good crunch encourages repetition of the system rather than more diverse reimaginings of labor practices. Good crunch faces many, if not all, of the issues that result from bad crunch, so its valorization perpetuates "a relation of attachment to compromised conditions of possibility whose realization is discovered either to be impossible, sheer fantasy, or too possible, and toxic" (Berlant, 2011: 24).
In other words, while it is heartening to see growing interest in game industry working conditions, there is a tendency to frame developers' concerns in terms of managing overtime or providing fair compensation rather than in terms of preventing crunch practices as a whole. Given that the Cyberpunk incident is just one example in a rich history of discourses that valorize extended overtime in service to creating memorable games, we argue that this results in a system in which developers strive for a good crunch rather than a fully crunchless production. Thus, we push for developers to break out of the good/bad crunch distinction, release the cruelly optimistic idea that there can be such a thing as "good crunch," and instead focus on fully reconceptualizing game production.

Crunch and its costs
Abundant popular press coverage and a growing body of academic work have so far been ineffective at preventing the frequent use of crunch throughout game development. This is unfortunate, as crunch can lead to extreme fatigue, diminish employee mental health, create tension in families, and contribute to a pervasive overtime culture (Fujigaki, 1996; Kuutila et al., 2018; Nishikitani et al., 2005; Peticca-Harris et al., 2015; Wright and Cropanzano, 2000). This culture of overtime is cyclical and self-perpetuating; senior employees experience burnout at higher rates, leading to struggles throughout the industry to retain qualified managers (Consalvo, 2008; Deuze et al., 2007; Kerr, 2011; O'Donnell, 2014). Without management experience, teams can struggle to schedule and complete game production efficiently, a problem they often try to solve with crunch.

Contributing factors
Previous research has also identified financing models, reliance on passionate and largely disposable workforces, and the game industry's lack of unionization as contributors to crunch (Fahey, 2011; Johns, 2006; Kerr, 2011; Legault and Weststar, 2016; Schreier, 2016; Weststar and Legault, 2017). As game budgets have grown over time, developers have become increasingly reliant on funding from outside publishing companies and are subsequently beholden to those publishers' whims. While the extent to which this happens can depend on studio size, project budget, and level of independence (e.g. big-budget AAA games vs smaller independent projects), elements of what researchers have called the "iron triangle" exist across studio models. The iron triangle describes the relationship between project management and publishers, demonstrating how managers allow budget concerns, deadlines, and the publisher's product expectations to dictate the production process (Legault and Weststar, 2016; Peticca-Harris et al., 2015). Project managers' resulting inability to effectively schedule deadlines and milestones then becomes a factor in their continued reliance on crunch.
The gaming industry also attracts ardent employees whose desire to work in game development often emerges from their romanticized relationship with gaming as a hobby, leading to a workforce that can be taken advantage of by management (Bulut, 2020; Consalvo, 2008; Deuze et al., 2007; Dyer-Witheford and De Peuter, 2006; Harvey and Shepherd, 2017; Hoffman, 2004; Kerr, 2011; Rockstar Spouse, 2010). Consalvo (2008) found that the game industry suffers from high turnover and burnout rates, leading to a reliance on young, passionate employees who are willing to work longer hours to make up for their general inexperience. This culture of overtime (O'Donnell, 2014) is exacerbated by game development's project-based nature, where companies frequently hire and fire employees based on available funding and the skills needed at the current stage of development (Legault and Weststar, 2016; Peticca-Harris et al., 2015). Recent results from the International Game Developers Association's (IGDA) Developer Satisfaction Survey (Weststar et al., 2019) show that while 71% of developers define themselves as permanent employees, these respondents worked at an average of 2.2 companies over 5 years. Freelancers/contractors moved even more, averaging 4.1 workplaces in the same time period. Therefore, while many employees are "permanent" and salaried (which can exempt them from overtime pay), their employment is still not secure; they often have to consider where their next project may come from. Precarity is thus contextual and variable, but also very hard to avoid (Bulut, 2020). Finally, because so many people are passionate about development, employees constantly confront the feeling that they are replaceable: that there are numerous other people waiting to take their job if they do not go above and beyond their employer's expectations (Consalvo, 2008; Hoffman, 2004).
Employees' passion and lack of job security highlight some of game development's key risks. Interestingly, as will be described further below, passion and uncertainty can be an issue even in countries with robust labor laws; for instance, one video from our dataset described how Swedish developer Massive Entertainment got around national labor restrictions by asking employees to work "voluntary" overtime and to forego their legally mandated vacation days. Passion and precarity allow crunch to be built into the overall system of global game development, suggesting further interventions are needed.

Potential solutions
As a relatively young industry which resists professionalization (Cote and Harris, 2021a), which lacks a tradition of collective organizing (Woodcock, 2016, 2020), and which is global and therefore deals with competing legal and labor expectations (Deuze et al., 2007), the video game industry has long resisted unionization. However, recent IGDA surveys found that a majority of game developers supported organizing in a union (Weststar and Legault, 2017). The success of the 2016 video game voice actor strike, organized by SAG-AFTRA (the Screen Actors Guild-American Federation of Television and Radio Artists) (Ng, 2016), and of the 2020 game writers' strike, organized by non-unionized, freelance employees at Voltage Entertainment (Parrish, 2020), also signals a growing interest in collective action, as does the rise of the Game Workers Unite (GWU, n.d.) movement. GWU has launched autonomous chapters since 2018 in the United Kingdom, Ireland, Scotland, and Finland, as well as two in France, and seeks to provide comprehensive representation for developers across the globe (Brown, 2018; Game Workers Unite, n.d.; Milner, 2018).
Unionization would help address many contributors to crunch. Weststar and Legault (2017) found that most developers support the idea of an industry-wide union, like the Screen Actors Guild or the Writers Guild of America, over company-based union models. This is because of games' project-based nature; an industry-wide union, which would set standards for different positions, would help protect workers as they move between jobs at different companies. Furthermore, base standards for hours or pay could guard against the exploitation of passionate workers.
Despite this potential, unionization may not fully solve issues of crunch and quality of life. As will be explored in the analysis, our data show that crunch occurs globally across various levels of development, and even workers who have labor protections are encouraged to put in extra hours (e.g. Hernandez, 2020; Sydow, 2008). Unions therefore may not protect workers against more informal expectations, for example, that they be available for calls outside working hours (Moody and Kerr, 2020), or from the idea that they are making necessary sacrifices for the good of their company or product. The idea that employees' passion drives them to engage in "good" crunch acts as a form of cruel optimism, sustaining crunch practices even in situations where there are organized efforts to improve working conditions. Fully reimagining games' labor systems will likely require developers both to engage in tactics such as collective organizing and to forgo problematic understandings about crunch.

Berlant (2011) describes cruel optimism as relationships in which "something you desire is actually an obstacle to your flourishing [. . .] They become cruel only when the object that draws your attachment actively impedes the aim that brought you to it initially" (p. 1). An object of optimism promises some form of solution to a problem; when it fails to deliver on that promise, or delivers only a partial solution, Berlant argues that it becomes cruel optimism. In the case of crunch and labor practices, this research considers "good crunch," the self-imposed or scheduled crunch that developers endure to make the best games possible, as a form of cruel optimism. This is because developers set good crunch up in opposition to (indeed, as a solution to) the problems of bad crunch, or extended overtime imposed on developers by publishers or technology.
Good crunch, though, possesses many of the same costs as bad crunch (costs to individuals' health, careers, and relationships) and therefore "actually makes it impossible to attain the expansive transformation for which a person or a people risks striving" (p. 2). Any use of crunch affirms the industry's reliance on it, preventing crunchless solutions from being considered. However, as Berlant also points out, "sometimes, the cruelty of an optimistic attachment is more easily perceived by an analyst who observes the cost of someone's or some group's attachment to x" (p. 24). By drawing attention to the cruel optimism that is woven into good and bad crunch, this research aims to encourage developers to strive instead for more sustainable and progressive labor practices.

Methods
This article builds on an earlier study in which we conducted an inductive critical discourse analysis of Game Developer magazine, assessing all issues from 2000 to 2010 (125 in total). We focused specifically on this decade as it is the first in which serious conversations around crunch arose in the video game industry. For example, the IGDA introduced their first quality of life white paper in 2004 (Bonds et al., 2004), and significant controversies around crunch occurred in 2004 and 2010 (Hoffman, 2004; Rockstar Spouse, 2010). Our goal was to understand how crunch persisted despite growing attention to its costs. We assessed each issue of the magazine in its entirety and coded it using an inductive, grounded theory approach where patterns and themes emerged from the data, rather than researchers' preconceptions (Lindlof and Taylor, 2002). We paid particular attention to the magazine's postmortem features, or articles where development teams detailed five things that went right and five things that went wrong during a game's production. The results of our initial analysis have been previously reported (Cote and Harris, 2021a). As we dug further into those data, though, we found the contrast between good and bad crunch merited further investigation.
To explore this theme, we drew on our existing 2000-2010 dataset and extended our analysis to the present by assessing relevant talks presented at the annual Game Developers Conference (GDC) and archived online in the GDC Vault. Searching the Vault for "quality of life" and "crunch" yielded 53 videos; after viewing, 29 were deemed pertinent to the study (see Supplemental Appendix). The remainder used terms like "quality" in their descriptions, but were not about labor practices. Videos were carefully viewed, with the researchers inductively noting emergent patterns. 2 The choice to include GDC videos, rather than continuing to assess Game Developer articles, arose from the fact that Game Developer shut down as a magazine in the early 2010s, fusing with its online sister publication Gamasutra. This changed the publication's structure, making it somewhat unwieldy to analyze. 3 GDC videos, however, share the target audience of Game Developer (working devs), continue to provide an inside look into how industry members discuss crunch among themselves, and occur in annual installments, allowing for more organized analysis.
Throughout these texts, and largely consistent over time, we found that developers were very aware of the costs of crunch, sharing frequent horror stories of lost weekends, destroyed relationships, and even physical health issues arising from overwork. At the same time, they often positioned these consequences as the outcome of bad crunch, while they accepted good crunch as part of the game design process. We begin by assessing the themes of bad crunch and its potential outcomes, before moving on to good crunch as a form of cruel optimism.

Externally driven
The first and most prominent bad crunch theme that emerged in our analysis demarcated bad crunch as resulting from outside sources, such as publisher demands or technological challenges. Common publisher problems included inflexible deadlines, high demands for public relations and marketing materials, or multi-region releases. For instance, the November 2001 postmortem for Rayman Advance cited "the absolute inflexibility of the shipping date (a date that actually moved up a few days near the end of the project when Nintendo moved up the release date)" (p. 48), combined with technology struggles, as creating a high-pressure scenario toward the end of their production. Similar stories appeared in the August 2003 postmortem for Amplitude, the IGDA's 2004 GDC Quality of Life presentation, the May 2007 Elebits postmortem, Coray Seifert's 2013 GDC talk about avoiding crunch, and more. Crunch was blamed on "impossible deadlines" (August 2005, Psychonautic Break postmortem, p. 31) throughout the dataset, and some discussions of quality of life even presented publisher challenges as inevitable. For example, anti-crunch advocate Hank Howie's GDC 2005 talk, "Better Games (and Quality of Life) in 40 Hours per Week," stressed how publisher demands and deadlines were not going to go away, advocating instead for other crunch-prevention strategies. Publisher-set schedules were seen as especially challenging for holiday blockbusters (e.g. Sheffield, 2008) or licensed or movie tie-in games (e.g. Devine, 2012), but concerns resonated beyond these to be a consistent complaint for developers.
External demands for public relations and marketing materials also resulted in some bad crunches. The January 2003 postmortem for No One Lives Forever 2 suggested that the team was unprepared for the number of demos, trailers, interviews, and screenshots they needed for advertising purposes. Having to add these materials into their existing schedule contributed to a production where "most of [the team] worked at least 100 hours per week during the last several months of the project, and some people were crunching even before that" (p. 54). The Conduit's August 2009 postmortem described nearly identical challenges, stating:

While the level of community and industry excitement for THE CONDUIT was enormously appreciated, we didn't initially have the infrastructure in place to effectively respond to the nearly absurd volume and intensity of attention we received. We were continually preparing for industry events and generating demo builds to show off the latest and greatest. (p. 35)

Developers often failed to account for publishers' advertising plans while scheduling their production, resulting in regular crunches.
Both Game Developer articles and GDC talks recognized the challenges of publisher/developer relationships and how much crunch had become a regular component of production. To illustrate, a March 2005 feature article on unionization stated:

It's difficult to pinpoint exactly when crunch "got built into the equation." Certainly, it hasn't always been so. At one time, the game industry consisted of small groups of game enthusiasts working together feverishly and endlessly to build a title they believed in. They worked long hours because they were driven by passion. But over the last 10 years, that model has pretty much changed to a commercially-driven industry inhabited by big, publicly-traded companies. (Hyman, 2005: 17)

This article positioned crunch as endemic to the game industry but increasingly abused by large companies. Long-time developer Laralyn McWilliams similarly cautioned against bad crunch in her 2019 GDC presentation, where she said, "Passion is a two-way street, and in the hands of some employers and some leaders, passion is used as a rope that can bind us. So never forget it's a business." McWilliams argued that while it was good and useful to be passionate about gamework, developers should be careful to avoid externally driven bad crunch mandates. These examples, and many others in our sample, take for granted games' history of internally driven crunch based on developer enthusiasm and projects they "believed in." Their laments do not focus on overwork itself, but rather on how they feel developers' passion is increasingly taken advantage of by large corporations and commercial interests via forced overtime. In doing so, they set up a relation of good versus bad crunch, where externally driven crunch is seen as the problem, rather than crunch itself.

Technological hurdles
In terms of technology, bad crunch arose when console producers failed to share development kits efficiently, slowing down production on new games, or when changing from one set of technologies to another took a dev team more time or energy than anticipated.
For instance, the development of Command and Conquer: Tiberian Sun was hampered by the dev team's choice to switch the "core graphics engine to an isometric perspective in order to enhance the game's 3D look. This resulted in a cascade effect of broken systems that weren't anticipated" (February 2000: 53). Comparable technology-driven struggles appeared elsewhere in the dataset (e.g. the December 2009 Brutal Legend postmortem). Almost all of these issues threw off schedules and necessitated overtime. Again, these themes appeared both in Game Developer and at GDC, as when Rantaeskola and Alonso's 2015 production methodologies talk referenced solving previous projects' memory issues with extra man-hours.
Basic technology issues were bad enough on their own, but publishers' deadlines or demands for multi-region or multi-platform games often intensified problems by requiring extra labor and specific skillsets. The production of Rayman Advance combined a hard publisher deadline with the technological challenge of moving from the Game Boy Color to the Game Boy Advance. The developers wrote, "While we originally estimated GBA development to be twice that of Game Boy Color, it wasn't until six months into RAYMAN ADVANCE that we realized it's more like four times that of GBC" (November 2001: 48). Because they could not move their deadline, this technological challenge was solved via crunch.
Another clear example of a publisher/tech double whammy comes from the February 2004 Secret Weapons over Normandy postmortem, which said:

If releasing a title on a single platform is a sprint, releasing on three platforms is a triathlon. We had gone through the alpha, beta, and submission milestones on the Playstation 2 version, but were beginning to lose steam. However, we had to go through it all again for the Xbox version. Due to the short amount of time remaining, the lines between alpha and beta became blurred. (p. 43)

In this case, publisher demands and deadlines, technological certification processes, and different platform requirements combined to set the stage for crunch. That same year, during the IGDA's quality of life talk at GDC, an audience member asked the panelists for advice on how to negotiate fair schedules with publishers while waiting for a new console's development kit, again displaying the synergistic nature of publishing and technological challenges.

Uncontrolled or endless crunch
Finally, developers described crunch as problematic or "bad" when it was extensive or uncontrolled. To give an early example from our dataset, the December 2000 postmortem for F.A.K.K. 2 described "a crunch mode that lasted nearly five months, and one month of 'super crunch', consisting of seven-day weeks and 12- to 16-hour days" (p. 47). Unsurprisingly, the writers found this schedule made team morale "very low" (p. 47). For a continuing example, the May 2010 article "Tales from the Crunch" includes stories like the following:

There was one year in which two mandatory overtime crunches ran for over a month, one of which involved 10 hours a day, seven days a week "or you lose your job" crunch. [. . .] It was during this time that the flu hit, and I had to take time off. Because of the stress of the crunches, I had used up all my sick days. So after some 120+ hours of unpaid overtime, the company docked me three days of pay for going over the allotted amount of sick time. Outrageous! (p. 9)

Throughout our dataset, devs told many stories about extensive overtime, mandatory working hours, and "the kind of crunch that will make you weak in the knees to think about and give you a thousand-yard stare when you're done with it" (September 2002 Dungeon Siege postmortem: 49).

The impacts of bad crunch
Bad crunch was a regular occurrence in the game industry during the period of analysis, and it came with many costs: developers burned out, faced physical or mental health challenges, and lost time with their families and loved ones. Numerous postmortems ended with plans to adjust for future projects, which seems like a hopeful foundation for interventions into game labor practices. In many of these instances, however, developers objected most strenuously to the cause or duration of the crunch they experienced. Several stories included some form of "regular crunch is fine, but this crunch was terrible" framing, showing how developers perceive bad crunch as the issue, rather than crunch as a whole.

Good crunch
This leads directly into the key counter-themes that we noted in our analysis: how developers positioned some forms of crunch as "good crunch." In contrast to bad crunch, crunch was seen as "good" if it was self-directed and carefully managed. Good crunch also occurred, according to developers, when the resultant game was a success, that is, when the ends justified the means. Indeed, this latter trend often immediately followed critiques of bad crunch, working to dismiss problematic labor practices if games succeeded. Overall, we argue that these themes build crunch into game development systems as a form of cruel optimism, encouraging developers to accept problematic labor practices rather than pushing for more extensive change.

Independent
Numerous articles and talks positioned self-imposed crunch emerging out of developers' passion as a good thing, giving specific examples of times when developers' voluntary overtime improved the resulting game. For instance, an employee from Treyarch pointed out how:

The popular "Nazi Zombies" mode in CALL OF DUTY: WORLD AT WAR [. . .] was originally prototyped by a few enthusiastic team members on their own time. The mode steadily gained converts who pitched in with their own contributions to get the prototype off the ground. Production recognized its potential value as a bonus and made the call to dedicate official resources to it. When it shipped, it was one of the game's most talked-about features. (March 2010: 17)

Another article, which gave developers advice on how to get ahead in such a competitive industry, recommended that they take it upon themselves to solve problems: "then quietly show everyone what you've done. The other programmers will be grateful that you've saved them work and will be impressed with your coding" (May 2009: 20). Going above and beyond was framed as key to success as a developer, even if it required after-hours work. As Bulut (2020) points out, narratives like this redefine work in positive terms, as based in love and enjoyment. However, loving work does not prevent it from having costs or being precarious; encouraging workers to see "voluntary" overtime as a normal part of the job thus constructs a relation of cruel optimism, presenting a positive façade that masks disadvantages.
Game Developer and GDC carefully distinguished self-motivated overtime as distinct from (and better than) externally motivated crunch. This can be seen in examples like the October 2007 postmortem for Crackdown, which stated, "Video games are creative art, and the best developers are inherently passionate about crafting them. However, there's a difference between developers who willingly give overtime, and managers who demand it" (p. 28). That said, the voluntary overtime developers glorified was often extensive and demanding. The May 2000 postmortem for Unreal Tournament celebrated how their project "often saw team members working a 24 hour day, sleeping on a couch for six hours, and then working another 24 hour day" (p. 50). Ten years later, the March 2010 Uncharted 2 postmortem said that the company never mandated crunch but suggested they were able to get away with this because they "hired people with personality types that make them hard-working, willing to accept responsibility, and perfectionists and that led to many months of long hours, late nights, and truncated or skipped weekends" (p. 29). These articles, and many others, framed overtime as something developers wanted and therefore a benefit rather than a problem.
Multiple GDC talks gave comparable arguments. Petter Sydow's 2008 GDC talk about the production of World in Conflict stated, "We also tried to avoid ordered overtime. Ordered overtime is really, really bad for morale. When people feel forced to be there, it really, really slows down production." Instead, he emphasized how Massive Entertainment and the World in Conflict team tried to encourage individuals' sense of responsibility for and excitement about the project, pointing out that employees would then often "make their own decision to stay behind each evening and to come in Saturday, Sunday, and to voluntarily fix the level." This was necessary given Massive's location in Sweden, where labor laws prevent the forced overtime that is common elsewhere, but it also shows how passion-based or individually driven crunch is positioned as inherently better than externally motivated crunch. Xiaojuan He's 2011 GDC China talk, on behalf of Ubisoft Chengdu, similarly argued that "If you're working towards your dream, you will not see it as working overtime." These examples, which emerge from countries with very different labor laws, highlight how notions of voluntary crunch are omnipresent in the global video game industry. Extremely demanding schedules were often dismissed under the guise of developer passion and "self-driven" participation, regardless of whether one was legally protected from overwork or not. 4 Supposedly good crunch, rather than solving the issue of overtime, perpetuated it in a new, cruelly optimistic form.

Carefully managed
Other discourses of good crunch argued that crunch was not a problem provided it was appropriately scheduled or managed, but many examples of scheduled crunch still read as excessive and potentially exploitative. Following a bad crunch on their game Age of Empires, for example, Ensemble Studios implemented the following crunch schedule during the development of Age of Empires II: Age of Kings, ostensibly to protect their developers from worse:

The hours were 10 A.M. to midnight, Monday through Friday, with Wednesday nights ending at 7 P.M. so we could go home to our families. We had weekends off and meals were provided during the week. For the most part this worked very well, although having a "family night" where family members could join us for dinner once a week proved to be more of a distraction than we would have liked. Producer Harter Ryan deserves much credit for making crunch time so much easier on AOK. (January 2002: 56)

Ensemble described a comparable crunch schedule in July 2002 ("10 A.M. to 11 P.M. four nights a week (with Wednesday or Friday off at 6)": 55), showing that this practice persisted.
Other planned crunches differed in their intensity and duration, but scheduling crunch was common even among developers who otherwise resisted overtime. In 2005, regular quality of life advocate Hank Howie wrote,

It's fairly widely recognized that workers can absorb a temporary 15 to 20 percent increase in their workload for short periods of time. We usually schedule one crunch week for each of the early and middle milestones, and two weeks per milestone during the closing stages. Crunch weeks entail working four 10-hour days. The fifth day (individuals choose which day that is) is a normal day. (p. 22)

Despite prioritizing quality of life, Howie argued that scheduled crunch was a workable strategy for producing the best games. Unfortunately, he also admitted that even his company occasionally faced multiple crunch weeks in a row, showing how good crunch can be a slippery slope.
A general acceptance of scheduled crunch emerged across the dataset and can be summed up by the following quote from Game Developer's May 2010 "Tales from the Crunch" article:

Planned crunch can be vital to getting a game polished, or getting in that crucial feature that a small group of developers really believe in. But crunch shouldn't be necessary just to get a game shipped at a basic level of quality, and it shouldn't go without pay. (Sheffield, 2010: 7)

These discourses argued that good crunch could exist within certain confines, but in doing so, they again reinforced crunch's place in development. From there, it is only a short hop to bad crunch territory.

Ends justify the means
Finally, and perhaps most problematically, developers often framed crunch as good if it resulted in a successful product. This occurred not only when crunch was self-directed or scheduled but often even when all other aspects of the crunch period were bad. Thus, while we argue that all discourses of good crunch can serve as a form of cruel optimism, this particular discourse is perhaps the cruelest of all, as it works to justify even the most damaging practices. Furthermore, this narrative resonated across many different types of studios, from AAA to indie, affecting the industry as a whole. For instance, the June 2001 postmortem for Black and White stated,

At Lionhead Studios, we all knew that BLACK & WHITE was going to be something special. This belief became self-fulfilling as we were inspired by each new feature and every neat, innovative section of code. Naturally, this meant that everyone worked exceptionally hard. Over the course of the project the team did the work of a group twice their number. We regularly went home as dawn broke, and weekends became something other people did. (p. 56)

It later added that this hard work and belief in the project made these months of long hours "fun" even though they were "the hardest any of us has ever had to work" (p. 58). Similar stories appear elsewhere, including (but not limited to) the January 2004 postmortem for Jak II, the August 2005 Psychonauts postmortem, the January 2007 Downhill Jam postmortem, and several GDC talks.5 One striking example of how a successful outcome can be used to justify otherwise bad crunch comes from Graeme Devine's 2012 talk about the production of the ParaNorman mobile game. Devine described how the team ended up crunching when they realized very late in production that their game was too short but could not move their release date, as the game was a movie tie-in.
Ordinarily, this externally motivated crunch would be portrayed as bad crunch, and Devine did point out its challenges. Yet, he continued,

Crunching still sucks and it's incredibly expensive, not only in hourly contractor hours, but it's incredibly expensive on your soul, on your families, on everything. So . . . lot we did wrong on this game, but I think the end result, when you look at the game, the game turned out fantastic. I'm not sure that the journey was always fantastic, but it's the end result that counts. (Emphasis added)

The seamless transition from crunch's costs to its justification reveals good crunch's cruel optimism: "the condition of maintaining an attachment to a significantly problematic object" (Berlant, 2011: 24). Despite its promise, good crunch limits developers' quality of life in the same way that bad crunch does. As such, it provides only a vision of improvement, rather than true change.

Significance and conclusion
The analysis we offer here has several limitations. For instance, employing a textual analysis necessarily restricts our conclusions to what we can draw from the pieces at hand, whereas interviews or focus groups would have allowed opportunities to follow up on confusing or unusual moments. We intend to expand this research using these methods in the future. We have also focused on a limited, albeit popular, set of texts (one magazine and a related annual conference), meaning our analysis may not reflect all discourses circulating in game development. This is especially true as, although these sources are written by developers for developers, they are also somewhat public-facing; given the game industry's culture of secrecy (O'Donnell, 2014) and precarity (Bulut, 2020), writers may avoid overtly critiquing their employers or industry. Articles and talks also have to be approved by editorial boards, meaning they must be considered both appropriate and newsworthy. As a result, our work likely misses smaller moments of labor resistance in the industry, such as those discussed by Paolo Ruffino and Jamie Woodcock (Ruffino and Woodcock, 2020; Woodcock, 2016, 2020). Despite this, the higher-level, longitudinal nature of our study, the consistency of the themes we identified, and the industry's ongoing struggles with crunch make the results of this project relevant to many scholars, gameworkers, and labor organizers, especially as unionization efforts take off. By providing an external perspective on the challenges of cruel optimism, this article seeks to break the normalization of current labor patterns with an eye toward more sustainable, equitable development practices in games and elsewhere.

Discourses around good/bad crunch subdivide a set of practices which, from an outside perspective, appear to be very similar and which likely have similar effects.
Overwork, regardless of the impetus behind it, can result in physical or mental health problems, challenges to relationships, burnout, and more. The fact that developers position different forms of crunch as good or bad thus forms a relation of cruel optimism, whereby developers end up striving for something that doesn't actually meaningfully improve their circumstances. As Berlant (2011) states, "All attachments are optimistic. When we talk about an object of desire, we are really talking about a cluster of promises we want someone or something to make to us and make possible for us" (p. 23). When developers strive for "good crunch" over "bad crunch," they are optimistically trying to improve their labor practices, guard their quality of life, and make game development more sustainable. Nevertheless, the idea that one can somehow crunch "correctly" becomes the new object of attachment, and it is a cruel one; focusing on this impedes the process of reimagining development and creating production structures that do not rely on crunch (good or bad) at all.

This is also why crunch is endemic to game development as a whole. Much existing work has focused on crunch as linked to large studio or AAA development (e.g. Cote and Harris, 2021a; Hoffman, 2004; Rockstar Spouse, 2010; Vanderhoef and Curtin, 2015), and some speakers in our dataset suggested quality of life was easier to maintain with smaller teams than with larger ones (e.g. IGDA: Working to Death, GDC 2010). However, other research (e.g. Fisher and Harvey, 2013) suggests that independent development often reifies the norms of large studio development and therefore is not inherently more progressive. In terms of the good/bad crunch divide, our data suggest that indie studios may face even more of a quality of life challenge given that their crunch is less likely to be externally imposed.
Although publishers can still set expectations, indie devs may find themselves mired in crunch that is self-directed, emerging from passion, or for the good of the project. For example, Matt Gilgenbach's 2013 GDC presentation on indie game Retro/Grade stressed how his small team's ambition, attention to detail, and desire to compete with a AAA experience drove them to crunch for nearly 4 years, to the detriment of their health and personal lives. He cautioned other independent developers against similar overreach, but as long as good crunch is seen as acceptable or necessary to a game's success, the practice will continue.
Good crunch discourses are challenging to overcome. This can be seen in how even firm anti-crunch advocates like game journalist Jason Schreier (e.g. Schreier, 2016, 2020b) occasionally fall into the trap of admiring certain forms of crunch as demonstrations of artistic commitment. In his book Blood, Sweat, and Pixels, Schreier writes adoringly of Stardew Valley creator Eric Barone, following stories of Barone's self-imposed 4-year crunch with statements about his success: "Since he'd launched the game, it had grossed close to $21 million. Eric Barone, who was twenty-eight years old and couldn't open the front seat of his car, had over $12 million in his bank account" (Schreier, 2017: 79). Because of how game discourses have normalized good crunch, it becomes a comfortable, reliable object of attachment. This forms "cruel optimism's double bind: even with an image of a better good life available to sustain your optimism, it is awkward and it is threatening to detach from what is already not working" (Berlant, 2011: 263). But that detachment is necessary to solve the many issues resulting from crunch.
We are not arguing that progressive, equitable development is impossible; indeed, it is already occurring in some areas of game development. Advocates like Hank Howie or David Amor largely succeeded in creating studios with high quality of life and low crunch (Cote and Harris, 2021b), and GDC speakers like Coray Seifert and Paul Tozour took strong stands against crunch. We also noted conversations about labor practices and project scheduling ramping up somewhat over time, especially in our GDC sample. This may be a remnant of how videos are chosen for the vault, but it also speaks to a rising awareness of crunch's costs. Growing calls for unionization further support this perceived trend. What we are arguing, then, is that broader changes to the overall system of game production require that developers acknowledge "the broken circuit of reciprocity between [themselves] and the world" and, "refusing to see that cleavage as an end as such," take it "as an opportunity to repair both [themselves] and the world" (Berlant, 2011: 259). Developers need to recognize that good crunch is, at best, only a moderate improvement over bad crunch and, at worst, the same thing in a prettier package. By normalizing the idea of crunch as good if it's self-driven, scheduled, or in pursuit of greatness, developers set this as both their object of desire and the obstacle to their true flourishing.
Overcoming this obstacle is necessary because discourses of good/bad crunch do not stay limited to the industrial sphere alone. To return to the Cyberpunk example from the start of this article, the idea that crunch may be necessary for good game development, or at least that it is acceptable if the resulting game is a success, has spread into game culture more widely, resulting in the backlash Schreier faced for critiquing CDPR's tendency to crunch (Hernandez, 2020). Other popular or journalistic sources, such as Blood, Sweat, and Pixels or the popular documentary Indie Game: The Movie (Swirsky and Pajot, 2012), have also worked to spread this idea throughout the gaming community, becoming part of a self-perpetuating loop of exploitative labor practices. Intervening in developer-led cruel optimism then becomes a first step toward breaking this cycle and imagining game development in more transformative ways.