There’s not enough videogames; everyone should be encouraged to make them (or, videogames are just art)


Indiepocalypse discourse is back in vogue again. Polygon published this article about how there’s too many videogames; Games Industry published this editorial about how we need to stop encouraging people to go indie. My response, on Twitter, was:

There aren’t too many indie developers. There are just too many indie developers who don’t realise that being an indie developer is like starting a band. It’s a thing you do and get value out of and, if you’re incredibly lucky, might even make you some money one day maybe.

This post is essentially just a long-winded expansion of that tweet.

 

1. Making videogames is not fundamentally an economic activity.

The problem with much of the Indiepocalypse discourse is that it perpetuates the misunderstanding that ‘making videogames’ is first and foremost an economic activity. Of course, for a whole lot of people, making videogames is an economic activity; it’s their job! The thing is, you can make videogames without it being your job. Making videogames probably shouldn’t be your job until you are already a successful videogame maker. But that’s not how people traditionally think about the practice of videogame making. It’s perceived by a whole heap of aspirational developers, students, and recent graduates (never mind general players) as an activity you have to be sustaining yourself from in order to count as successful.

It’s interesting to think about just how that happened, because I don’t think there’s any other creative practice considered in the same way. Imagine you decide you want to start an indie band with your friends. Unless you’re incredibly rich, you don’t quit your day job and start working on your first album. Actually, a better analogy might be quitting your day job to start learning guitar. Or say you decide you want to be a novelist. You don’t quit your day job and start work on your first novel. Musicians, poets, authors, and other artists generally don’t begin their creative pursuits expecting to make a living off them from day one. Why should game makers be any different? (There’s a whole tricky side discussion to have about the exploitation of ‘creativity’ and ‘passion’ here which I will get to below, promise).

Whenever I put it like that to people, they’re generally convinced. Like, yeah, that makes sense. Why would creating videogames be any different to any other creative pursuit? But why do so many aspirational videogame makers still rush headlong into it, crunching themselves and their savings for a first project? As Zach Gage notes in a good Twitter thread, I too have seen aspirants put time and money into getting a group together with no clear idea of what game they actually want to make yet. It’s this sort of behaviour that best demonstrates the problem: people start with ‘creating a business’ rather than ‘wanting to create something’. They start with making videogames as an economic practice, not a creative one.

 

2. Videogames are not fundamentally commercial entertainment software products.

I’d venture the reason this happens is the same reason videogames struggled for so long to be taken seriously ‘as art’. Videogames are not popularly perceived as art, and thus making videogames is not popularly perceived as creative practice. This is not just a public ignorance issue, but an image that has been actively cultivated by the game industry over a number of decades: videogames as first-and-foremost commercial entertainment software products. You can trace this back to Nintendo’s anticompetitive actions in America around the NES, and the homogeneous and hegemonic structure of the industry and the consumerist gamer identity that subsequently solidified throughout the 1990s and early 2000s. While the actual reasons for the US videogame crash were far more complex than the popular narrative of ‘too many poor quality videogames’, ‘too many poor quality videogames’ was the popular narrative nonetheless, and Nintendo put in a significant effort to assure parents that, essentially, amateurs and hobbyists wouldn’t have access to the NES. The emergence of indie videogames in the mid 2000s, once high-speed internet made it more feasible to circumvent the major publishers, was in fact a re-emergence of a broader spectrum of videogame creative practices that had been actively suppressed and obscured for the previous 20 years. Triple-A was an anomaly.

But the damage had been done. Following the NES and the 90s, a very specific, deliberately cultivated, consumerist imagination of videogame development and aesthetics had taken hold as if it was just the natural way videogames should function. The vast majority of videogame controversies of the last decade can be traced back to just how ingrained this imagining of videogames was (and still is). The dismissal of walking sims as having ‘no gameplay’; Gamergate’s general misogyny, its inability to grasp the value of a videogame made in Twine, and its inability to comprehend social relationships between critics and creators; the general way that a videogame being of poor quality gets reframed as a consumer rights issue as opposed to just, like, bad art; and most recently, the absurdity that was puddlegate. More mundanely, look at the comments section of any itch.io game that penetrates broader gamer circles. Free games will often have comments like “This is good, but it’s a bit short”, as if a small free game chucked on itch.io should be measured by the same economic ‘content’ values as a $60 commercial title. At all levels, videogames are dominantly imagined by our culture as commercial entertainment software.

And, again, a lot of videogames are commercial entertainment software! And that’s okay! But that’s not what videogames fundamentally are. Videogames are just art. The ‘just’ is important. This isn’t a pretentious or insecure claim for credibility, but a claim for banality. Videogames are just another way you can use tools to express yourself creatively. And, just like the other ways you can express yourself creatively, you might even be able to monetise your videogame creativity. But that’s not the natural state of things! It never was. The formal videogame industry just did a very good job for a good 20 years of convincing the world that the only way to make a videogame was by going through them (Anna Anthropy made all these arguments in 2012 in Rise of the Videogame Zinesters, btw).

 

3. Videogames aren’t software

Now we know better. Sort of. We know we can just grab Unity, make something, and chuck it on itch or, if we really want to, Steam or the App Store. But so many of the people who have figured that out are still in a ‘commercial entertainment software product’ mindset. They don’t consider themselves artists just getting started in a creative field. They consider themselves tech startups getting together to make some new software that happens to be a game. Next time a ‘tweet your unpopular videogame opinion’ meme goes around, I’ve already got mine ready: videogames aren’t software. They use software; they aren’t software. To be a bit dualistic and reductive, software is developed to solve an existing problem that can readily be identified as worth solving. Like, say, all payroll systems are terrible so I’m going to develop a better payroll system. I can measure whether or not it works once I make it. A whole generation of videogame critics (hi) have already pointed out the absurdity of asking “does Uncharted 2 work?” Like sure, does it lag or quit to the menu or whatever. That’s important in the same way it’s important that the painting canvas doesn’t fall off the wall or the guitar is in tune. But you can’t measure whether the experience of Uncharted 2 is ‘successful’ in any quantifiable way. Videogames aren’t software. Videogames are just art that uses software.

 

4. Videogame development education is not (usually) a pathway to industry

So the formalised industry is in large part responsible for this, but the role of educational institutions also needs to be addressed. They’re mentioned explicitly in both the Polygon and Games Industry articles as the main beneficiaries of so many hopeful indie devs chasing their economically-biased dreams, and I think that is a fair accusation. I wrote earlier this year about the problems with how videogame development programs are marketed and how many students enter their degrees without a literacy of what videogame development actually is. Again the problem comes down to the same broad issue perpetuated through videogame development cultures: that videogame making is first and foremost an economic activity. So in many programs students are taught not how to create but how to contribute to the development of a product within an industry; they’re taught a range of skills throughout the first two years and then spend the entire third year making a videogame. Imagine if a creative writing student didn’t write a story until their final year! (There are exceptions, of course.)

Game development needs to be taught as creative process, not as ‘an industry’. From that blog post:

Students need to be taught how to approach videogame development as a creative process. There’s some hard skills in particular software and programming languages, sure, but it’s also about how to be creative; how to form a community; how to think critically; how to research, develop, and communicate ideas; how to assess art and culture; how to start just making stuff already and not wait til you are good enough or ’employed’. How ‘getting employed’ is only one of the many ways you might use these skills and, even then, you’re very unlikely to get rich with these skills. Students need to be aware that they’ve signed up to become artists, essentially, with all that that entails.

(This is only arguably not true in the contexts where large studios exist to suck up graduates, which is a very small minority of places where games development is taught.) Whether or not educational institutions are ‘responsible’ for this perceived problem is a tricky one to unpack (and caught up with the games industry’s long-held skepticism towards academia), but it’s definitely true that educational institutions are the best place where these ingrained ways of approaching making videogames could be fruitfully challenged.

 

5. Creating videogames is creative labour

I’m sure examples can be found, but the idea of someone claiming there’s ‘too many songs’ or ‘too many poems’ sounds absurd to me. Videogames are just art.

But there is a complex other side to this, and that’s the labour issue. The creative industries generally and the game industry specifically have long exploited people’s ‘passion’ for making games and their desire to do a fulfilling, ‘creative’ job to lower pay, encourage crunch, and avoid paying overtime. ‘You should be thankful you have this job’, developers are told if they dare complain about labour conditions. This is a side of the discussion (and of my research on gamemakers) that I’m still trying to navigate my way through, and I don’t have any clear answers for it. How do you, on one hand, critique crunch culture and unpaid overtime while, on the other hand, advocate for aspiring gamemakers to make games around a day job for no pay? I’m not entirely sure yet!

Something I used to tell myself when I was a freelance writer: don’t work for free, but don’t only work for money. There was a tightrope I wanted to walk between respecting my own labour and not reducing the value of my own labour to purely economic terms. I never tried to be a full-time freelance writer because that would’ve killed me. It would’ve been disrespectful to my own labour to put in the hours necessary to make that viable. It was much more respectful to myself to have work in academia that paid a living amount, and then write when I wanted to write on the side, either for my own blog or paid for others.

I think that’s where aspiring indie developers also need to situate their own labour if they have to live under late capitalism. The effort, time, and money required to go full-time into indie development immediately, given the returns that typically delivers, is self-exploitation. Creating games when you can around a day job is less self-exploitative—so long as you create games of a scale that fit around your day job.

I regularly think back to something one of my interview participants said earlier this year. This was someone currently earning a salary as an indie developer who, unlike the majority of developers I interviewed, was ultimately unfazed about the long-term sustainability of that role. They said to me:

If you’re doing any other creative field the baseline is that the thing you make won’t make you money. There’s no delusion in any other creative field. ‘Making money’ is the amazing thing you aspire to, potentially, after making work for a long time.

I found that idea of ‘delusion’ interesting. This developer was content to one day have to go back to a day job to support their craft because that’s just what being an artist is like. Maybe it shouldn’t be! But it is! They weren’t prepared to grind themself to the bone to make games work as a full-time salary. They were putting in an effort to make it work, of course, but that effort was of a different tone than those I’ve spoken to who are desperate to ‘make it’ as a game developer. It didn’t feel like this developer was tricking themself or self-exploiting, but just finding a way to be honest and content about the fact that being an indie videogame developer essentially just means you’re an artist, which under late capitalism means sometimes you make money and sometimes you don’t, and your life needs to be structured to accept that.

So there are two different discussions there that overlap but also need to be distinguished: 1) the troubles and difficulties of being an aspiring artist under late capitalism; and 2) the need for aspiring indie developers to understand that they are aspiring artists living under late capitalism.

 

6. More people curating more people making more videogames.

It’s always struck me as odd to see established indie developers—themselves empowered by shifts that made it easier to develop and distribute videogames—complain about how it has become too easy for others to develop and distribute videogames. It’s understandable, of course, and I’m sympathetic to an extent towards people who are themselves still somewhat precarious seeing the small advantages they have begin to slip away. But I think it’s incredibly important the discourse points the accusatory finger in the right direction. More people making more videogames can only ever be a good thing for the health of the medium. Unity, Twine, Bitsy, Flickgame, itch.io, and other platforms have afforded massive amounts of experimentation and (for lack of a better word) innovation in a short period of time. New, exciting creators have emerged in new scenes that never would’ve even considered creating videogames six or seven years ago. Of course, a whole lot more creators are also making absolute trash. Good! That’s a crucial aspect of any healthy creative medium.

If a finger is to be pointed, it’s to be pointed at the corporately-owned platforms like Valve and Apple that monopolise digital distribution while doing very little for discoverability. It’s a major issue that an iPhone game will either make or break based on whether some guy at Apple decides it’s worth giving a daily feature or not. We need more curators, which is something I believe Bennett Foddy has called for since starting his own game recommendation blog. More curators recommending more games on more platforms not owned by Valve and Apple is what I would like to see. That is, of course, easier said than done.

Ultimately, the indiepocalypse discourse always comes down to the same disconnect: it’s a discourse on the economics of indie game development that exists, ironically, because too many aspirational indie developers approach their early practice as primarily economic. Calls are made for fewer developers or stricter curation from platform holders, and all the developers creating and sharing for reasons other than economic ones (amateurs, hobbyists, etc.) become a problem to be solved rather than a vital aspect of the culture to be celebrated.

There aren’t too many musicians. There aren’t too many poets. There aren’t too many videogame makers. Videogames are just art, and we need to figure out ways to ensure more aspirational developers know that—like, really know that—before they start chasing a dream.
