Job-readiness is a lie

On Friday, the federal government announced it was going to drastically change how it funds tertiary education. Effectively, they want to raise the price of humanities and arts subject areas to redirect students into more “job-relevant” sectors. There are many reasons this is very bad. First and fundamentally, there’s the ethical and democratic issue of pricing people out of tertiary education. This move means individuals will now pay practically half of their tertiary education out of their own pocket (via HECS), a crucial tipping point in consecutive governments’ slow destruction of free, or at least affordable, tertiary education—a basic feature of any functioning democracy. Beyond this fundamental attack on a bedrock of our society, it’s also bad policy for more straightforward economic and job-creation reasons. It won’t create the increased skills in the desired areas anyway (a grade 12 student doing English and History isn’t suddenly going to enrol in Medicine because it’s more comparably priced); the identified job areas, such as agriculture, don’t actually align with the skill areas the government itself has identified as lacking (which are themselves humanities areas); and humanities students are crucial for subsidising the far more expensive science, engineering, and medical degrees, so reducing enrolments in the humanities will negatively impact the very disciplines the government claims to be supporting. The whole thing is a mess, frankly.

Other people will write smarter things about the economic failings of this plan, but here I want to focus on the government’s framing of ‘job-relevant’ educations versus, implicitly, ‘job-irrelevant’ ones (such as, supposedly, the education minister’s own Arts degree). This framing belongs to a broader rhetoric, espoused by governments and repeated by students, parents, media, and university management and marketing alike, that universities must increasingly focus on producing ‘job ready’ graduates with ‘job ready’ skills. Not that artsy fartsy theory and history and critical stuff, but the hard skills you actually need in the workforce. (The skills that, historically, companies were responsible for investing in and teaching to their graduate hires, but which companies have now convinced universities are their responsibility to provide, so that graduates arrive as perfectly formed workers able to slip into company-specific pipelines.)

Here’s the short version of this post: Job-readiness is a lie that only works to produce graduates less capable of dealing with the world they find themselves in, less well-rounded as human beings, less able to think on their feet, and less employable.


Are games art school? How to teach game development when there are no jobs


At the 2019 Game Developers Conference I gave a talk at the Education Summit called “Are games art school? How to teach game development when there are no jobs”. The video of the talk is available on the GDC Vault, but unfortunately you need a subscription to access it. So instead, here is a write-up of what I talked about.


There’s not enough videogames; everyone should be encouraged to make them (or, videogames are just art)


Indiepocalypse discourse is back in vogue again. Polygon published this article about how there are too many videogames; Games Industry published this editorial about how we need to stop encouraging people to go indie. My response, on Twitter, was:

There aren’t too many indie developers. There are just too many indie developers who don’t realise that being an indie developer is like starting a band. It’s a thing you do and get value out of and, if you’re incredibly lucky, might even make you some money one day maybe.

This post is essentially just a long-winded expansion of that tweet.
