With every passing AAA release, video games grow increasingly complex. From design classics like Pong (1972) and Tetris (1984), through graphics pioneers like Donkey Kong Country (1994), all the way to modern big-budget epics like the physics- and systems-driven The Legend of Zelda: Breath of the Wild (2017) and Grand Theft Auto V (2013), there's an undeniable trend towards increasing complexity.
It's a shift which has changed not only the way game developers operate, but also what we expect from a video game, both at launch and in post-release content.
One thing which hasn't changed, however, is the price tag: video games still cost roughly $40 to $60 in stores and online, prices which have remained relatively constant, even before adjusting for inflation, for over 20 years. And yet, as the cost of 'AAA' video game development climbs year on year, isn't it about time the average major release cost $80?
In this article, we’re going to dive into what’s driving the cost of video game development and why charging more for games might actually save you money.
What’s Making Video Games So Expensive?
There are a number of factors which explain why the cost of video game development has gone up year on year, but as Lottoland points out in its history of video games, developer ambition has always gone hand in hand with the available technology.
Invariably, developers have gotten their hands on faster, more powerful consumer technology and, seeking a selling point for their software, have pushed it to the very limit. Doing so in 2000 required plenty of effort, of course, but it could be done with a relatively modest team. A single artist at the time could create a character from scratch, modelling, animating and texturing them for a game. Today? A single character will be worked on by upwards of a dozen people.
It's not just characters, though: dozens of gameplay and technology innovations now mean that developers must work on everything from physics engines to anti-aliasing solutions, and even VR support for the latest generation of headsets.
As a result, studios must employ more artists and coders to meet demand, significantly increasing both development time and cost.
The Case for Higher Prices
Now, I know what you're thinking: why pay more for games when you're getting all these advanced features for the same amount of cash you were paying a decade or two ago? The simple answer is that you're already paying much, much more.
Although a boxed game might cost the same amount, its feature set is often dramatically limited for those who don't buy the expansion pass, spend money on in-game currency and so on. In reality, you're getting significantly less for your money than you used to: developers are clawing back their increased expenditure by locking away features, characters and, often, whole sections of the narrative.
So-called 'loot boxes' have caused controversy around the world over the last year, and gamers are growing increasingly discontented with AAA developers who adopt a pay-to-win strategy in order to recoup the money they spent during development.
Of course, there's no guarantee that raising the price of video games would curb this behavior; after all, developers can make a lot of money that way. But it would make those tactics less essential than they currently are, potentially encouraging developers to focus on the experience of the gamer rather than on the need to recoup a vast budget, and that's something we can all get behind.
So, would you pay more for a video game if it meant getting the full experience out of the box?