The Devil’s Advocate: In Defense of Microtransactions

A strong and well-considered argument should, first and foremost, give honest consideration to any opposing standpoints. By understanding opposing views, we can reinforce our own views while arming ourselves with the ability to tackle opposing arguments. That is the purpose of The Devil’s Advocate. In these articles I will take what is considered to be an unpopular position on a contentious issue in the games industry, and argue in favor of it.


The views in this article are not necessarily my own personal views (particularly on the issue of microtransactions), or the views of Exclusively Games. Rather, this article is an attempt to argue in favor of the counterpoint – a common technique for developing debating skills.

Now, on to the issue at hand:


Microtransactions are exactly what they sound like – small purchases. Historically, gamers would simply pay one price and receive a full game. Today, it is common for that initial purchase to be followed by dozens of smaller payments for additional features (such as missions, cosmetic items, etc.). Microtransactions are quite commonly the primary source of income for a game, and some developers have abandoned the initial purchase price entirely. This “freemium” model can be found in many multiplayer games like Fortnite, World of Tanks, and League of Legends.

Microtransactions are derided by a large proportion of gamers, particularly those deemed “pay-2-win” – purchases that give a tangible in-game advantage to players who can afford to spend the money. These gamers mourn the days when you could pay a single fee and that was the end of it. What they overlook is the huge impact that microtransactions have had on growing the games industry – an impact that is ultimately positive for the industry as a whole.

Benefits to the Industry

If you’re like me and lived and gamed through the Dotcom Bubble of the late 90s, you will have seen not only some of the finest games ever made (particularly for the PC), but also the endless casualties – developers who couldn’t “make it” collapsed or were gobbled up by publishing powerhouses like EA and Activision. The idea of an “indie” developer barely existed, and those indies that did exist would have struggled to put food on the table.

The emergence of DLC – essentially miniature expansions – showed developers that there was a content threshold for players: they were willing to outlay small amounts of money for small amounts of content. Under the expansion model, a small portion of gamers might outlay the $30 or so that an expansion would usually cost. But if that expansion was divided into three $10 portions, developers could still capture the gamers willing to spend $30, while also making a little extra from those who only wanted to spend $10 or $20. Fast forward to today, and games are more fragmented than ever. $2 for a skin. $1 for a decal. $5 for a new level.

Opponents of microtransactions are constantly crying out in protest, but one need only look at the state of the industry to see how much microtransactions have leveled the playing field. Employment opportunities in the games industry are numerous, consumers have more choice in games than ever, and small developers are able to fund operations on a scale that AAA developers could only dream of a couple of decades ago.

The reason is that microtransactions give developers a steady, reliable cash flow that they can constantly tweak, rather than an initial bulk income that rapidly tapers off. For freemium games, with their lower barrier to entry, there are significantly more potential customers. For games that retain an initial cost, microtransactions are also a way to keep earning income from gamers who have already made that outlay.

Benefits to Gamers

The consequence of a vibrant industry with steady and reliable income across the entire spectrum from indie to AAA is that gamers have choice. We are no longer limited to two or three mainstream shooters, like Quake III: Arena or Unreal Tournament. Today, there are hundreds of choices, many of them popular. There are also opportunities to experience games with AAA polish for little to no cost; the fact that games with exceptional production value like Destiny 2 are free (with some minor caveats) is utterly amazing.

Those who bemoan the death of the traditional distribution model are crying over spilt milk – those games still exist and are still successful. The Witcher 3: Wild Hunt was immensely successful, and took a late-90s approach to its financial model. Alongside it, there are dozens of other financial models that incorporate microtransactions.

Consumer Choice

What all this ultimately boils down to is one of the key tenets of modern capitalism – consumer choice. Basically: if you don’t like it, don’t pay for it. The only reason that developers use microtransactions is because they work. Sure, some developers (such as EA) have been somewhat brazen with their implementation, but this has ultimately backfired thanks to consumer backlash and people “voting with their wallets”. Microtransactions have been and continue to be successful because, broadly speaking, most gamers are okay with them.

Despite the moral panic about microtransactions, they have ultimately been beneficial for the industry and for gamers. If they can continue to support and develop the industry I love, then I’m okay with that.

In about 1989, Gavin Annand played his first games on a Sinclair ZX Spectrum. Thus began a lifetime obsession with games. A gaming addict or connoisseur, depending on your perspective.