Are Video Games Really Underpriced?

December 7, 2017

By Tim Shea

Special Contributor

Following the November 17th launch of EA’s much-anticipated Star Wars Battlefront II, the title turned into a nightmare for the gaming giant as customers revolted en masse against the embedded microtransaction structure. Such was the extent of the backlash that EA ultimately caved to the pressure, ditching the reviled loot boxes and leaving the company in a bind as to how it will meet revenue projections. Gaming gods be praised.

The controversy didn’t end there, however. Last Wednesday, CNBC published an article summarizing an analysis of the matter by KeyBanc Capital Markets analyst Evan Wingren. His take? Video games are underpriced, and gamers overreacted. The results were…predictable. But, the joys of internet drama aside, the article does raise an interesting question. Namely, is Mr. Wingren right? Should video game makers raise their prices?

On the surface, his analysis is sound. Taken on an hour-by-hour basis, video games are indeed a steal compared to other forms of visual entertainment. The relative price differentials vary from medium to medium; an hour of television entertainment, for example, costs roughly 50% more than an hour of gaming, while an hour at the movies costs a whopping 750% more! This much is indisputable. Beyond that, though…
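To see where figures like those come from, here is a quick back-of-the-envelope sketch. The inputs below (a $60 title played for 100 hours, a $90 monthly cable bill spread over 100 hours of viewing, a $10.20 ticket for a two-hour movie) are my own illustrative assumptions rather than KeyBanc’s actual numbers, but they land in the same neighborhood.

```python
# Rough per-hour cost comparison. All of these inputs are illustrative
# assumptions, not KeyBanc's actual figures.

game_cost, game_hours = 60.00, 100      # $60 title, assumed 100 hours of play
cable_cost, cable_hours = 90.00, 100    # assumed monthly cable bill and viewing hours
ticket_cost, movie_hours = 10.20, 2     # assumed ticket price, two-hour feature

game_rate = game_cost / game_hours      # $0.60 per hour
tv_rate = cable_cost / cable_hours      # $0.90 per hour
movie_rate = ticket_cost / movie_hours  # $5.10 per hour

print(f"Games:  ${game_rate:.2f}/hour")
print(f"TV:     ${tv_rate:.2f}/hour ({tv_rate / game_rate - 1:.0%} more than games)")
print(f"Movies: ${movie_rate:.2f}/hour ({movie_rate / game_rate - 1:.0%} more than games)")
```

Change the assumptions and the exact percentages move, but the ordering (games cheapest per hour, movies far and away the most expensive) is hard to shake.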

Beyond that, Mr. Wingren’s analysis shows a shocking lack of insight and nuance, especially coming from a Wall Street professional. The fact that video games cost less on a per-hour basis simply does not translate into the assertion that prices should be higher. First, that leap relies on the assumption that consumers derive the same amount of satisfaction from the various forms of entertainment. Given that the cost of a movie ticket grants me access to a climate-controlled theater with leather recliners and a deluxe sound system, while my Netflix subscription grants me access to my worn corduroy couch and a needy dog, yeah, that assumption is questionable at best.

Mr. Wingren also all but waves away the well-established law of diminishing marginal utility. That is, he implies that the 100th hour of videogame play imparts the same level of satisfaction as the first hour of videogame play (or movie watching). Even if it were the case that a night at the movie theater and a new videogame produced equivalent levels of consumer utility, by the time that 100th hour of gameplay is reached, the per-hour utility of the videogame will almost certainly have plummeted. This of course reduces the average per-hour utility derived, and thus the average per-hour price consumers are willing to pay.
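To put rough numbers on that, here is a toy model with made-up inputs: suppose the first hour of a game is worth $6 to a player and every subsequent hour is worth 5% less than the one before it.

```python
# Toy illustration of diminishing marginal utility -- purely assumed inputs,
# not an empirical model of how gamers actually value their time.

first_hour_value = 6.00   # assumed dollar value of the first hour of play
decay = 0.95              # each hour assumed to be worth 5% less than the last
hours = 100

hourly_values = [first_hour_value * decay**h for h in range(hours)]

print(f"Value of hour 1:   ${hourly_values[0]:.2f}")
print(f"Value of hour 100: ${hourly_values[-1]:.2f}")
print(f"Average per hour:  ${sum(hourly_values) / hours:.2f}")
```

Under those assumed numbers, the 100th hour is worth about four cents, and the average across all 100 hours works out to roughly $1.19, a fraction of what the first hour was worth. A per-hour price averaged over total playtime will always look low for exactly this reason.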

The list of possible critiques goes on and on: that he ignores the relative per-hour production costs of games versus movies, that he fails to account for varying tastes and audience sizes, or that, under his simplistic price-per-hour model, hardcover publishing probably wouldn’t exist. But all of those pale in comparison to his original sin: having the hubris to think he knows better than the consumer about what they should be willing to pay. It doesn’t matter how much modeling or research goes into a pricing decision. Ultimately, the consumer has the final vote. Sometimes businesses get the pricing right, and they succeed. Other times, often for mysterious reasons, consumers don’t bite and a business fails. It happens all the time. It’s called a market.

In closing, I should note that I’m not explicitly disagreeing with Mr. Wingren’s assessment so much as with the foundation on which he built it. It’s entirely possible that game developers could get away with charging more. For certain developers, like The Witcher franchise’s CD Projekt Red, it even seems likely. What does seem obvious, however, is that while consumers might be willing to pay more, they absolutely put a huge premium on pricing transparency. They don’t like getting ambushed by hidden or unexpected costs.

There’s a powerful lesson in there for both businesses and governments alike.