Every year or two I write a major prediction paper. They may appear speculative, but because they are based on mathematical methods with very high efficacy, I'm not jeopardizing my record of 100% predictive accuracy by doing so. What I do is track high-altitude systems and search for trends that may cause feedback loops: one trend rapidly forcing another in one direction, and vice versa. The result is that the trending numbers move rapidly towards either zero or infinity. These sorts of feedback loops are common causes of disease in human physiology. When they occur in a high-level economic system, I call them a Systemic Death Spiral. This usually precipitates some sort of crash.
Previous examples I've published were about Zynga more than six months prior to their record-breakingly bad IPO, the performance of the real money auction house in Diablo 3 six months before I got to see the game, the Great Recession (regrettably published after the fact; I learned my lesson...), and my 2010 predictions of how augmented reality would be developed and used to transform education and industry by 2020 and 2030. In the last case, the technologies I predicted for 2020 are already in development, so the first part of the prediction is secure. We will have to wait a bit to see how 2030 turns out.
Since at least 2014 I have been alluding to the “coming F2P mobile extinction event” in various comments I have made here on Gamasutra. But the trends were still reversible, and I was hoping the indicators were obvious enough that industry would act independently to avoid a death spiral. The problem is that industry has misinterpreted the indicators and has actually accelerated the very actions that would trigger a death spiral.
The largest companies in our industry are generally the slowest to adapt to changes in technological and consumer trends. “The larger the ship, the harder it is to turn it” as multiple industry executives have told me. So in this way the largest companies in our industry act as lagging indicators. Normally we care more about leading indicators to tell us what will happen next. Lagging indicators are useful for telling us what won't be happening next.
In this case, since the largest companies have rolled up to the cliff and decided to drive forward instead of back, what won't be happening is us dodging this bullet.
The following “X Trend” is actually a composite of at least two feedback loops, so in this case “X” is not only racing to zero, but will just keep going and turn negative. It already has for over 95% of the industry. The feedback loops described in “The Ever Shrinking Pie” and “The Data Implosion” sections are feeding back to each other, causing a very rapid death spiral. There are also links to the general economy, because the interactive media economy is linked to the general economy. So those two systems are also feeding back to each other and can accelerate corrections in both systems. I am not going to discuss crossovers to the general economy as I don't want to politicize this predictive paper.
The X Trend
Using the F2P business model, your success or failure is determined by whether “X” in the following formula is positive or negative:
Lifetime Value (revenue per user) – User Acquisition Cost (per user) = X
If “X” is negative, and stays there, then ultimately the result is that this product is a commercial failure. Without a commercial success to balance that out, the result in time is company failure and layoffs. The trend since 2014 has been for fewer and fewer products to result in a positive “X”. This trend is not random. It is not fluctuating. The number of “positive X's” is mathematically approaching zero with certainty. Thus if you are operating under the current F2P business model paradigm, it may be random when you lose your job, but whether you are going to lose your job is not random. It is certain.
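As a minimal sketch, the X formula can be expressed directly in code. All dollar figures here are invented for illustration, not real market data:

```python
# Illustration of the X formula: X = LTV - UAC, per user.
# All dollar figures are invented for the example.

def compute_x(ltv_per_user: float, uac_per_user: float) -> float:
    """Return X, the per-user margin under the F2P model."""
    return ltv_per_user - uac_per_user

# A product whose LTV holds steady while acquisition costs climb:
ltv = 5.00
for year, uac in [(2014, 2.50), (2016, 4.00), (2018, 5.50)]:
    x = compute_x(ltv, uac)
    status = "viable" if x > 0 else "commercial failure in the making"
    print(f"{year}: X = {x:+.2f} ({status})")
```

With LTV flat at $5.00 and acquisition costs rising, X flips negative somewhere between 2016 and 2018 in this toy example, which is the pattern the rest of this paper argues is becoming industry-wide.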
Part of the randomness as far as “when” you lose your job involves how long your company leaders can make it look like you are still solvent when you are not. This has become a real art form. So risks for investors have skyrocketed along with the “X” trend, which over time will cause investors to get increasingly cautious. That in turn means fewer extensions and an acceleration of the entire process leading to layoffs.
While the X Trend was certainly visible to key people in 2014, and of course kept secret, I think a good number of people understand it at this point. What they may not know is what is causing it, and thus they are hoping it is transitory. It is not. It is the result of a systemic flaw in the way we have deployed F2P, an inescapable anomaly that reminds me of the key conflict in The Matrix story.
The problem I'm presenting here is multidimensional, which makes it complex and less than obvious. To simplify it, I'm going to break it down into two parts, but neither part will make sense without the other part. So you kind of have to read both parts simultaneously in whatever way works for you. Maybe read the first part, then the second part, and then the first part, until the concepts merge into one?
Dynamic Marketing Costs or “The Ever Shrinking Pie”
User acquisition costs have been skyrocketing. As they go up, they push X down. A product that might have been profitable last year, may be a loser this year even without a loss of LTV. Even app store titans in the top 10 are seeing X go down progressively every month. They can do whatever they want to stop this decline, but without knowing the cause of the decline they are doomed inexorably to fail. Of course for those not in the top 10, the process will happen much faster.
Working with a relatively small app producer, I've seen user acquisition costs around $7 per install. But it fluctuates. When I worked for large companies, the cost per install was lower due to market power effects (a bigger customer gets a better rate). But these numbers are not important. Stock values (one-time snapshots of a metric like this) are almost always misleading. They make good PowerPoint slides. What is important is the trend. The trend is for these costs to push continually higher.
Now it might be tempting to say that the cause is that platform holders (Apple, Google, etc) have a monopoly and can charge whatever they want. Maybe they want to squeeze developers to the point of extinction, and don't care if they fail along with them. Again, I think this is an oversimplification that hides what is really going on, which is actually quite a bit more troubling.
Certainly by now the Big Boys are using machine learning and pre-AI systems to get the most they can out of each install sale. Is the cost going up because game devs have more to spend? No. Is the cost going up because players have more to spend? No (see next section).
They are going up because the perceived value of our products is going down with consumers (explained in detail in next section). The lower the value of something, even if free, the less likely a consumer will choose it. So the platform holder is having to show your product to a LOT more consumers before one chooses to install. There is a finite number of things that can be shown to a consumer. The more of that total consumer exposure pie you are eating up with your product, the less can be shown for other products.
So while the total exposure pie is not changing in size, the reluctance of consumers to hit “install” makes the pie effectively much smaller over time. Because that same sized pie is generating fewer installs over time, the platform is selling fewer and fewer installs over time. To maintain their business model, they have to charge more for each of those installs, and that is why your cost per install is skyrocketing.
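To make the arithmetic concrete, here is a toy model with invented numbers: a fixed impression “pie”, a falling install conversion rate, and a platform that holds its total ad revenue constant.

```python
# Toy model of the shrinking pie: impressions are fixed, conversion
# falls, and the platform keeps its revenue target constant, so the
# price of each install must rise. All numbers are invented.

IMPRESSIONS = 1_000_000     # total consumer exposure "pie"
REVENUE_TARGET = 100_000    # dollars the platform expects per period

for year, conversion in [(2014, 0.020), (2016, 0.010), (2018, 0.005)]:
    installs = IMPRESSIONS * conversion
    cost_per_install = REVENUE_TARGET / installs
    print(f"{year}: {installs:,.0f} installs at ${cost_per_install:.2f} each")
```

Each halving of the conversion rate doubles the cost per install, even though the pie itself never changed size.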
Dynamic Microtransaction Pricing or “The Data Implosion”
Ever since 2011, when Zynga incorrectly attributed their success to data analysis prior to their IPO instead of disclosing the impending failure of their business model, and dovetailed off the popularity of the movie “Moneyball” (which was also misrepresented by a host of players), there has been a rapid allocation of finite development resources towards business intelligence and data analysis in game development studios. If these new methodologies were generating revenues in excess of their costs, it could have ushered in a golden age of game development. That was the promise at least, but the trend is clearly moving in the opposite direction.
What's going on is really quite complex but I am going to put it into the public space for the first time here. The biggest advantage of the F2P business model over the retail model is that instead of drawing a straight line through customer budgets with a single price point, you can use variable price points and discriminatory pricing.
With retail, if I charge $20 for my game, I fail to get as a customer everyone that values my game at less than $20. This could be because they can't afford it, or are just not sufficiently impressed with it, or there is a better product competing with my product. On the other side, there may be customers that value my product at $500 but only have to pay me $20.
In this case, advanced mathematical processes can help determine where you draw that line in an effort to collect the maximum possible revenue. But in any case, using the old retail model you will fail to capture the full potential spending of all consumers above and below your price point.
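A small simulation, using invented customer valuations, shows how much of the potential spend a single retail price point leaves on the table:

```python
# Invented valuations: the most each of eight customers would pay.
valuations = [5, 10, 15, 20, 22, 25, 30, 80]

def retail_revenue(price: float, valuations: list) -> float:
    """Single-price retail: only customers who value the game at or
    above the price buy it, and each pays exactly the price."""
    return sum(price for v in valuations if v >= price)

# Perfect price discrimination would capture every valuation in full.
msp = sum(valuations)

# The revenue-maximizing single price, searched over the valuations.
best_price = max(valuations, key=lambda p: retail_revenue(p, valuations))

print(f"Max Spend Pool: ${msp}")
print(f"Best single price: ${best_price}, "
      f"capturing ${retail_revenue(best_price, valuations)}")
```

Even the optimal $20 price captures only $100 of the $207 pool in this example: the customer who would have paid $80 pays just $20, and everyone below $20 pays nothing, which is exactly the retail limitation described above.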
So let's say that each individual customer has their own budget they are willing to allocate to games. This might vary from game to game. They will probably spend more on this year's hottest Blizzard game than they will on some mobile game. For any specific game, there will be a specific aggregate potential spending curve and spend pool/budget. In other words, the most you could get your players to spend on your game if you could gather the most possible money from each individual customer. Let's call that Max Spend Pool or MSP.
Many factors can affect MSP, but the most important thing is that it is a representation of total life time value delivered to your customers as a group. Maximizing MSP is an even more complex topic that I will discuss in extreme detail in my upcoming book on scientific methods of game design. But for now let's assume you have a product with an MSP that you don't know how to change.
You can make a small change to see if your short term metrics improve, something often described as A/B testing. This is generally ineffective because, due to the group nature of spending, you could make a change that causes a 5% improvement in a short term metric but then results in a 10% drop in a longer term (30+ day) metric, and you won't know if that was due to your change or some external effect like market competition. Clearly if you want to impress your boss and keep your job, you will attribute the 5% increase to your efforts, and the 10% loss to the efforts of someone else.
What is reliably effective is that you can use data science to adjust prices up or down based on a player's spending behavior. If they start to spend, you can start offering them more expensive options to spend on. If they are very stubborn about not spending, you can offer them some very inexpensive options until you find their maximum price point and finally start to give up the dollars. You can also create situations in the game where there is a scarcity of a particular item, and then offer them that item they are short on in their time of need.
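A minimal sketch of this kind of price laddering might look like the following. The tiers and step rules are invented for illustration; a real system would use learned models rather than fixed rules.

```python
# Hypothetical price ladder; tiers and thresholds are invented.
OFFER_TIERS = [0.99, 4.99, 9.99, 19.99, 49.99, 99.99]

def next_offer(total_spent: float, declined_offers: int) -> float:
    """Pick the next price point to show a player.

    Spenders get walked up the ladder; players who keep declining
    get walked down toward the cheapest tier until one sticks.
    """
    if total_spent == 0:
        # Non-spender: start mid-low and step down per declined offer.
        return OFFER_TIERS[max(0, 2 - declined_offers)]
    # Spender: offer the cheapest tier above what they have spent so far.
    for tier in OFFER_TIERS:
        if tier > total_spent:
            return tier
    return OFFER_TIERS[-1]

print(next_offer(0, 0))       # fresh player: mid-low tier
print(next_offer(0, 3))       # stubborn non-spender: cheapest tier
print(next_offer(12.50, 0))   # spender: nudged up the ladder
```

The point of the sketch is only the shape of the mechanism: the system searches for each individual player's maximum price point rather than drawing one line through everyone's budget.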
In theory, by using these methods you can over time capture the maximum value of MSP for your product. Except...there is a problem. You aren't the only one who can use variable pricing. So can the consumer! They can react to your actions by adjusting their spend cap up or down, especially in group play environments.
One of the reasons games have so rapidly become a bigger market than competing forms of entertainment is because they give the most pleasure for the cost. They are a cheap fix for that entertainment need on the part of the consumer. But this perception of value is not fixed in stone. As we continue to develop methods of threatening consumers into spending and engaging them in compulsion loops to trick them into spending more than their intended spend cap, we reduce the perceived value of our products with consumers.
Techniques to use data science to capture the MSP are of course sensible from the perspective of the developer, but for the consumer this raises the cost of games and thus reduces the perceived value of our products. Here I don't mean one product. I mean ALL products on the market. The consumer will learn from experience and use inductive reasoning to come to the conclusion that if their last 10 game experiences didn't really impress them as a good deal, then their next experience won't either.
Thus by using data science techniques to improve revenue generation in the short term, we end up lowering the MSP for ALL products from ALL sources on ALL platforms over time. This is what I call the Data Implosion. Of course the natural reaction in such a situation, where your X Trend continues to decline, is to do the “smart thing” and spend more on business intelligence. Which accelerates the Data Implosion.
Here is why. Game development studios don't have infinite budgets to spend on games. The more they allocate to BI and data science, the less they allocate to non-quantitative design and creative aspects of the game. I will just call all of these people and assets “Creatives”. As these Creative assets are lowered in value, they experience layoffs and reduced earnings and respect even when they are working. This translates to lowered morale and productivity.
Conversely, as the very limited number of data scientists are fought over, their perceived and real value skyrockets. So doubling the size of your BI department could quadruple the cost of that department. From a resources view, this creates so much gravity from the BI department that it acts like a black hole, sucking in resources from the rest of the studio.
But the sad truth of the matter is that from the perspective of the consumer, adding more BI adds nothing of value to the consumer. In fact, it results in games costing them a lot more without any improvement in quality. Simultaneously, this trend causes reduction in Creative output, which is where value for the consumer is being generated.
The end result is that a rapid loss of product quality combined with a rapid increase in product cost causes consumers to sharply discount their potential spending budgets for our games. Competing forms of entertainment that were previously perceived as a worse value compared to games, now start to appear competitive again or even superior to games.
The result for us as game developers is that our studios struggle and eventually go out of business. Before they go out of business they will lay off Creatives, which will guarantee they go out of business. You may get hired again, but it is a mathematical certainty that on average your next period of employment will be even shorter than your previous period of employment as all periods of employment mathematically approach zero.
Before that happens, the black hole created by the Data Implosion hits critical mass and detonates, resulting in what we economists affectionately call “A Correction”. It's sort of a special time when everyone loses their jobs and we get to start over fresh with all new possibilities. Creatives, whose value was rapidly approaching zero as the value of data scientists was approaching infinity, will suddenly become valuable again. Commercial data scientists are survivors. They will likely migrate from our industry to the next, just like they did before they became popular in gaming, and start the process all over again.
Here I'm not discounting the value of data science in general, or saying we don't need data scientists, because we do. What I am saying is that their value to industry has been misrepresented by data scientists themselves. And because they are generally smarter than the people hiring them, and have the data to prove it, how can you say otherwise? This puts industry leaders in an awkward position.
You cozy up to the head of a competing studio at GDC and they brag about how big their BI unit is. What are you going to say? “I have more creatives than you do!” He will just laugh at you. You don't want to suffer this sort of humiliation, so you need more BI.
This continues until The Correction, which at this point I am telling you is a mathematical certainty. Even if the heads of every game studio in the world read this paper this week, it would probably still be too late to stop The Correction. They just have too much of their total budgets allocated to assets that do not improve product quality. If they all rushed out to hire Creatives tomorrow, the cost of Creatives would skyrocket, and the value of individual Creatives varies widely anyway. And Creatives have personalities and attitudes, which can be complicated.
After the correction, what we likely are going to end up with is a more balanced blending of Creative and quantitative assets, and successful studio heads will by then have learned how to make effective use of those assets.
The Correction will also significantly reduce competition in the market. This will reverse the X Trend, especially if the focus in development returns to increasing product quality instead of product efficacy. Perhaps by then investors will also have better learned how to identify what products are likely to be successful (as described in my upcoming book – shameless plug!) and we will see a smaller number of higher quality products reaching consumers.
When product quality increases, monetization increases. Always. When monetization “efficacy” increases, the long term effect is lower monetization once perfect competition is achieved. Like now. There is no downside to improved product quality in the short or long term, either to individual products or the industry as a whole. Higher quality breeds anticipation and a larger MSP for everyone in the industry.