
Which of your new users are about to churn?


Shalom Dinur, Blogger

January 27, 2015


The most significant churn occurs right after install, regardless of whether the game is played on a mobile device or on Facebook. The first hours, if not minutes, of gameplay are critical. In this post I will focus on actual gameplay and set aside other factors such as interface, availability of a free trial and tutorial quality, which are widely discussed elsewhere.

The impact of a 10%+ increase in second-day retention is a good impetus for building churn models. From our research and understanding of the user base and game economy, it became apparent that increasing short-term retention was critical in order to retain users and build healthy growth of the long tail of veteran users. Improving long-term retention lifts LTV more efficiently than improving short-term retention, but that long tail of veterans can only grow if new players make it past their first days.

The goal of the churn model is to predict new players' churn one day after they install and play the game.

Churn Definition - Inactivity on the 1st and 2nd days after install.


Rather than waiting for a third day of inactivity to make sure the player has left and will not return, the goal is to predict churn on the day of install.

As for the availability of relevant data for the model, there was no shortage of it in our databases. Placing the inactivity threshold for churn at day 2 was also not a problem, since churn is very prevalent on the second day after install (about 55% on mobile and 75% on Facebook).


We took two weeks of data, with about 4,000 installs a day, and the gameplay data for the first day of each install. We used SQL Server as the database and RStudio for data analysis, exploration, logistic regression building and initial scoring.
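
As a minimal illustration of this setup, the pull into R might look roughly like the sketch below, using an RODBC connection; the server, database, table and column names are hypothetical stand-ins, not our actual schema.

    # Hypothetical pull of first-day gameplay data into R via RODBC.
    # Server, database, table and column names are illustrative only.
    library(RODBC)

    ch <- odbcDriverConnect(
      "driver={SQL Server};server=analytics-db;database=game_events;trusted_connection=true")

    sessions <- sqlQuery(ch, "
      SELECT user_id, install_date, session_length_sec,
             levels_played, games_played, machines_opened, big_wins,
             referral_source, mobile_type
      FROM dbo.first_day_sessions
      WHERE install_date >= DATEADD(day, -14, GETDATE())")

    odbcClose(ch)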

Building the Panel

Using SQL Server, a panel was created based on historical data of new installs and their first-day experience. The data included approximately 50,000 web installs, of which 80% churned within 2 days. Many calculated explanatory variables were added for each user's first play day (a sketch of this step follows the feature list below).
 

Features:

  • Number of sessions

  • Avg session length

  • Total time in game

  • Levels played

  • Games played

  • Machines opened

  • Weekend or weekday install

  • Big wins

  • Referral

  • Mobile type

  • Churn indicator (did the user return in the 2 days following install)
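
To illustrate this step, here is a rough sketch of collapsing the per-session rows into a one-row-per-user panel with dplyr. It assumes the hypothetical 'sessions' pull shown earlier plus a hypothetical 'activity_days' table of per-user, per-day activity after install; column names are assumptions.

    # Collapse raw session rows into one row per user; column names are assumptions.
    library(dplyr)

    panel <- sessions %>%
      group_by(user_id) %>%
      summarise(
        n_sessions      = n(),
        avg_session_len = mean(session_length_sec),
        total_time      = sum(session_length_sec),
        levels_played   = max(levels_played),
        games_played    = max(games_played),
        machines_opened = max(machines_opened),
        big_wins        = sum(big_wins),
        weekend_install = weekdays(as.Date(first(install_date))) %in% c("Saturday", "Sunday"),
        referral        = first(referral_source),
        mobile_type     = first(mobile_type)
      )

    # Churn indicator: no activity on the 1st and 2nd days after install.
    # 'activity_days' is a hypothetical per-user, per-day activity table.
    came_back <- activity_days %>%
      group_by(user_id) %>%
      summarise(returned = any(days_since_install %in% c(1, 2)))

    panel <- panel %>%
      left_join(came_back, by = "user_id") %>%
      mutate(churn = as.integer(is.na(returned) | !returned))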

 

Building and testing the model

We opted for cross-validated logistic regression for its simplicity, ease of use and robustness.
The prediction exercise yielded good results: the AUC ranged from 0.80 to 0.93, depending on the features in the model. Coverage was above 80% of churners, and the false-positive rate was under 20%.
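
A minimal sketch of such a cross-validated fit and out-of-fold AUC check in R, assuming the hypothetical 'panel' built above; the fold count and the exact feature formula are illustrative.

    # 10-fold cross-validated logistic regression with out-of-fold AUC (pROC).
    library(pROC)

    f <- churn ~ total_time + n_sessions + avg_session_len + levels_played +
         games_played + machines_opened + big_wins + weekend_install + mobile_type

    set.seed(42)
    k    <- 10
    fold <- sample(rep(1:k, length.out = nrow(panel)))
    aucs <- numeric(k)

    for (i in 1:k) {
      fit <- glm(f, data = panel[fold != i, ], family = binomial)
      p   <- predict(fit, newdata = panel[fold == i, ], type = "response")
      aucs[i] <- auc(roc(panel$churn[fold == i], p))
    }

    mean(aucs)  # out-of-fold AUC; varies with the feature set used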

The features (variables) that turned out significant (p-value < 0.05) and were used in prediction were: total time in game, weekend or weekday install, big wins, and mobile type.
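
Significance here means the usual Wald p-values reported by the fitted glm; on the hypothetical panel they can be listed like this:

    # Refit on the full panel and keep coefficients with Wald p-value < 0.05.
    full_fit <- glm(f, data = panel, family = binomial)

    coefs <- summary(full_fit)$coefficients
    coefs[coefs[, "Pr(>|z|)"] < 0.05, ]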
 

Model Results
One of the insights gained from this work was that first-day churn is a user behavior that can be predicted rather accurately. Another useful observation was that bugs QA did not detect can be identified (e.g. specific mobile versions on which the app does not run well, or crashes when Facebook loads on specific Flash versions). The predictive features that ultimately turned out to be significant were mostly first-day usage-time variables. Naturally, app usage turning out to be most significant has many possible causes: game design, game economy, monetization offers, technical issues, etc.

The underlying reasons notwithstanding, we are able to pick out, with high probability, those users who will not return the next day (and this was the goal).
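
Concretely, this means scoring each day's new installs with the fitted model and flagging those above a probability cutoff. A sketch, where 'todays_installs' and the 0.8 threshold are illustrative choices:

    # Score today's installs and flag likely second-day churners for intervention.
    # 'todays_installs' is a hypothetical data frame with the same feature columns.
    todays_installs$p_churn <- predict(full_fit, newdata = todays_installs,
                                       type = "response")

    likely_churners <- subset(todays_installs, p_churn > 0.8,
                              select = c(user_id, p_churn))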

How did we act upon this information?

Intervention was performed via emails and notifications sent the day after install, with a specific message for these predicted second-day churners. The message included a special bonus and a unique experience that changed from time to time. Response rates were rather low, but about 10% of the targeted churners did return as a result of these actions.

Another intervention was to identify which mobile versions showed statistically significantly lower second-day retention rates and tackle those versions: first, immediately reduce marketing on the indicated target groups; second, find and fix the associated technical issues. After the detected technical issues were fixed, marketing spend was reallocated to the previously faulty mobile version groups.
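
Flagging suspect mobile versions can be done with a simple proportion test of each version's second-day retention against the overall rate. A rough sketch on the hypothetical panel, with 'mobile_type' standing in for the exact OS/device version field:

    # Per-version second-day retention vs. the pooled rate (proportion test).
    returned_by_ver <- tapply(1 - panel$churn, panel$mobile_type, sum)
    installs_by_ver <- tapply(panel$churn, panel$mobile_type, length)
    overall_rate    <- 1 - mean(panel$churn)

    for (v in names(installs_by_ver)) {
      test <- prop.test(returned_by_ver[v], installs_by_ver[v], p = overall_rate)
      if (test$p.value < 0.05 && test$estimate < overall_rate) {
        cat(v, ": retention", round(test$estimate, 3),
            "is significantly below the overall", round(overall_rate, 3), "\n")
      }
    }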
