
Large-scale, fine-grained push notifications with Amazon Web Services

Pablo Varela of Plumbee describes how to efficiently send large numbers of push notifications -- and have your players thank you for it!

Plumbee Games, Blogger

May 7, 2014

13 Min Read

One of the most popular parts of our Mirrorball Slots free-to-play mobile game is Challenges. During a Challenge, players can collect Symbols during non-winning spins, and the more Symbols a player collects, the more Mirrorballs she can earn. After the Challenge ends, the player can redeem her Mirrorballs for credits for a limited time.
 

After we added Challenges to our mobile app, our players asked us to remind them to come back to play, so they wouldn’t miss out on the chance to earn more credits. To do this, we needed to introduce mobile push notifications to the app. However, we felt that app developers often make the mistake of sending too many push notifications. We wanted to ensure that we had a lot of control over what we were sending, to ensure that the notifications were appreciated by our players and not just considered spam.
 

The push notifications had to be fine-grained so that we could target the right players with the right message at the right time. We also wanted to make it easy for our teams to target push notifications based on any information available in our analytics systems. In addition to this, the system had to scale linearly to support millions of players and messages.
 

In this blog post we will describe our solution for these requirements, which combines Amazon Simple Workflow, Amazon Simple Notification Service, and Amazon Redshift to send push notifications to millions of players divided into hundreds of arbitrarily-defined segments.

Push notifications with Amazon SNS
 

To send each mobile push notification we use Amazon Simple Notification Service (SNS). SNS abstracts away the details of handling push notifications on different platforms (iOS, Google Android, and Kindle Android) and allows us to reliably send hundreds of thousands of push notifications at the same time. The following is an example of a call to SNS from our servers, using the AWS SDK for Java:
 

// Publish a platform-specific JSON payload to the SNS endpoint ARN
// registered for this player's device.
private void publishMessage(UserData userData, String jsonPayload) {
    amazonSNS.publish(new PublishRequest()
            .withTargetArn(userData.getEndpoint())
            .withMessageStructure("json")
            .withMessage(jsonPayload));
}


where the jsonPayload would be something like:
 

{
  "default": "The 5 day Halloween Challenge has started today! Touch to play now!",
  "APNS": "{\"aps\":{\"alert\":\"Halloween Challenge has started!\",\"sound\":\"default\"},\"id\":\"XXXX\",\"tag\":\"challenge-start\"}"
}


The structure of this payload allows us to specify platform-specific features (e.g. sounds) while maintaining a default message if the target platform does not support these features.
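For illustration, a payload like this could be assembled in code before calling publishMessage. The sketch below uses Jackson purely as an example (we don't show our actual payload-building code here); the key point is that SNS expects each platform-specific entry to be a JSON document encoded as a string, hence the nested serialization.

// Illustrative only: build the layered SNS message, with a plain-text
// "default" and an APNS-specific payload serialized as a nested JSON string.
private String buildPayload(String defaultMessage, String apnsAlert)
        throws JsonProcessingException {
    ObjectMapper mapper = new ObjectMapper();

    Map<String, Object> aps = new HashMap<>();
    aps.put("alert", apnsAlert);
    aps.put("sound", "default");

    Map<String, Object> apnsPayload = new HashMap<>();
    apnsPayload.put("aps", aps);
    apnsPayload.put("tag", "challenge-start");

    Map<String, String> message = new HashMap<>();
    message.put("default", defaultMessage);
    message.put("APNS", mapper.writeValueAsString(apnsPayload)); // nested JSON string

    return mapper.writeValueAsString(message);
}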

Collecting devices
 

To be able to send push notifications, we need to obtain a reference to the player’s device. This happens via a device registration process that is triggered when a player agrees to accept push notifications from our application. Our servers handle this request by calling into SNS to retrieve a unique identifier for the player’s device, which Amazon calls an Amazon Resource Name or ARN.

 

// Register the raw device token with SNS and get back a platform endpoint ARN
// that we can later publish to directly.
private String getArnForDeviceEndpoint(
        String platformApplicationArn,
        String deviceToken) {
    CreatePlatformEndpointRequest request =
            new CreatePlatformEndpointRequest()
                .withPlatformApplicationArn(platformApplicationArn)
                .withToken(deviceToken);
    CreatePlatformEndpointResult result =
            snsClient.createPlatformEndpoint(request);
    return result.getEndpointArn();
}


We forward the ARN together with our own unique identifier for that user to our analytics systems. In our case we use Amazon Simple Queue Service (SQS) to store log events until they are processed by our analytics systems, which is done in our example code below by the call to sqsLogger.queueMessage.
 

private String registerEndpointForApplicationAndPlatform(
        final long plumbeeUid, String platformARN, String platformToken) {
    final String deviceEndpointARN =
            getArnForDeviceEndpoint(platformARN, platformToken);
    // platformName (e.g. "apns") identifies the push platform; here it is
    // assumed to be a field of the enclosing class.
    sqsLogger.queueMessage(new HashMap<String, Object>() {{
        put("notification", "register");
        put("plumbeeUid", plumbeeUid);
        put("provider", platformName);
        put("endpoint", deviceEndpointARN);
    }}, null);
    return deviceEndpointARN;
}


The log event produced by the above will look something like this:
 

{
  "sId": "2fa5e687-c692-4c60-8bf9-19d3b3bef30f",
  "timeStamp": 1381853084268,
  "metadata": {
    "plumbeeUid": XXXXXXX,
    "notification": "register",
    "provider": "apns",
    "endpoint": "arn:aws:sns:us-east-1:XXXXXXXXX:endpoint/APNS/MIRRORBALL_SLOTS_CI/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
  }
}


Once the ARN is stored in our analytics systems, it can be retrieved later as part of the push notification processing in order to send a push notification directly to that specific device.

Targeting
 

As part of our analytics systems, we were already using Amazon Redshift, Amazon’s analytical database service, to store a variety of commonly and heavily accessed data aggregations. This made user data easily and flexibly available, so we decided to simply use SQL queries against Redshift to select any segment of players. We trained our product managers and marketing team to use SQL, something they were happy to learn once they realized the power it gave them.
 

Here is an example of a typical query our marketing team might create to select a segment of players:
 

-- All the players that spend between 6h and 9h UTC on
-- Mondays and didn't spend in the last 7 days

SELECT plumbee_uid, arn
FROM user_spending_activity
WHERE time_slice = 3
AND plumbee_uid IN (
    SELECT plumbee_uid
    FROM user_metrics
    WHERE last_purchase_time < DATEADD(day, -7, GETDATE())
);


The time_slice referenced here is a pre-prepared field that divides the day into blocks, so that we can send a notification to a user when they are most likely to receive it.
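As an illustration only (the aggregation job that builds this field is not shown here), a time_slice could be derived by dividing the day into 3-hour blocks, 1-indexed, so that activity between 06:00 and 09:00 UTC falls into slice 3. The table and column names below are placeholders:

-- Hypothetical derivation of time_slice: 3-hour blocks, 1-indexed
SELECT plumbee_uid,
       FLOOR(EXTRACT(hour FROM spin_time) / 3) + 1 AS time_slice
FROM spin_events;  -- spin_events / spin_time are illustrative names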

Sending millions of push notifications
 

Push notifications are edited by product managers and the marketing team via a Google Spreadsheet. Users fill in columns in the spreadsheet specifying the message to be sent, the SQL query defining the segment to which it should be sent, and when the message should be sent.
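For illustration, a single row of that spreadsheet might carry values along these lines (the column names are just shorthand here, not a prescribed format):

message:       "Halloween Challenge has started!"
segment query: SELECT plumbee_uid, arn FROM user_spending_activity WHERE time_slice = 3
send at:       2013-10-28 18:00 UTC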
 

At regular times each day, a scheduler starts the push notification process. We use our build system, Jenkins, to act as the scheduler, but this could be replaced by a simple cron job or any other scheduling tool. The scheduled process reads the configuration stored in the Google Spreadsheet, and creates a workflow within Amazon Simple Workflow (SWF) for each notification that should be sent in that timeslot. SWF orchestrates the sending of each batch of push notifications, so that we can distribute the process across many machines while also reliably and durably maintaining its execution state.
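As a rough sketch of what the scheduler does for each spreadsheet row, starting a workflow with the AWS SDK for Java looks something like the following. The domain, workflow type, and task list names here are hypothetical, not taken from our production setup:

private void scheduleNotificationWorkflow(AmazonSimpleWorkflow swf,
                                          String notificationConfigJson) {
    // notificationConfigJson carries the message, targeting query and
    // timeslot read from the spreadsheet row.
    swf.startWorkflowExecution(new StartWorkflowExecutionRequest()
            .withDomain("push-notifications")                       // hypothetical domain
            .withWorkflowId("push-" + UUID.randomUUID())
            .withWorkflowType(new WorkflowType()
                    .withName("SendPushNotificationWorkflow")        // hypothetical type
                    .withVersion("1.0"))
            .withTaskList(new TaskList().withName("push-deciders"))  // hypothetical task list
            .withInput(notificationConfigJson));
}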
 

The workflow for each notification performs the following steps:
 

  1. Run the SQL query specifying the target group on Redshift and use Redshift's UNLOAD command to export the results as files in Amazon S3. We tell Redshift to use a fixed column width so that we know how much space each row takes in the file, and can therefore skip directly to a particular user in the file (a sketch of such an UNLOAD follows this list).

  2. Obtain the list of files from S3. Divide the target group of players into equal-sized batches. Create and start a number of child workflows, dividing the batches equally among them.

  3. Each child workflow

    a. Reads a batch of ARNs from the files in S3. Here we make use of the fixed column width to skip directly to the start of the batch in the correct file; ARNs are then read sequentially from that point (see the ranged-read sketch after this list).

    b. Sends a push notification to each ARN in the batch.

    c. Repeats steps (a) and (b) until its batches are exhausted.

  4. Once all child workflows have completed, the parent workflow deletes the query results from S3 to clean up.

  5. Metrics reported by all child workflows to the parent workflow are aggregated and published to Amazon’s Simple Email Service (SES), which sends an email to us with the details.
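To make step 1 concrete, an UNLOAD along the following lines exports the segment as fixed-width files in S3. The bucket path, IAM role, and column widths here are placeholders rather than our production values:

-- Sketch of step 1: export the segment with fixed column widths
UNLOAD ('SELECT plumbee_uid, arn FROM user_spending_activity WHERE time_slice = 3')
TO 's3://example-bucket/push-notifications/halloween/part_'
CREDENTIALS 'aws_iam_role=arn:aws:iam::XXXXXXXXXXXX:role/RedshiftUnloadRole'
FIXEDWIDTH 'plumbee_uid:20,arn:150'
ALLOWOVERWRITE;

For step 3(a), a child workflow can turn its batch number into a byte offset and fetch only that slice of a file with an S3 range request. The helper below is an illustrative sketch rather than our production code, assuming each unloaded row has a known fixed length including its trailing newline:

// Illustrative sketch of step 3(a): read one batch of ARNs via a ranged GET.
private List<String> readArnBatch(AmazonS3 s3, String bucket, String key,
                                  int batchIndex, int batchSize, int rowLength)
        throws IOException {
    long start = (long) batchIndex * batchSize * rowLength;
    long end = start + (long) batchSize * rowLength - 1;
    S3Object object = s3.getObject(
            new GetObjectRequest(bucket, key).withRange(start, end));

    List<String> arns = new ArrayList<>();
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(object.getObjectContent(), StandardCharsets.UTF_8))) {
        String line;
        while ((line = reader.readLine()) != null) {
            // The ARN occupies a fixed slice of each row after the uid column;
            // UID_WIDTH is assumed to be a class constant matching the UNLOAD spec.
            arns.add(line.substring(UID_WIDTH).trim());
        }
    }
    return arns;
}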
     

The above steps are illustrated in the diagram below.
 


The reporting email we receive looks something like this:
 

The following message has been sent to NNNNNN devices in m min, ss sec:

       "The 5 day Halloween Challenge has started today! Touch to play now!"

 

Targeting query:

SELECT m_arn.plumbeeuid, m_arn.endpoint FROM mobile.lu_user m_user, mobile.lu_notifications_arn m_arn WHERE m_user.plumbeeuid = m_arn.plumbeeuid AND country_code IN (:america) GROUP BY m_arn.plumbeeuid, m_arn.endpoint

 

Workflow Execution Stats
       * number of devices targeted: NNNNNN
       * number of notifications sent: NNNNNN
       * number of notifications that failed (excluding disabled devices): 0
       * number of disabled devices: NNN


Results

As mentioned at the beginning of this post, one of the key use cases for our push notifications was to remind our players about our timed Challenges feature. Below, we show what happened to second-day retention when we started sending push notifications to players on day 1 of a Challenge (left), the final day of a Challenge (middle), and on the Mirrorball redemption day (right): the number of people playing the next day increased by up to 8 times.




 

In summary
 

If used thoughtfully, mobile push notifications can have a great impact on a game’s metrics -- and they can even be appreciated by players instead of discarded as spam. At Plumbee we were able to use the information we’d already collected in Redshift, together with a scalable architecture enabled by Simple Workflow and SNS, to create a meaningful and positive experience with push notifications for our players. Since Amazon Web Services costs scale with your usage, this kind of sophisticated approach is available even to the smallest of startups. We hope other game developers will try it too!
 

- Pablo Varela, systems engineer at Plumbee
