Design & Implementation of Mobile Analytics

There is a better way to make your mobile analytics far more effective and to dramatically reduce your integration time!

Further, I illustrate why the best person to manage analytics not only understands metrics but also has good product sense and a deep understanding of the mobile app itself.

Read on...

BACKGROUND:

I've had a really great opportunity since the beginning of this year to work with 5 different mobile gaming companies on various short-term consulting projects, typically around mobile game monetization and optimization. Without breaking my confidentiality agreements, I can say that these companies ran the gamut in genre (from super casual to hardcore) and success (top 10 to floundering).

Most of the companies we worked with were fairly well resourced, a typical characteristic of companies able to afford external consulting fees. Surprisingly, analytics and reporting were almost always an afterthought, left to the very end by all of them.

In a couple of cases, I would see a massive list of metrics in an Excel spreadsheet or Google Doc. However, once the products went live, we found that most of the analytics weren't instrumented. Hence, we were left with just the typical monetization and retention KPIs that come standard on most third-party dashboards.

DESIGN:

Let's begin with the design of your analytics and what you should be thinking about.

First of all, analytics should start at the very beginning of game development. Right after a game design document of some kind is completed, you should start building hypotheses about the app.

We start by developing key hypotheses on what will determine success or failure for the app and what key risks could kill us. In addition, we should define the objectives for the app. Think of this as an initial "MVR" (Minimum Viable Reporting) set of the most critical measures to help us validate the hypotheses.
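
To make this concrete, here is a rough sketch of what an initial MVR set might look like if written down as data; every hypothesis and metric below is an invented placeholder, not a recommendation.

```kotlin
// Rough sketch of a "Minimum Viable Reporting" set: each key hypothesis paired
// with the smallest set of measures that could validate or kill it.
// Every entry below is an invented placeholder.
data class MvrEntry(val hypothesis: String, val measures: List<String>)

val mvr = listOf(
    MvrEntry(
        hypothesis = "The brand lowers user acquisition costs",
        measures = listOf("eCPI", "organics per day", "organics as % of installs")
    ),
    MvrEntry(
        hypothesis = "The new battle system improves retention",
        measures = listOf("d1/d7/d30 retention", "avg session length")
    )
)
```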

So ask yourself the following key questions:

  1. Bases of Competition: What are the key bases of competition for this app? What do we think will determine success or failure for us? Why will a user pick this app over another similar app?

    1. Example: If this is a branded card battle game whose +1 is a turn-based battle system, then potential ways of thinking about the success or failure of the game would be:

      1. Brand: Strength of the brand to lower user acquisition costs

      2. Improved (hopefully) Gameplay: How well turn-based battle resonates with users relative to standard automated async battle systems

  2. Key Risks: What are the top 3-5 risks in this game? This can coincide with the +1 but should also include any other key risks for the product

    1. Monetization/retention?

    2. Game design?

    3. Platform or technology risk?

    4. Server side stability/performance (especially for games with real time component)?

    5. User reaction to a new feature?

    6. Content capacity/ability to scale content?

    7. etc, etc

  3. Objectives: How do we measure success? For most companies this should be profitability, but as we've seen in the current market, some larger companies have other objectives.

    1. Profitability?

    2. Reach/Ranking?

    3. Revenue?

So let's go through a hypothetical example.

Hypothetical Mobile Game Product: Justice League Rush

  • Concept: Take tower defense gameplay like Kingdom Rush Frontiers and +1 it

  • +1 design parameters:

    • Brand: Branded gameplay using DC Comics/Warner Bros Justice League brand

      • Hypothesis: The Justice League brand will 1. enable a significant Apple feature, 2. dramatically decrease overall user acquisition costs, and 3. generate significant organic user traffic so long as the game charts

    • Multi-hero: Increase focus on heroes by enabling the user to control up to 3 heroes at a time

      • Hypothesis: Multiple hero control will motivate users to spend more money on hero upgrades/items and cause users to invest more emotionally into the heroes in the game without significantly reducing gameplay time, sessions, etc. due to increased micro (Note: I don't actually believe this).

    • Social: Promote greater player-to-player and friend interaction by enabling the user to call reinforcements from a set of friends, similar (somewhat) to what's done in Puzzle & Dragons

      • Hypothesis: We should see a significant increase in user retention based on light user interactions, increased notification activity, and messages/other social activity.

    • Hybrid Payment Model: Use a Candy Crush Saga style payment model where the game is initially F2P but hard gates appear every 15-20 levels or so, unlocked by payment or social actions (a rough sketch of such a gate follows after this list).

      • Hypothesis: This model will dramatically increase overall game revenue vs. a paid model as we expect a high payer conversion rate (the Candy Crush Saga effect). This model should also complement social features to help increase the overall user base.
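
To make the gate mechanic concrete, here is a minimal sketch of how such a gate might be modeled; the every-15-levels interval, the names, and the unlock options are illustrative assumptions, not a claim about how Candy Crush Saga implements its gates.

```kotlin
// Hypothetical hard-gate model -- names and the every-15-levels interval are invented.
enum class UnlockMethod { PAYMENT, SOCIAL }

data class HardGate(
    val gateId: Int,
    val afterLevel: Int,
    var unlockedBy: UnlockMethod? = null
)

// Place a gate every `interval` levels; each gate opens via an IAP or via friend help.
fun buildGates(totalLevels: Int, interval: Int = 15): List<HardGate> =
    (interval..totalLevels step interval)
        .mapIndexed { i, level -> HardGate(gateId = i + 1, afterLevel = level) }

// Record how the gate was opened -- this is what later feeds the
// "% paid vs. social hard gate unlock" metric.
fun unlock(gate: HardGate, method: UnlockMethod) {
    gate.unlockedBy = method
}
```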

Whether you believe the hypotheses or not, you should stake a claim one way or the other and test it. Now that we have defined the game concept let's proceed with the key questions:

  1. Bases of Competition: The key differentiating aspects of the game should be the Brand and Payment model. Having said that, let's step through each of the +1 features here and develop some views and test metrics around these features:

    1. Brand:

      1. Justice League brand should drive a large number of organics:

        1. Test: Organics per day (and by chart position), Organics as a % of installs

      2. Justice League brand should increase CTR for ads. Potential to shift UA focus to CPC rather than CPI:

        1. Test: mobile ad CTR, CPI of our CPC campaigns

      3. Justice League brand should reduce overall user acquisition costs:

        1. Test: measure eCPI relative to other games

    2. Multi-hero:

      1. Usage of multiple heroes will compel users to upgrade more heroes and to attach themselves emotionally to the heroes, leading to increased monetization from heroes

        1. Test: Check ARPDAU contribution for hero upgrades/items. Fish around for similar numbers from other games to compare with.

      2. Increased micro will not decrease gameplay usage. This should actually be a fairly major concern, as I strongly believe that micro usually doesn't work for mobile games.

        1. Test: Avg sessions, Avg session length, Time spent in app per active user, d1/7/30 retention

      3. Overall monetization should increase (without a corresponding decrease in retention or usage)

        1. Test: Overall ARPDAU, LTV

    3. Social:

      1. Social features will increase retention of the game by: 1. generating additional notifications that people actually want to see, 2. creating social connections with other users, and 3. creating rivalries

        1. Test: d1/7/30 retention, notification click through %, notifications clicked per user, social invitations sent per user, % paid vs. social hard gate unlock, number of friends per active user, etc.

    4. Hybrid Payment Model:

      1. An incremental paid app model like Candy Crush Saga's should increase overall monetization relative to a paid or purely free model for this kind of skill-based game

        1. Test: Weigh ARPU, social installs from hard gates, and user-base expansion (from being free) x ARPU against a paid-only model, e.g., by comparison to Kingdom Rush Frontiers.

  2. Key Risks:

    1. Multi-Hero Feature Micro: Concern over whether too much micro in a mobile game can create a poor mobile game experience for the user

      1. Covered above

    2. Brand Transfer: Concern over whether the Justice League brand translates to a tower defense game and whether the Justice League audience transfers well to the tower defense genre.

      1. Test: d1/7/30 retention, session time, # sessions relative to other tower defense games. Admittedly, this risk shouldn't be addressed by analytics after the fact, but we should try to measure it nonetheless.

    3. Level Design, Balancing & Tuning: Concern that the success of games like Kingdom Rush Frontiers stems from having a well balanced game with interesting level designs.

      1. Test: Progression funnel by map, days to map (e.g., how many days does it take to get to map 1, 2, 3, etc.), sessions per map

  3. Objectives:

    1. Profitability: which means eCPI < LTV (a minimal sketch of this check follows after this list)

      1. Monetization/LTV proxies: d1/7/30 first-time buyer conversion, 30-day ARPU, ARPDAU, % paying users

      2. Retention: d1/7/30

      3. UA: CPI, organics/day (and by chart position)
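
To make the profitability objective concrete, here is a minimal sketch of the eCPI < LTV check; the field names and numbers are invented, and 30-day ARPU stands in as a crude LTV proxy.

```kotlin
// Hypothetical profitability check (eCPI < LTV). All figures are invented.
data class UaSummary(val adSpend: Double, val paidInstalls: Int, val organicInstalls: Int)
data class MonetizationSummary(val revenue30d: Double, val installsInCohort: Int)

// Effective CPI spreads total spend across all installs, so strong organics lower it.
fun eCpi(ua: UaSummary): Double =
    ua.adSpend / (ua.paidInstalls + ua.organicInstalls)

// Crude LTV proxy: 30-day ARPU for the install cohort.
fun ltvProxy(m: MonetizationSummary): Double =
    m.revenue30d / m.installsInCohort

fun main() {
    val ua = UaSummary(adSpend = 50_000.0, paidInstalls = 20_000, organicInstalls = 30_000)
    val mon = MonetizationSummary(revenue30d = 90_000.0, installsInCohort = 50_000)
    val ecpi = eCpi(ua)        // 1.00
    val ltv = ltvProxy(mon)    // 1.80
    println("eCPI = $ecpi, LTV proxy = $ltv, profitable = ${ecpi < ltv}")
}
```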

This set of analytics should comprise the key set of initial analytics that we develop.

Beyond this initial set, however, we should define other analytics for specific applications. For example:

  1. Audit

  2. User Acquisition

  3. Optimization

  4. Anti-Hacking

This process is described in an earlier post of mine here: Introduction to Mobile Analytics & Reporting.

IMPLEMENTATION:

With the analytics design complete, we are now ready to implement the analytics.

I've seen analytics integration efforts take weeks. This shouldn't be the case.

There are 2 tools I've developed that I believe significantly help developers and analytics managers get on the same page and reduce time spent on coordination and integration:

  1. An Application/Role/Metrics table (or just "ARM Table") as described in my earlier post

    1. This table should provide an overall roadmap to developers about what needs to be instrumented and reported by application and by role (e.g., specific dashboard views)

  2. An Analytics Events & Properties table which I describe below

    1. This defines what needs to be instrumented and helps translate between coders and analytics managers

The Analytics Events & Properties Table (or just "EP Table") should contain just the following information:

  • Event Name | Properties | Code Name (The name used in code and sent up to the analytics service) | Notes

Here is an example of such an EP Table that an analytics manager would fill out and then hand over to dev:
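
A few hypothetical rows, expressed as data rather than a spreadsheet, might look like the sketch below; every event name, property, and code name is invented for illustration and would differ per game.

```kotlin
// Hypothetical EP Table rows. Each row maps a human-readable event to the exact
// name sent to the analytics service, so nobody has to dig through source later.
data class EpRow(
    val eventName: String,        // name the analytics manager uses in dashboards
    val properties: List<String>, // properties attached to the event
    val codeName: String,         // name used in code / sent to the analytics service
    val notes: String
)

val epTable = listOf(
    EpRow("Level Complete", listOf("level_id", "stars", "duration_sec"), "level_complete", "Fires once per level clear"),
    EpRow("Hero Upgrade", listOf("hero_id", "new_level", "currency_spent"), "hero_upgrade", "Supports the multi-hero hypothesis"),
    EpRow("Hard Gate Hit", listOf("gate_id", "unlock_method"), "hard_gate_hit", "unlock_method = paid | social")
)
```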

The overall process should work like this:

  1. Hand-off: The analytics manager (or whoever is responsible for analytics) defines the ARM Table and EP Table and hands off to game developers

  2. Instrumentation: Developers will study the tables and, as they work on the game code (see the sketch after this list):

    1. Instrument third party analytics where possible based on the EP Table and fill in the EP Table

    2. Create proprietary reporting tools based on whatever can't be handled by third party analytics

  3. Configuration: The analytics manager takes the completed EP Table back and completes configuration and dashboard views on the third-party analytics sites. Further, they check any reporting tools to make sure that all reporting is accurate and complete.
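
Here is a minimal sketch of what the instrumentation step might look like in game code, assuming a thin wrapper around whichever analytics SDK is in use; the Analytics object, the track() call, and the hard_gate_hit event are hypothetical, not any specific vendor's API.

```kotlin
// Hypothetical instrumentation helper -- a thin wrapper so the code names from
// the EP Table stay in one place. The forwarding calls below are placeholders;
// the real call depends on the analytics vendor (Leanplum, Flurry, etc.).
object Analytics {
    fun track(codeName: String, properties: Map<String, Any> = emptyMap()) {
        // ThirdPartySdk.track(codeName, properties)   // third-party analytics
        // ProprietaryLogger.log(codeName, properties) // in-house reporting pipeline
    }
}

// Instrumenting the hypothetical "Hard Gate Hit" row from the EP Table:
fun onHardGateReached(gateId: Int, unlockMethod: String) {
    Analytics.track(
        "hard_gate_hit",
        mapOf("gate_id" to gateId, "unlock_method" to unlockMethod)
    )
}
```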

Once the analytics manager has the EP Table back, configuring analytics is easy. Event names and property names are well defined, and there's no need to bother a developer to dig through code to figure out what an event or property was called.

Analytics Configuration Screenshot (From Leanplum)

Following this methodology should accomplish the following primary goals:

  1. Focus: Help focus team on key analytics that are the most relevant to game success/failure

  2. Reduce Time: Reduce integration and coordination time between PMs/producers/analysts and developers, taking integration time from weeks to days or even hours

  3. Day 1 Launch: With planning ahead of time and analytics integrated during development, reporting should be available from day 1 when the app launches, and no one needs to scramble to figure things out after it's too late

Good luck! If you have a better way of doing this please let me know.
