


How A/B Testing Boosted In-App Purchase Conversion By 124% And More

This post will show you 3 specific ways that A/B testing helped an app improve conversions and revenue.

A/B testing an app with Zac

About the author: Zac Aghion is the co-founder of Splitforce, where he leads strategic development and manages day-to-day operations.

He was previously head of marketing at the China Economic Review. He speaks French and Chinese, and geeks out on onigiri.

Spot the Difference With Friends! is a kids’ game that is popular in the UK, Singapore, the Philippines, and Malaysia. It is produced and published by ReignDesign, a Shanghai-based app studio.

Players can challenge their Facebook friends to spot the difference between two pictures. To make things even more challenging, there is also a time limit.


They can compete with the larger community by trying to get to the top of the overall leaderboard. This adds even more engagement to the game.


Since this is a free game that monetizes through in-app purchases and ads, ReignDesign wanted to figure out how to increase their revenue by optimizing gameplay. This post will show you exactly what we did to help them tune three aspects of their game.

ReignDesign worked with us to run a series of A/B tests on in-app elements controlling the user experience and rules of Spot the Difference With Friends! We tested purchase funnel optimization, game level difficulty, and time penalties.

Here is what we discovered.

Driving More In-App Purchase (IAP) Conversions By Optimizing The Funnel

With the increasing popularity of free-to-play and users’ clear preference for ad-free apps, IAPs have become a popular way for game and app publishers to earn revenue.

In this game, new players are given 6 stars, which act as a virtual currency that is spent during gameplay. One star is spent for each round of play. When players run out of stars, they can no longer play the game until they buy more.
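The star economy described above can be sketched in a few lines. This is a hypothetical illustration of the mechanic, not ReignDesign's actual code; the `Player` class and method names are invented for clarity:

```python
# Minimal sketch (hypothetical names) of the star-currency gate described above.
class Player:
    def __init__(self, stars: int = 6):  # new players start with 6 stars
        self.stars = stars

    def start_round(self) -> bool:
        """Spend one star to play a round; fail if the player has none left."""
        if self.stars == 0:
            return False  # out of stars: this is where the purchase funnel begins
        self.stars -= 1   # one star is spent per round of play
        return True

p = Player()
plays = sum(p.start_round() for _ in range(8))
print(plays)  # only 6 of the 8 attempts succeed
```

The moment `start_round` returns `False` is exactly the decision point the funnel test below is about: what the app shows the player next.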


Original Design

Initially, ReignDesign had included an in-app notification which encouraged users to go to the in-app store and buy more stars when they tapped on the Play button but had no stars.

This in-app purchase funnel was not performing well, with about 0.5% of users actually converting to paid customers. ReignDesign decided to test another funnel, in which the in-app notification was removed and users were sent directly to the in-app store.


New Design

The new funnel converted free users into paid users at 1.12%, representing a 124% increase in ReignDesign’s IAP conversion rate. This may not seem like a huge difference at first, but across the app’s 1,000+ daily active users, a one-percentage-point increase represents 10 more sales every day.
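A quick back-of-the-envelope check of those numbers, using the conversion rates from the test and the cited 1,000 daily active users:

```python
# Back-of-the-envelope check of the conversion-rate lift described above.
baseline_rate = 0.005   # ~0.5% of users converted through the notification funnel
new_rate = 0.0112       # 1.12% converted when sent straight to the in-app store
dau = 1000              # daily active users (the post cites 1,000+)

relative_lift = (new_rate - baseline_rate) / baseline_rate
extra_sales_per_day = (new_rate - baseline_rate) * dau

print(f"Relative lift: {relative_lift:.0%}")                  # 124%
print(f"Extra sales/day at 1,000 DAU: {extra_sales_per_day:.1f}")
```

The absolute lift of about 0.6 percentage points yields roughly six extra sales per day at this scale, which is why the same relative lift becomes worth much more as the user base grows.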

In addition, as the user base of the game grows, small increases in the conversion rate will become even more significant in terms of revenue.


Optimizing Level Difficulty

When designing a social game, a lot of questions are tough to answer without data. In particular, ReignDesign wasn’t initially sure how to set the progression of level difficulty.

Make the game too easy, and people might get bored. Make it too hard, and people might become frustrated.

Instead of guessing at the optimal difficulty levels, ReignDesign tested three different variations of game difficulty progression:

  • Variation 1: Easy - Medium - Hard (EMH)

  • Variation 2: Easy - Medium - Medium (EMM)

  • Variation 3: Easy - Easy - Medium (EEM)

In order to measure the difference in user behavior when exposed to the different level difficulty progressions, ReignDesign implemented quantified goals to log the total number of levels played by users for each variation.
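The mechanics of a test like this can be sketched as deterministic variant assignment plus goal logging. The following is a generic illustration, not the Splitforce SDK; all names here are invented:

```python
import hashlib
from collections import defaultdict

# Hypothetical sketch of an A/B test with a quantified goal.
VARIANTS = ["EMH", "EMM", "EEM"]  # the three difficulty progressions tested

def assign_variant(user_id: str) -> str:
    """Hash the user id so each user always sees the same variation."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Quantified goal: total levels played, tallied per variant.
levels_played = defaultdict(int)

def log_level_completed(user_id: str) -> None:
    levels_played[assign_variant(user_id)] += 1

for uid in ("u1", "u2", "u3"):
    log_level_completed(uid)
print(dict(levels_played))
```

Hashing the user id (rather than randomizing on every launch) keeps each player in a single cohort for the life of the test, so per-variant totals stay comparable.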


The Result

Users who were exposed to the third variation (Easy - Easy - Medium) played an average of 1.76% more levels than users exposed to the baseline. This had a direct impact on ReignDesign’s ad revenues, since ads are shown between levels of gameplay.

Time Penalties: A Hypothesis Reversed

In the single-player version of ReignDesign’s game, a 3-second time penalty is deducted when players make an error.

The game design team believed that this time penalty would discourage users from playing more in the multiplayer version, but decided to test and see if that was indeed the case.


After running a test comparing time penalty versus no time penalty in the multiplayer version, the data clearly showed that users with a time penalty played 6.3% more rounds than users without.
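A comparison like this comes down to the average rounds played per user in each cohort. The sketch below uses made-up per-user counts purely to show the shape of the calculation; the 6.3% figure in the post comes from the real test data, not these numbers:

```python
from statistics import mean

# Illustrative comparison of two cohorts' average rounds per user.
# The per-user counts below are hypothetical sample data.
rounds_with_penalty = [12, 9, 15, 11, 14]
rounds_without_penalty = [11, 8, 13, 10, 12]

lift = mean(rounds_with_penalty) / mean(rounds_without_penalty) - 1
print(f"Penalty cohort played {lift:.1%} more rounds")
```

In practice you would also want enough users per cohort for the difference to be statistically significant before acting on it.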


In this case, running an A/B test prevented ReignDesign from removing a feature they believed was hurting user engagement. Testing revealed that the time penalty actually improved engagement, a result contrary to their initial business intuition.

The increases in play time as a result of these tests were reflected in additional ad revenue earned from the Spot the Difference With Friends! game since launch.


Getting a game downloaded is just the first step in making it successful. From there, you have to optimize the different elements of the game to make sure that you are getting the most revenue possible.

As the results have shown, sometimes what works goes against your initial impressions. This is why it is important to actually test ideas and let the results determine your direction.

We want to say thank you to Zac at Splitforce for writing this post. If you have any questions, let him know in the comments below.



Written by: Hugh Kimura, Head of Content

Date: June 2014