
App Intelligence · Randy Nelson · December 2015

App Store A/B Testing: The Pros and Cons of the Most Popular Approaches

App store A/B testing is on the rise—here's what you need to know about the strengths and weaknesses of the available approaches.


In this guest post, TestNest CEO and co-founder Nick Kurat takes us through the most prominent types of app store page A/B testing, explaining the pros and cons of each.

A/B testing is becoming increasingly crucial on the app stores, and for good reason: every marketing asset connected to a potential user's decision to download has to be optimized for conversions. Otherwise, you risk wasting your marketing dollars guessing at which creatives are most compelling.

App store A/B testing has come a long way in a very short period of time. At TestNest, we spent a year and a half researching and building tools to speed up this evolution. Today, I want to share what we've learned about the most popular ways to handle app listing page A/B testing and compare the different approaches with one another.

Approach 1: Updating Your App Store Page

The most basic A/B test you can run (but trust me, it's better you don't) is updating your app store page assets, leaving each variation live for about a week, then measuring and comparing the conversion rates of the variations.

This takes up a lot of your time, especially on the Apple App Store, where it takes Apple about a week to approve each update. In addition, you can't guarantee that the sample of page visitors you receive is uniform. You may get a steady stream of visitors for a week, but then a blog might publish an article about your app and send a completely different audience to your page, throwing off the results of the experiment. I honestly can't think of any upsides to this approach.

Approach 2: Facebook Banner Tests


Another way of testing page assets that we've seen used is placing your icon or screenshot creatives into a Facebook banner ad and then measuring the number of clicks it receives. This is a better option than the previous approach because you at least get a proper traffic split between the variants. However, we find it extremely limiting overall: you end up tracking how users react to a banner containing your icon, not to the app listing page itself.

Approach 3: Google Developer Console Experiments


This past May, Google introduced its own A/B testing platform for app listing pages. It's a nifty little tool with its own pros and cons (which I'll go over below), but it's definitely a step forward from testing app pages via successive updates or Facebook banners.

Advantages:

  • Organic Traffic: Because Google integrates A/B testing right into Google Play, you don't have to drive traffic to your listing. Google uses the organic traffic your app page already receives, which lets you A/B test the actual listing page in the Google Play store. This is very cool, but it has a few limitations (more on this below).

  • Google Developer Console Integration: The A/B testing feature is built right into the Google Developer Console, which gives you a few handy extras, such as applying the winning variant to your store listing once the experiment is over.

Disadvantages:

  • No Prelaunch Experiments: Many companies that actively employ A/B testing use it to determine which page assets to launch a new app with. This is very important, since unattractive page elements can cost you through lower conversion rates and higher CPIs, ultimately burning your budget. Unfortunately, Google Play doesn't let you run prelaunch experiments.

  • No Simultaneous Experiments: You also can't run multiple experiments at the same time. This can add a lot of calendar time if you plan to test several aspects of your page, such as your icon, screenshots, and app video.

  • Basic Analytics and Only Four Experiment Types: The A/B testing reports you receive from Google are quite limited, displaying only the number of visits, the number of installs, and the conversion rate. Moreover, you can only run four types of experiments: icon, screenshots, video (with featured image), and description.

  • It Only Works for Google Play: This one's pretty much a given.

Approach 4: Testing via "Fake" App Store Pages


The fourth approach to testing app listing pages is via a "fake" app store page: you build a web landing page that looks exactly like a real store listing, drive traffic to it, and measure the results. Like the other approaches to split testing, it has its pros and cons.
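To make the mechanics concrete, here is a minimal sketch of how such a fake page might bucket visitors into variants and log their actions. Everything in it (the assignVariant and logEvent names, the event types) is an illustrative assumption, not part of any particular testing product.

```typescript
// Sketch: deterministic variant assignment and event logging for a
// "fake" store page test. Hypothetical names, not a real product's API.

type Variant = "A" | "B";

// Bucket a visitor so they always see the same variant on repeat visits.
function assignVariant(visitorId: string): Variant {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return hash % 2 === 0 ? "A" : "B"; // 50/50 split, stable per visitor
}

interface PageEvent {
  visitorId: string;
  variant: Variant;
  action: "page_view" | "screenshot_swipe" | "install_tap";
  timestamp: number;
}

// In a real setup this would POST to your analytics backend.
function logEvent(event: PageEvent): void {
  console.log(JSON.stringify(event));
}

// Example: a visitor lands on the fake listing and taps "Get".
const visitorId = "visitor-12345";
const variant = assignVariant(visitorId);
logEvent({ visitorId, variant, action: "page_view", timestamp: Date.now() });
logEvent({ visitorId, variant, action: "install_tap", timestamp: Date.now() });
```

The only structural requirement is that every visitor is consistently assigned to one variant and that every meaningful interaction is recorded against that variant; the advantages below all follow from having that level of control over the page.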

Advantages:

  • Prelaunch Experiments: Because you're dealing with a web page, the Get/Install button can point to any URL. This is particularly useful when your app hasn't launched on the app stores yet: you can link the Get button to a "coming soon" page with an email capture form. That way you can track how users interact with each page variation without losing the ones who decide they want to download.

  • Many Experiment Types: With a custom web page, you can test pretty much anything on the listing page, from the icon and screenshots to feature graphics, the app name, and the description.

  • Wide Tracking Possibilities: We've discovered that basic A/B testing analytics are not enough to fully understand the behavior of your page visitors. Understanding that behavior is crucial, because otherwise you may be optimizing the wrong link in the chain.

    For instance, you may be optimizing your app store page assets while your Facebook banners are inconsistent with that page. If a user clicks through from the Facebook ad to your store page and finds that the assets there (icon, screenshots, etc.) don't deliver on what the ad promised, the result will be a low conversion rate. In that case, your Facebook banners need to be updated first so they line up with your app's store listing.

    Web pages give you wide tracking possibilities, as their analytics go beyond what you can get from the Google Developer Console experiments or Apple App Store analytics.

  • Simultaneous Experiments: Unlike with Google Play experiments, you can run as many experiments as you want, with as many asset variants as you want. After all, these are just web pages you send traffic to while tracking user actions. Keep in mind, however, that each additional experiment and each additional variation requires more traffic to reach statistically significant results, as the rough estimate sketched after this list shows.
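As a rough guide to how much traffic "statistically significant" implies, here is a back-of-the-envelope sample-size sketch using the standard two-proportion formula at 95% confidence and 80% power. The baseline conversion rate and the lift in the example are made-up numbers for illustration, not benchmarks.

```typescript
// Rough visitors-per-variant estimate for detecting a conversion-rate lift
// with a two-proportion test (two-sided 95% confidence, 80% power).

function sampleSizePerVariant(
  baselineRate: number, // e.g. 0.30 = 30% of visitors tap Get/Install
  expectedRate: number, // e.g. 0.33 = the lift you hope a variant delivers
  zAlpha = 1.96,        // two-sided 95% confidence
  zBeta = 0.84          // 80% power
): number {
  const pBar = (baselineRate + expectedRate) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(
      baselineRate * (1 - baselineRate) + expectedRate * (1 - expectedRate)
    );
  const delta = expectedRate - baselineRate;
  return Math.ceil((numerator * numerator) / (delta * delta));
}

// Detecting a 30% -> 33% lift needs roughly 3,760 visitors per variant.
console.log(sampleSizePerVariant(0.3, 0.33));
```

Every extra variant in an experiment needs its own allotment of visitors on this order, which is why piling on variations without a matching traffic budget just stretches out the test.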

Disadvantages:

  • No Organic (App Store Search) Traffic: Because the experiment landing page isn't integrated into the actual app stores, there is no way to drive native app store search traffic to it. In most cases this turns out to be insignificant: Facebook's targeted ads help you reach the same kind of audience that would normally find your app by searching the store. More advanced testers, however, will probably want to replicate the traffic mix their real store page gets, which is the right balance of organic and paid traffic.

  • App Store Page Displaying Twice: After a visitor taps the Get/Install button, you want them to download the app just as they would in the app store. The problem is that you can't install an app directly from a web link: the link first opens the listing in the native app store, and the user then has to tap Get/Install there to actually download it.

    This can be somewhat nerve-wracking at first (since you'll lose some users who had already decided to download your app), but it's the cost of collecting experiment results. By the time they tap Get/Install, you've already captured all the valuable information about their behavior. Plus, this is an experiment, so you don't have to run all of your traffic through the web page; a few thousand visitors per variant are usually enough to reach statistically significant results, which you can then verify with a simple check like the one below and apply to the real store page.
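To make that significance check concrete, here is a sketch of a simple two-proportion z-test comparing installs per visit for two variants. The visit and install counts are hypothetical example figures, not real experiment data.

```typescript
// Sketch: two-proportion z-test to decide whether a challenger variant's
// conversion rate beats the control's. Counts below are hypothetical.

interface VariantResult {
  visits: number;
  installs: number;
}

function zScore(a: VariantResult, b: VariantResult): number {
  const pA = a.installs / a.visits;
  const pB = b.installs / b.visits;
  const pooled = (a.installs + b.installs) / (a.visits + b.visits);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / a.visits + 1 / b.visits));
  return (pB - pA) / se;
}

const control: VariantResult = { visits: 2500, installs: 750 };    // 30.0%
const challenger: VariantResult = { visits: 2500, installs: 840 }; // 33.6%

const z = zScore(control, challenger);
// |z| > 1.96 corresponds to p < 0.05 for a two-sided test.
console.log(z > 1.96 ? "Challenger wins at 95% confidence" : "Not significant yet");
```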

Conclusion

While A/B testing app store pages is a great idea, not all of the approaches are equally useful. In fact, some can be misleading and drive you towards the wrong conclusions. Each approach has its pros and cons, but we've found that "fake" app store page testing offers the most value with moderate limitations.

If you have an Android app that has already launched and you plan to test graphical assets (icon, screenshots, video), you should definitely check out the Google Play Developer Console. If you're looking for prelaunch testing across both Google Play and Apple's App Store, as well as deeper user behavior analysis, a solution like TestNest is your best option.

I'd love to hear more about your personal experience with app store A/B testing—feel free to share it in the comments below.

Sensor Tower's App Intelligence platform is used by leading mobile publishers and developers to elevate their app store optimization efforts. Interested in learning more? Request a live demo with our team!





Written by: Randy Nelson, Head of Mobile Insights
