
APP STORE OPTIMIZATION · KAREN BISCOPINK · JULY 2015

Case Study: How A/B Testing Can Improve Your App's Conversion Rates

Learn how to A/B test your app's metadata to ensure you're generating the maximum possible conversions.

Image for Skoda's Metadata A/B Testing

Guest Post: This post is written by Eugene Svartsevich (left) who is a member of the Bamboo Apps marketing team, and Matt Krivoshein (right) of Splitmetrics.com. Both Eugene and Matt have extensive experience in mobile app marketing, with a focus on growth-hacking techniques.

Headshots of Guest Post Authors

The Background

In today's digital era, where app developers have to fight for the attention of potential buyers, app store product page optimization (also known as ASO) is becoming increasingly important. Both the Apple App Store and Google Play are marketplaces where the first impression really counts. Along with the app's name and icon, screenshots play a crucial role in driving installs to your mobile app. Well-designed screenshots focus a user's attention, ultimately getting the viewer to click, scroll, and download. If your screenshots are not compelling enough, expect visitors to bounce off of your app page.

All right: you have a team and a few ideas, but how can you be sure which one will really hit the spot? This is where A/B testing comes in, giving you a clearer picture of what converts best for your specific app.

While working on the ŠKODA Little Driver project, a branded car game, we (the Bamboo Apps team) designed a few sets of App Store screenshots but weren't sure which to choose. So we decided to run a simple A/B test to see which set of screenshots would bring in the largest number of app installs.

For this experiment, we used SplitMetrics, which is known for its A/B testing platform capable of tracking App Store user activity.

The Test

Since SplitMetrics is an automated, self-service tool, the only thing we had to do manually was create several sets of screenshots to test. By combining different backgrounds and changing the order of the screenshots, we designed three variations (pictured below):

  • A reordering of our existing App Store screenshots

  • The "lifestyle" design

  • A screenshot with customized background

Image for Skoda's Metadata A/B Testing

The SplitMetrics platform automatically imports all of the metadata from an App Store page (app name, icon, description, etc.) and lets you create a unique landing page for each combination of screenshots. Once the landing pages were ready to go, we received a corresponding experiment link and launched a Facebook ad campaign (aimed at the app's target audience) to drive relevant traffic to each landing page.

Since the experimental landing pages look very similar to the real App Store page, we assume that user behavior in the experiment mirrors actual behavior in the App Store. This means we can use the data to make an informed decision.
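
At its core, this kind of experiment boils down to two things: consistently assigning each visitor to a random variant, and counting visits and installs per variant. The sketch below is not SplitMetrics code; it is a minimal, hypothetical Python illustration of that idea, with made-up variant names and a hash-based bucketing scheme.

```python
import hashlib

# Hypothetical variant names for an experiment like this one.
VARIANTS = ["control", "reordered", "lifestyle", "custom_background"]

# Per-variant counters: how many visitors landed and how many tapped "install".
visits = {v: 0 for v in VARIANTS}
installs = {v: 0 for v in VARIANTS}

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the variants.

    Hashing the visitor ID keeps the assignment stable across repeat
    visits while spreading traffic roughly evenly between variants.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def record_visit(visitor_id: str, installed: bool) -> None:
    """Record one landing-page visit and whether it converted."""
    variant = assign_variant(visitor_id)
    visits[variant] += 1
    if installed:
        installs[variant] += 1

def conversion_rates() -> dict:
    """Return the install rate (installs / visits) for each variant."""
    return {v: (installs[v] / visits[v] if visits[v] else 0.0) for v in VARIANTS}
```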

Image for Skoda's Metadata A/B Testing

The Results

Over the 30 days the experiment ran, we brought in slightly over 1,200 visitors, distributed across the alternative landing pages.

Graph Displaying A/B Testing Results from SplitMetrics

The astonishing part, though, was that simply re-ordering the screenshots showed the best result in terms of conversion. Here are the results we saw for each alternative experience we created:

  • A reordering of our existing App Store screenshots - Reordering the screenshots with "Drive Mode" as the first screenshot saw a 6.4% increase in app installs, as compared to our control group.

  • The "lifestyle" design - This alternative was the weakest performer, with the lowest overall conversion rate at 33.3%.

  • A screenshot with customized background - This alternative outperformed the control group as well, with a 4.4% increase in app installs.

The test with reordered screenshots earned 18 more conversions than the lifestyle designs and 19 more than the screenshots with customized backgrounds. Additionally, while viewing this top-performing alternative, users were 10.3% more likely (compared to the poorest-performing test) to scroll through the entire list of screenshots. The difference between the winning and losing alternatives demonstrated the following for us: screenshot order and design (even something as simple as a screenshot background) can be influential when a customer is deciding whether or not to download an app.
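
When reading lifts like the 6.4% and 4.4% figures above, it is also worth asking whether a gap of a handful of conversions could simply be noise at roughly 300 visitors per landing page. The snippet below is a generic two-proportion z-test sketch, not part of the original analysis; the counts are hypothetical stand-ins, since the case study does not publish exact per-variant figures.

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test for the difference between two conversion rates.

    Returns the z statistic; under the usual normal approximation,
    |z| >= 1.96 corresponds to p < 0.05 (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts that only mirror the scale of this experiment
# (~300 visitors per variant); they are not the real figures.
z = two_proportion_z(conv_a=130, n_a=300,   # e.g. reordered screenshots
                     conv_b=110, n_b=300)   # e.g. control
print(f"z = {z:.2f} (|z| >= 1.96 would indicate significance at the 5% level)")
```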

The Percentage of Users that Scrolled Through Screenshots

Graph Displaying Percentage of Users that Scrolled Through All Screenshots on App Listing

The average screenshot scroll rate was only 17%, which confirmed our hypothesis that the vast majority of app page visitors would only see the first visible element. Our A/B test figures reflect a key app store optimization rule: Put your most important screenshot first!

Another takeaway from the experiment is that the app's name and description are secondary factors when people decide whether or not to download an app. Most of the time, they are not even taken into consideration; only one user in our experiment read the full app description. (Across all of its experiments, SplitMetrics sees only 1% of users reading the full description.)

The Winning Alternatives


Why did the test with reordered screenshots generate the most conversions? The winning variation is clear and self-explanatory: its first screenshot explains what the app does at a glance. The view shows the car's interior, with steering wheel, buttons, and captions. It's not only eye-catching, but also highlights the core purpose of the ŠKODA Little Driver app.

As is evident from the figures above (in which alternatives #2 and #3 drove fewer installs than the reordered set), custom backgrounds and complicated design elements can distract users' attention, or possibly even irritate them!

Conclusion

Based on the results of the ŠKODA Little Driver experiment, we're confident that changing the order and background of our App Store screenshots had a real impact on our conversion rate.

The reality is that most users will never read your full app description, and only a fraction of them will scroll beyond your second screenshot. This puts the weight on the first screenshot visitors see. For that first screenshot, use an image that helps users understand your app immediately, and avoid complicated backgrounds and structures - they may decrease conversion rates. As we saw in the ŠKODA Little Driver experiment, following these recommendations can lead to as much as a 16.6% increase in app installs.

Just keep in mind that there's always room for new discoveries in the conversion optimization space. Even small changes on a product's page can result in significant growth in downloads, and even app store revenue.

Thank you to Eugene of Bamboo Apps and Matt of SplitMetrics for providing this guest post and A/B Testing case study!





Written by: Karen Biscopink, Head of Content

Date: July 2015