My experiences with Play Store Experiments

At Google I/O 2015 we learned that Google had introduced Play Store Experiments, a platform where developers can run A/B tests on their Play Store listings and see which combination of text, icons and screenshots works best.

If you take a look at your app’s user acquisition data, you’ll quickly understand why it is so important to experiment with your app’s metadata. In most cases only 20–40% of store listing visitors download your app. What if you could increase this number? What if you can’t change this number, but you can drive more visitors to your app’s Play Store listing? And what if you can do both?

With this tool you can’t optimize the onboarding experience or boost retention, but you can increase the number of downloads by optimizing your app’s Play Store listing. Small changes can have huge effects, and we all know that users largely decide based on the icon’s design whether to tap on an app at all.

Getting a winner takes some time, especially when you only have a couple of downloads per day, but it can still be a great indicator that it’s time for a change. In the end you’ll be able to increase your downloads, and thus your revenue, by whatever percentage you manage to reach. It’s all up to you.

By playing with Play Store Experiments you get one more thing: the more frequently an app is downloaded, the better it ranks, because frequent downloads tell Google that the app has potential. A better-ranked app also earns Google more revenue from ads and in-app purchases, so this is a win-win for all parties.

Let’s see my experiences with Play Store Experiments:

Experiment 1 – Short Description I

I have an app with about 400–600 daily downloads, and I was curious whether I could get more downloads by changing the Short Description of its Play Store listing.

Current Version: With this eye test you can test your vision at home easily and totally free!
Test Version: 6 Free Eye Tests and a Game that requires Speed and Good Color Vision

[Screenshot: Experiments – Eye test – Google Play Developer Console]

I waited more than a week to be able to decide which version drove more downloads. Results:

[Screenshot: Short Description experiment results – daily installs]

The graphs show how many people installed and uninstalled each version per day, but they don’t give you the totals. I had to sum the numbers manually to find out that during these 11 days the Current Version got 1942 downloads and the Test Version got 1866 downloads.
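As a rough sanity check on these totals, you can treat each install as a coin flip between the two variants (assuming an even 50/50 traffic split for a two-variant test, which matches how the tool splits traffic) and ask whether 1942 vs. 1866 is far from an even split. A minimal sketch in Python:

```python
import math

# Totals summed manually from the 11-day experiment.
current_installs = 1942
test_installs = 1866

# Under the null hypothesis of "no difference" and a 50/50 traffic
# split, each install lands on either variant with probability 0.5.
total = current_installs + test_installs
expected = total / 2
std_dev = math.sqrt(total * 0.5 * 0.5)  # binomial standard deviation

# Normal approximation to the binomial: z-score of the observed split.
z = (current_installs - expected) / std_dev
print(f"z = {z:.2f}")  # |z| < 1.96 -> not significant at the 95% level
```

Here |z| comes out around 1.2, below the conventional 1.96 cutoff, which is consistent with the conclusion below that the two Short Descriptions performed about the same.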

I was surprised that there was no significant difference between the two versions, because I honestly thought the Test Version would be more successful:

  • It includes a number that tells you exactly how many eye tests people get. In other words, it tells them what they’re getting.
  • It also makes the description stand out.
  • The first letters of all the words are capitalized to be more eye-catching.
  • It includes the word Game because people like playing no matter what.
  • It includes the word Free at the beginning of the description to be more upfront.

Result: Despite all this, the test was still successful: it suggested that the Short Description doesn’t really play a role when it comes to downloading an app. To make sure, I ran another experiment.

Experiment 2 – Short Description II

I decided to give the Short Description another go by experimenting with it on another app. This time I chose a social app that we created as a side project, Yolify. Versions:

Current Version: Transform Your Life Into a Sensational Journey
Test Version 1: Over 10,000 Bucket List Ideas In One Place – Complete Them With Your Friends
Test Version 2: #1 Bucket List App on Google Play – 10,000 Ideas To Step Out Of Your Comfort Zone

With this experiment it turned out that the variants are not distributed equally. The current version is served to 50% of visitors, and all the other versions share the remaining 50%. I don’t see the logic behind this, but that’s all we have for now.

We added a large number to both test versions; if an app has 10,000 of something, then it must be good. I also added a very strong marker in Test Version 2: #1. People love when something is No. 1, and we really wanted to highlight this.



Since the versions are not served proportionally, we need to focus on scaled installs. Even though we don’t have enough downloads to declare a definite winner, it looks like the Current Version is preferred by most people, despite all our expectations. This goes against the result of my 1st Experiment: the Short Description does have an effect on the number of installs.
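For reference, a scaled install is simply the raw install count divided by the variant’s audience share, which makes variants with different shares directly comparable. A quick sketch (the 50/25/25 raw counts below are made-up numbers for illustration):

```python
def scaled_installs(installs, audience_share):
    """Scaled installs = raw installs / audience share (Google's definition)."""
    return installs / audience_share

# Google's documented example: a 90% / 10% split with A = 900 and
# B = 200 raw installs scales to A = 1000 vs B = 2000.
print(round(scaled_installs(900, 0.9)))  # 1000
print(round(scaled_installs(200, 0.1)))  # 2000

# A 50% / 25% / 25% split like this experiment (raw counts hypothetical):
for installs, share in [(500, 0.50), (230, 0.25), (210, 0.25)]:
    print(round(scaled_installs(installs, share)))
```

Since every variant is scaled to the same weight, you compare the scaled numbers directly, one variant against another.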

Experiment 3 – Screenshots

It’s 2015 and most developers are still uploading bare screenshots of their apps. You don’t want to be one of those people, right? You need to be different to stand out from the crowd. A beautiful screenshot is a must: you need to make people feel like your app is beautiful, and when something looks awesome, people assume it must be awesome. On the other hand, this is nothing new, so you need to be a bit creative. People don’t have time to read app descriptions and they are lazy as shit, so why not mix the screenshots with some descriptive words? Let the images do the talking and walk people through the core features of the app!

This trick is used by many apps, including RunKeeper, SoundCloud, Swarm and 8fit. You can check them out for inspiration.



How did my experiment go?


I had already used the screenshots with the phone frame and the descriptions in the Current Version, and I wanted to see if the old bare screenshots would perform better. Halfway through, the test version led by a little, and in the end it won. That’s something I did not expect, as I was sure the previous screenshots were more professional and better designed. I guess that’s what happens when you are sure about something.

One thing to know is that the app ranks #2 for my main keyword, which explains the small difference the experiment made.

Experiment 4 – Icon

The first thing people see on Google Play is your app’s icon. It is the most crucial part of driving downloads, because people tap on apps that have stunning, unique and modern icons. Many people decide to download an app based on the icon rather than the description. Your conversion rate depends on your icon! You should always devote enough time to designing it: you can have the best app in the world, but no one will download it if they never tap on it.

So I asked my friends what they thought about the icon of one of my apps, and they told me it was ugly and not attractive at all. I didn’t think it had the best design in the world, but I thought it was OK. It was time to design a new icon and run an experiment to find out who was right (I’m always right, of course, and I wanted to prove it to them).

The three designs:


What’s important to know is that you shouldn’t jump to conclusions before the experiment is finished, because the results can look very different at the end. That’s what happened in this case: Test Version 2 was the absolute winner, even though it wasn’t really different from Test Version 1:



I didn’t wait for the experiment to end, because the current results were enough for me to see that the new design is much better than the old one:


Comparing the three versions, the Current Version performed the best; nevertheless, the two Test Versions look almost exactly the same, which means that while the Current Version got 162 scaled installs, the two Test Versions together got 264. Result: the new icon performs 57% better!

One thing to note is that the Current Version was uninstalled the fewest times, which clearly shows that people don’t like it when the design of the icon doesn’t match the design of the app! So what I need to do now is apply one of the Test Versions and redesign the app to fit the new icon.

Another thing to note: when you don’t have enough downloads to draw conclusions, run only one test version against the current one; otherwise you may end up applying the wrong version. You should always run experiments long enough to achieve statistical significance.
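“Long enough” can be estimated up front. A rough power calculation (the standard normal approximation for a two-proportion test; the 30% base conversion and 10% hoped-for lift below are hypothetical numbers, not from any of these experiments) tells you how many visitors each variant needs before the result means anything:

```python
import math

def visitors_per_variant(base_rate, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough visitors needed per variant to detect a relative lift in
    conversion rate (two-sided 95% confidence, 80% power by default),
    using the standard normal approximation for a two-proportion test."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# e.g. a 30% base listing conversion and a hoped-for 10% relative lift:
print(visitors_per_variant(0.30, 0.10))  # a few thousand visitors per variant
```

At the traffic levels mentioned earlier, that works out to somewhere between several days and a couple of weeks per experiment, so an 11-day run is in the right ballpark only for fairly large lifts.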


Play Store Experiments is a great tool and I highly encourage you to use it. Although you can run only one experiment at a time, you can achieve serious gains in installs by experimenting with your icon, screenshots and descriptions.

If you like this post, please recommend it.

Need more downloads? Select one of my app promotion services and I will promote your app to 30 quality app review sites and millions of people on social websites.


Balint Farago

Entrepreneur, startup enthusiast, gadget fan. I travel a lot and in the meantime I develop and promote mobile apps.

  • León Hernández

    Hi Balint, thanks for the great article; we definitely all have to be using experiments for our apps. I couldn’t help noticing an error in your math regarding the scaled installs. Scaled installs are calculated so you don’t have to worry about the shares (50-25-25 in your case); you just compare the scaled installs, as they all have the same weight. This is from Google’s documentation:

    Scaled installs: # of installs during your experiment divided by audience share.

    For example, if you ran an experiment with two variants that used
    90% / 10% audience shares and the installs for each variant were A = 900
    and B = 200, the scaled installs would be shown as A = 1000 (900/.9)
    and B = 2000 (200/0.1).

    So, actually your new icons performed worse, at least in the short time you kept the experiment running. Hope this helps to craft the article better. This is the app I’m trying to optimize with experiments. What do you think would be the most important thing to test next? (I’ve started with the screenshots, so don’t be surprised if you see the ugly and simple ones) :)

    • BalintFarago

      Hi León. Thanks for noticing the error. You’re right, the new icons performed worse, which is surprising. As for your app, I would definitely test the title, as it’s missing your keyword (blood alcohol/alcohol meter), and Cautoh sounds very strange.