What is Split-Testing and How Do I Use It?

If you are planning to purchase paid media to advertise your mobile application and increase downloads, you will want to spend your dollars wisely.

To get the best bang for your buck, you will want to employ a common technique in the digital ad space called split-testing. Used consistently, this technique can significantly improve your app marketing results.
What is Split-Testing?

Split-testing, also known as A/B testing or bucket testing, is a marketing technique in which you test variations of an ad against one another and compare the results to determine which ad elements work best.

In practice, split-testing means running two competing ads to determine which one produces the better results and is therefore the one to keep trafficking.

To split-test properly, create variations of your ad that change elements such as copy, images, font style and size, colors, or layout. Comparing results between the ads will show whether each change has had a positive or negative impact. Continue with the ad that performs well and pause the ads that perform worse.
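To make that concrete, here is a minimal sketch of how a control ad and a single-element test variant might be described. The field names and values are purely illustrative and are not tied to any particular ad network's API.

```python
# Illustrative only: a control ad and a test variant that changes
# exactly one element, so any difference in results can be attributed
# to that change.

control = {
    "name": "control",
    "headline": "Download the app today",
    "image": "hero_blue.png",
    "cta": "Install Now",
}

# Copy the control and swap out just the image.
test = {**control, "name": "test_image", "image": "hero_red.png"}
```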

Split-testing exists to improve the results of your ad campaign. By understanding what is and is not working in your ads, you can make immediate adjustments that improve performance and, ultimately, your return on investment.


How to Split-Test

To split-test, first start with an ad. Create an ad for your campaign that is compelling and conveys your message in a clear, effective, and powerful manner.

Take a look at this ad and create a variation of one of its elements. Perhaps you will use a different image, change a word in the copy, or bold the call to action. Whatever you do, keep the changes meaningful but to a minimum, as too many changes will make it impossible to tell which modification produced the difference in results.

Run both of your ads with the same configuration on your network and watch the results of both. Let your original (control) and variation (test) ads run for a fair amount of time so you can see how the audience is reacting to them, and make sure a significant portion of the audience has been exposed to each ad before drawing any conclusions. The sketch below gives a rough sense of how much exposure that can take.
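How much exposure counts as "significant" depends on your baseline CTR and the size of the lift you care about detecting. The following is a rough back-of-the-envelope sketch based on a standard two-proportion test at 95% confidence and 80% power; the baseline CTR and target lift are assumptions you would replace with numbers from your own network.

```python
# Rough estimate of how many impressions each ad needs before a CTR
# difference of a given size can be trusted. Assumes a standard
# two-proportion comparison at 95% confidence and 80% power.

def impressions_needed(baseline_ctr: float, relative_lift: float,
                       z_alpha: float = 1.96, z_power: float = 0.84) -> int:
    """Approximate impressions per variant to detect the given relative
    lift over the baseline click-through rate."""
    p = baseline_ctr
    delta = baseline_ctr * relative_lift  # absolute CTR change to detect
    n = ((z_alpha + z_power) ** 2) * 2 * p * (1 - p) / delta ** 2
    return int(round(n))

# Example: a 1% baseline CTR and a hoped-for 20% relative lift (1.0% -> 1.2%)
print(impressions_needed(0.01, 0.20))  # roughly 38,800 impressions per ad
```

The takeaway is simply that small lifts on small click-through rates need a lot of impressions before the difference means anything, which is why you should not call a winner after the first few hours.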

How long you take to decide which ad to continue with depends on the data you are receiving, but you should be able to start making a call within a couple of days.

Once your ads have had some time to live on the network, look at the data gathered during that time to decide which ad is doing better. For the most part, the metric you will be looking at is CTR, or click-through rate. But if your network reports it, the key metric for mobile application advertising should be downloads of your application.

Whichever metric you choose, compare the results of the original ad against the test ad and continue with the one that performs better.
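As an illustration, here is a minimal Python sketch of that comparison. The impression and click counts are made-up numbers standing in for whatever your ad network's reporting shows, and the significance check (a two-proportion z-test at the 95% level) is one common way to avoid declaring a winner on noise, not a feature of any particular network.

```python
import math

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions

def z_score(clicks_a: int, imps_a: int, clicks_b: int, imps_b: int) -> float:
    """Two-proportion z-test statistic for the CTR difference (test vs. control)."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_b - p_a) / se

# Made-up reporting numbers for illustration.
control = {"impressions": 40_000, "clicks": 410}   # CTR ~1.03%
test    = {"impressions": 40_000, "clicks": 505}   # CTR ~1.26%

z = z_score(control["clicks"], control["impressions"],
            test["clicks"], test["impressions"])

print(f"control CTR: {ctr(control['clicks'], control['impressions']):.2%}")
print(f"test CTR:    {ctr(test['clicks'], test['impressions']):.2%}")

# |z| above roughly 1.96 means the gap is unlikely to be chance at the
# 95% level, so the test ad can become the new control.
print("winner:", "test" if z > 1.96 else "keep control (difference not yet conclusive)")
```

If your network reports downloads, you can run the same comparison with installs in place of clicks.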

Depending on the length of your campaign, continue the split-testing process, creating new variations of the winning ad from each previous test.


Why Split-Test

Split-testing is a proven way to improve the results of your ad campaign, because it uses the data already available in ad network reporting to gauge the impact of small variations to your ads.

As you take the time to improve your ad based on actual data, you should see higher conversion and click-through rates for your campaign overall.

Beyond improving immediate results, the lessons learned from the split-tests you run will inform the direction of your future advertising to the same audience. That information is invaluable, as it elevates your ad decisions right from the start of each new campaign. Of course, no two campaigns are ever the same, but the more tests you run, the wiser you will be.
