Popular A/B test mistakes in advertising
Tanel Vetik
Head of strategy
When I first started out in paid ads marketing, I’d test dozens of ad sets and ad variations at the same time with little to no budget per advert. This seemed like a good plan at first, but I later learned that I was making a common beginner mistake known as “The Multiple Comparisons Problem”.
Essentially, it means you’re running too many simultaneous A/B tests (too many different ads and targeting options) on too little data (not enough budget per variant). The more comparisons you make, the more likely it is that at least one variant looks like a winner purely by chance, so the results aren’t statistically reliable and wouldn’t hold up with a larger budget or fewer simultaneous tests.
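You can see the problem in a quick simulation. The sketch below (hypothetical numbers, stdlib Python only) gives every ad variant the exact same true conversion rate, then shows how the “best” variant still looks like a winner when you compare twenty of them on a small sample:

```python
import random

random.seed(42)

TRUE_RATE = 0.02  # every variant converts at exactly 2% -- there is no real winner
VARIANTS = 20     # dozens of ad variations tested at once
CLICKS = 200      # small budget -> only a few hundred clicks per variant

def simulate_variant():
    """Simulate conversions for one ad variant with a tiny sample."""
    conversions = sum(1 for _ in range(CLICKS) if random.random() < TRUE_RATE)
    return conversions / CLICKS

rates = [simulate_variant() for _ in range(VARIANTS)]
best = max(rates)
print(f"true rate: {TRUE_RATE:.1%}, best observed: {best:.1%}")
# With 20 identical variants and only 200 clicks each, the top variant
# often appears meaningfully better than the true rate by luck alone.
```

Run this a few times with different seeds and a “winning” ad keeps showing up, even though every variant is identical by construction. That lucky winner is exactly what a small-budget, many-variant campaign rewards with the bulk of its spend.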
Most marketing managers have been guilty of this at some point; some figure it out sooner than others. The thing is, it costs a lot of money to figure it out. You wouldn’t want to be the one paying the tuition fees for this lesson. Here’s what you should do instead.
The right way to A/B test as a beginner
You’re probably A/B testing something already. You may not have realised it, but advertising platforms (such as Google Ads and Meta Ads) are set up so that your ads are always competing with each other. The winner gets most of your ad spend, and the losers get less than 5% of your daily budget.
If you already have more than one advert per campaign or keyword, then you’re basically running an A/B test. However, it’s your job to learn from the results to build even better adverts in the future. Unless you’ve hired someone to do this, in which case it’s their job to do it for you.
But if you want to go from accidental testing to purposeful A/B tests, then you need to start with these things:
- Write down your hypothesis (e.g., “video ads perform better than carousel format ads”).
- Write down what you want to improve (visuals, copy, cost per lead, conversion rates, etc.).
- Decide on your KPIs (cost per lead, impression share, cost per click, etc.).
- Measure at standardised intervals, such as monthly, quarterly, and yearly.
- Attribute changes to A/B test results to see what you can continue testing and what holds true after multiple rounds of tests.
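Before attributing a change to a test result, it helps to check that the gap between two variants is bigger than chance would produce. Here’s a minimal sketch of a two-proportion z-test in stdlib Python, with hypothetical campaign numbers (this is a standard statistical check, not a feature of any particular ad platform):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate really
    different from variant A's, or could the gap be chance?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant A converted 40/1000 clicks, variant B 62/1000
z, p = two_proportion_z(40, 1000, 62, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If the p-value is below your chosen threshold (0.05 is a common convention), the difference is unlikely to be luck and is worth carrying into the next round of tests. Remember that the more variants you compare at once, the stricter that threshold needs to be.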
If you keep doing this regularly, then you’ll soon have tried-and-tested ways of generating new revenue for your business. And you’ll be able to confidently navigate the changing economic landscape.
The impact of A/B tests on business revenue
Imagine being able to get 10% better results from advertising each quarter. Compounded, that’s roughly 46% better results every year. That’s a lot of growth for the average business. And that’s the impact that professional A/B testing with correct follow-up and implementation delivers.
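The compounding arithmetic is worth spelling out, because quarterly gains multiply rather than add:

```python
# Compounding a 10% quarterly improvement over four quarters
quarterly_gain = 0.10
yearly_gain = (1 + quarterly_gain) ** 4 - 1  # 1.1 * 1.1 * 1.1 * 1.1, minus the baseline
print(f"{yearly_gain:.1%}")  # → 46.4%
```

So four 10% improvements stack up to about 46% over the year, not 40%, because each quarter’s gain builds on the previous one.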
No sane business owner would say no to this, but most are doing just that. They do the same sort of stuff every year and expect things to magically get better on their own. A bit like gambling. Except you’re placing all your bets on the economy and how the market treats you.
Well-executed A/B testing will also tell you where you need to improve your overall business model. In fact, it tells you this very clearly. Numbers don’t lie, and you’ll know for sure where your revenue comes from. Then you can sunset the less profitable stuff and focus your energy on the 20% of projects that bring 80% of your revenue.
Think about the 20% of things that bring you that 80% of income. Could be clients, could be email campaigns, employees, etc. Now imagine you can take the top 20% of those 20%. That’ll be the 4% that gets you 64% of your income.
Now if you could focus all your effort on that 4% and get more of it until it makes up 100% of your work, you’d multiply your business revenue by 16. That’s like taking a business that generates £100,000 per year to one that makes £1,600,000 per year.
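The Pareto arithmetic above checks out in a few lines (using a hypothetical £100,000 baseline):

```python
# Applying the 80/20 rule twice: the top 20% of the top 20% of inputs
top_share = 0.20 ** 2                   # 4% of inputs...
revenue_share = 0.80 ** 2               # ...producing 64% of revenue
multiplier = revenue_share / top_share  # revenue per unit of effort: 16x
baseline = 100_000                      # hypothetical yearly revenue in GBP
print(top_share, revenue_share, round(multiplier), round(baseline * multiplier))
```

That 16× figure is the theoretical ceiling if every unit of effort performed like your top 4%; in practice the point is simply that doubling down on the proven winners beats spreading budget evenly.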
That’s the power of proper A/B testing and figuring out what works, what doesn’t work as well, and what you should stop doing. And it’s the thing you need to master if you ever want to grow your business to 8-figures and beyond.