SEO Is Not That Hard

When is A/B Testing NOT A/B Testing?

Edd Dawson Season 1 Episode 189

When is A/B testing not actually A/B testing? Discover the truth behind this common misconception and learn how to accurately measure the impact of your website changes. Join me, Edd Dawson, as I unravel the intricacies of genuine A/B testing versus basic before-and-after analysis. You'll gain insights into why simply comparing performance metrics before and after making changes doesn't qualify as true A/B testing, despite what some tools claim. Uncover the pitfalls of overlooking external factors like seasonality, traffic spikes, and algorithm updates, which can skew your results.

As we navigate this episode, I'll guide you through the nuances of proper A/B testing, or split testing, where two versions of a webpage are shown to different audience segments simultaneously. This approach offers a clearer picture of how your modifications truly impact user interaction and engagement. By understanding these distinctions, you'll elevate your SEO strategies and content optimization efforts, ensuring you're not just chasing correlations but actual causations. Tune in and refine your approach to achieving tangible results in the digital landscape.

SEO Is Not That Hard is hosted by Edd Dawson and brought to you by KeywordsPeopleUse.com

You can get your free copy of my 101 Quick SEO Tips at: https://seotips.edddawson.com/101-quick-seo-tips

To get a personal no-obligation demo of how KeywordsPeopleUse could help you boost your SEO, and a 7-day FREE trial of our Standard Plan, book a demo with me now

See Edd's personal site at edddawson.com

Ask me a question and get on the show: Click here to record a question

Find Edd on LinkedIn, Bluesky & Twitter

Find KeywordsPeopleUse on Twitter @kwds_ppl_use

"Werq" Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/

Speaker 1:

Hello and welcome to SEO Is Not That Hard. I'm your host, Edd Dawson, the founder of the SEO intelligence platform KeywordsPeopleUse.com, where we help you discover the questions people ask online and learn how to optimize your content to build traffic and authority. I've been in SEO and online marketing for over 20 years and I'm here to share the wealth of knowledge, hints and tips I've amassed over that time. Hi, welcome back to SEO Is Not That Hard. It's me here, Edd Dawson, hosting as usual, and today I wanted to pose a question: when is A/B testing not A/B testing? Now, this has come up because there's a tool out there which has started to claim that it can do A/B testing. If you're using this tool, you can go to it and say "I started a test on this day, I changed some content on this date", and it will then monitor your traffic, I think via Google Search Console, from before and after that change: how your content performed before you made the change, and how it performed after. And they're calling this an A/B test. Now, I just want to cover why it's not an A/B test; it's actually before and after testing. Some people out there don't appreciate the difference between the two, and I think it's important to know the difference. Now, I'm not saying that what that tool is doing is bad. We do much the same thing in Keywords People Use, but we don't call it A/B testing, because it's not A/B testing. It is before and after testing.

Speaker 1:

So let's start with before and after testing. This is probably the easier of the two methods to understand, and certainly to implement. Essentially, you make a change to your web page, you update the headline, redesign a button or make any other kind of change to that page, and then you compare performance metrics, like click-through rate, conversions or the number of clicks, from before the change against those from after the change. So, for example, let's say you change a page, adding some text and an image, and you notice that the following week your conversions increased by 20%. Which is great, right? Well, possibly.
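
To make that concrete, here's a minimal sketch of what a before and after comparison boils down to, assuming you've exported daily clicks and conversions for the page into a CSV; the file name and column names here are hypothetical:

```python
import csv
from datetime import date

# Hypothetical date the page change went live
CHANGE_DATE = date(2024, 6, 1)

def load_daily_metrics(path):
    """Read (date, clicks, conversions) rows from a hypothetical CSV export."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append((date.fromisoformat(row["date"]),
                         int(row["clicks"]),
                         int(row["conversions"])))
    return rows

def conversion_rate(rows):
    """Total conversions divided by total clicks for a set of daily rows."""
    clicks = sum(c for _, c, _ in rows)
    conversions = sum(v for _, _, v in rows)
    return conversions / clicks if clicks else 0.0

rows = load_daily_metrics("page_metrics.csv")  # hypothetical export file
before = [r for r in rows if r[0] < CHANGE_DATE]
after = [r for r in rows if r[0] >= CHANGE_DATE]
print(f"before: {conversion_rate(before):.2%}, after: {conversion_rate(after):.2%}")
```

Notice that nothing in this comparison controls for what else was happening in those two time windows, which is exactly the problem discussed next.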

Speaker 1:

But the problem with before and after testing is that it doesn't account for external factors that could influence your results. Here are a few of those factors. Seasonality: your "after" data might be collected during a holiday period or a major event in your industry. Traffic changes: you could have run an ad campaign, or experienced a spike in organic traffic for a completely unrelated reason, like someone linking to you or you featuring in a newsworthy item, something like that. Or an algorithm update: Google has rolled out a core update, or one of its many other algorithm updates, that's affected how that piece of content ranks, completely unrelated to any change you made on the site.

Speaker 1:

So before and after testing assumes that the only variable that impacts your results is the change you made. In reality, the digital landscape is so full of noise, so full of variables other than the content or design of the page, that it's only a very, very basic indicator of whether you've done well or not. You can't give all the weight and the glory, essentially, to the changes you made. So let's look at A/B testing, which is very different. It's sometimes called split testing. In an A/B test you create two versions of the web page. I think I've done a podcast all about A/B testing before, so if you want to learn more, just have a search through the back catalogue. But essentially you create two versions of a page: a version A, which is the original, what the page already had, and a version B, the variation with the new changes in. You then show these versions to different segments of your audience at the same time. With A/B testing, because you've got two versions running simultaneously, you can essentially work out whether the change you made has a positive effect on how people interact with the page. Now, the trouble comes when you want to ask whether the change has given you a positive ranking boost. That's impossible to do at the level of an individual page, because Google will only rank one version of a page at a time. It won't rank two different versions and show both in the search engine results for you.
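
As a rough illustration of the "different segments simultaneously" idea, here's one common way traffic gets split: hash a stable visitor ID so each visitor is deterministically assigned to version A or B. This is a sketch with hypothetical names; real split-testing tools handle this, plus the analytics, for you:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "headline-test") -> str:
    """Deterministically bucket a visitor into version A or B.

    Hashing (experiment name + visitor ID) gives a stable, roughly 50/50
    split: the same visitor always sees the same version of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A few hypothetical visitor IDs, e.g. taken from a first-party cookie
for vid in ["visitor-1", "visitor-2", "visitor-3", "visitor-4"]:
    print(vid, "->", assign_variant(vid))
```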

Speaker 1:

Now, there is a way around this. If you've got a site built on templates with lots of common pages, for example an e-commerce site with lots of pages that are all product landing pages, you can leave half of your pages on the current template and move half of your pages to a new template, then wait for Google to index the ones with the new template and see how they perform as a group against the control group that you didn't change. You can do that, and there are some tools that help you do it, one being SearchPilot, but it's hugely expensive; you're talking, I think, tens of thousands a month minimum spend, and it's quite complex to do, not a simple thing at all. So you can do that, but again, there are still variables in there that you can't completely control for, because you might have some pages in one set that have better external backlinks than the ones in the other set, and that kind of thing can skew it.
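
The core step in that template-level approach is randomly splitting comparable pages into a control group and a variant group. A minimal sketch, with a hypothetical list of product URLs (tools like SearchPilot do this with far more statistical care):

```python
import random

# Hypothetical product landing pages that all share one template
product_pages = [f"/products/item-{i}" for i in range(1, 201)]

random.seed(42)  # fixed seed so the split is reproducible
random.shuffle(product_pages)

midpoint = len(product_pages) // 2
control = product_pages[:midpoint]   # stay on the current template
variant = product_pages[midpoint:]   # get the new template

print(f"{len(control)} control pages, {len(variant)} variant pages")
```

Random assignment helps average out page-level differences like backlink profiles across the two groups, though, as noted above, it can't remove them entirely.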

Speaker 1:

But the essential thing about A/B testing is that the test happens over the same time period against two different versions, so you can remove all the other variables like seasonality and things like that. So, in terms of the key differences between before and after testing and A/B testing: first, timing. Before and after testing happens sequentially, one period after another: you make a change, then compare the before and after data. A/B testing happens simultaneously, which eliminates a whole group of external variables. Control groups: before and after testing has no control group, so you're comparing apples to oranges, different time periods, different conditions, whereas A/B testing includes a control group, version A, which allows for a direct comparison under the same conditions. Reliability of results: before and after testing is prone to false positives and false negatives, because it doesn't account for the other factors influencing your metrics, whereas A/B testing provides statistically significant results, assuming you've got a large enough sample size and you run the test for long enough.
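
To put a number on "statistically significant", here's a sketch of a standard two-proportion z-test on hypothetical conversion counts; as a rule of thumb, an absolute z above about 1.96 corresponds to roughly 95% confidence that the difference isn't just noise:

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: A converted 200 of 5000 visitors, B converted 250 of 5000
z = two_proportion_z(200, 5000, 250, 5000)
print(f"z = {z:.2f}")  # about 2.41 here, so significant at the 95% level
```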

Speaker 1:

Complexity: before and after testing is very easy to set up. No special tools are required; just make the change, publish it, then look at your analytics from before and after and see how they perform over a time period. A/B testing requires special tools. There used to be Google Optimize, but that's been sunset, so you can't use it anymore; now there are things like Optimizely or VWO to manage those split tests, split the traffic and analyse the results. But there's a cost, both in terms of setup and in terms of running and operating those tools. So when should you use each?

Speaker 1:

Before and after testing is actually quite useful. You might not be able to split traffic for technical or logistical reasons. You might be testing something really obvious, a big, dramatic change like launching an entire website redesign or changing a pricing model, those kinds of things. Or you just need a rough idea of performance and don't require statistical rigor. For example, if your website isn't getting much traffic, then setting up an A/B test probably isn't going to be worth the effort, because the sample size is going to be too small to produce reliable results. A/B testing is really good when you need precise, reliable insights into how specific changes impact user behavior. If you're testing incremental changes, like tweaking buttons and headlines, especially when you're tweaking to try and improve conversion rates, it works really well, provided you have sufficient traffic to split between the two versions and get meaningful statistical data. So before and after is great.
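
On the "sufficient traffic" point, you can sanity-check whether an A/B test is even feasible with a standard sample-size estimate. This sketch uses the usual two-proportion formula at roughly 95% confidence and 80% power; the baseline rate and target uplift are hypothetical:

```python
from math import ceil, sqrt

def sample_size_per_variant(baseline: float, rel_uplift: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed per variant to detect a relative uplift.

    Standard two-proportion formula at ~95% confidence (z_alpha) and
    ~80% power (z_beta); baseline and rel_uplift are fractions.
    """
    p1 = baseline
    p2 = baseline * (1 + rel_uplift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical: 4% baseline conversion rate, detecting a 25% relative uplift
print(sample_size_per_variant(0.04, 0.25), "visitors needed per variant")
```

With these numbers it comes out at roughly 6,700 visitors per variant, which is why low-traffic sites often have to fall back on before and after testing.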

Speaker 1:

I don't want to knock it. As I say, we use it in Keywords People Use with our Google Search Console integration, when we're making changes to content. We basically use before and after, because a lot of the time you're changing a piece of content that isn't templatized; it's not hundreds of pages sharing that piece of content, and all you can really do is a before and after test to see if the graph goes up or down after making your changes. And you have to apply your own knowledge about when Google core updates have happened, what your seasonality is, things like that. You've got to overlay that yourself in your mental model when you look at it. It's a completely fine thing to do, but I wouldn't call it A/B testing, because it's not A/B testing. It's something very different.

Speaker 1:

Now, with both methods there are certain pitfalls you want to avoid. First of all, don't rush to conclusions. In before and after testing especially, it's tempting to credit any performance improvement to your change but, like I say, correlation doesn't always mean causation. So make sure you think about what else was occurring at the time, like the factors we mentioned earlier. In A/B testing, don't stop your test too soon. You need to let tests run long enough to reach statistical significance, otherwise you're still just guessing. Then there's testing too many changes at once.

Speaker 1:

Whether you're doing before and after or A/B testing, keep it simple. If you test too many changes simultaneously, you'll never know which one drove the results. And don't ignore audience segmentation. In A/B testing, you've got to make sure that your split is random and representative of your audience. Otherwise, if you're sending people coming from one platform, say social media, to one version, and people coming from organic search to the other, then those people are arriving with very different intents and from very different starting points. If you don't randomize them and mix them up, you can skew the results. So if you've got questions about testing, how to test and how to analyze things, do just get in touch and ask; I'm always here to help. And if you found this show useful, if you found value, then please do consider giving us a review, it really helps, and subscribing; again, that really helps, so we can show up in your feed every week. And that's it for today. So until next time, keep optimizing, stay curious and remember: SEO is not that hard when you understand the basics. Thanks for listening. It means a lot to me.

Speaker 1:

This is where I get to remind you where you can connect with me and my SEO tools and services. You can find all the links I mention here in the show notes. Just remember, in all these places where I use my name, Edd is spelled with two Ds. You can find me on LinkedIn and Bluesky; just search for Edd Dawson on both.

Speaker 1:

You can record a voice question to get answered on the podcast; the link is in the show notes. You can try our SEO intelligence platform Keywords People Use at KeywordsPeopleUse.com, where we can help you discover the questions and keywords people are asking online, sort those questions and keywords into related groups so you know what content you need to build topical authority, and finally, connect your Google Search Console account for your sites so we can crawl and understand your actual content, find what keywords you rank for and then help you optimize and continually refine your content with targeted, personalized advice to keep your traffic growing. If you're interested in learning more about me personally or you're looking for dedicated consulting advice, then visit edddawson.com. Bye for now, and see you in the next episode of SEO Is Not That Hard.
