For years we’ve been covering questions around how A/B testing interacts with Google crawling and indexing. Is it cloaking? Can it mess up your indexing? Will it hurt your rankings? All valid questions when you show multiple variations of your web pages to your users (Googlebot included) just to see which variation has a better conversion rate.
John Mueller of Google addressed the question once again in the AMA he did on Reddit. Here is a summary of his recommendations, including his acknowledgment that Google still needs to write up official documentation on this topic:
- We need to write up some A/B testing guidelines
- Treat Googlebot the same as any other user-group that you deal with in your testing
- You shouldn’t special-case Googlebot on its own; that would be considered cloaking (see the bucketing sketch after this list)
- Sometimes Googlebot does fall into one bucket (e.g., if you test by locale, by user-agent hash, or the like); that’s fine
- The pages should be equivalent (e.g., you wouldn’t have “A” be an insurance affiliate and “B” a cartoon series)
- If you have separate URLs, make sure to set the canonical to your primary URL (see the canonical snippet below)
- Googlebot doesn’t store & replay cookies, so make sure you have a fallback for users without cookies.
- A/B testing should have a limited lifetime
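To make the “don’t special-case Googlebot” and cookie-fallback points concrete, here is a minimal TypeScript sketch of hash-based bucketing. The function names and the choice of fallback key are hypothetical, not anything Mueller or Google prescribes; the point is simply that every visitor, crawler or not, is assigned a variant by the same deterministic rule, and cookie-less clients (like Googlebot, which doesn’t store or replay cookies) still get a working page.

```typescript
import { createHash } from "crypto";

type Variant = "A" | "B";

// Assign a variant from a stable key. No user-agent check, no special case
// for crawlers -- Googlebot is bucketed exactly like any other visitor.
function assignVariant(visitorKey: string): Variant {
  const digest = createHash("sha256").update(visitorKey).digest();
  return digest[0] % 2 === 0 ? "A" : "B";
}

// Visitors without a test cookie (Googlebot doesn't replay cookies) fall back
// to a deterministic key instead of getting an error or an empty variant.
function variantForRequest(
  cookieId: string | undefined,
  ip: string,
  userAgent: string
): Variant {
  const key = cookieId ?? `${ip}|${userAgent}`;
  return assignVariant(key);
}
```

Because the fallback key is stable per client, a cookie-less crawler will tend to land in one bucket consistently, which is the situation Mueller describes as fine.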
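And for the separate-URLs case, a short sketch of the canonical recommendation. The URLs and helper are made up for illustration; the idea is that both the primary page and the variant URL declare the primary URL as canonical so Google consolidates signals on one page.

```typescript
// Hypothetical helper: render the canonical tag that both /landing and a
// variant URL such as /landing-b would include in their <head>.
function canonicalTag(primaryUrl: string): string {
  return `<link rel="canonical" href="${primaryUrl}">`;
}

console.log(canonicalTag("https://www.example.com/landing"));
// -> <link rel="canonical" href="https://www.example.com/landing">
```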