If you’re sending your marketing campaigns without the benefit of A/B or multivariate testing — most companies admit to fewer than five tests per month — you are effectively acting as a focus group of one. You are assuming all of your constituents feel the same way about your campaign as you do. Big mistake.
Most of us have at least a bit of familiarity with A/B testing and have integrated it into some of our deployments. Testing subject line A against subject line B is likely the most common test, but A/B testing can take you much further, into variations both simple and complex.
A/B and multivariate testing enable you to learn what makes your prospects, leads, subscribers, and customers tick. When you adopt a consistent testing process, your cumulative results will give you the knowledge to implement dramatic changes that produce a measurable impact across campaigns, landing pages, websites, and all other inbound and outbound initiatives.
We have a client whose singular call to action in every email is to discount their product, and each offer is more valuable than the last. When I asked how well this worked, they admitted that the bigger the discount, the more they sold. When pressed, however, they could not tell me the ROI of this approach. Sure, they sold more widgets, but at the discount level they offered, they also made far less profit.
I suggested an A/B-laden drip campaign offering no discounts, instead providing links to testimonials, case studies, product demos, book-a-meeting links, and other inbound content. In this way, we changed their position from asking for the business to earning the business. While I admit this usually lengthens the sales cycle, it also means money is not left on the table unnecessarily.
For this client, the change in approach was simply too dramatic, and they found they couldn’t stick with it long enough to gather the data needed to make long-term business decisions. The limited data they were able to collect from the first few emails did show, however, that an inbound approach deserved strong consideration by their organization.
Not all A/B testing needs to be this dramatic; we could have started them off with a less-committed approach. My takeaway: you don’t have to learn it all now, and A/B testing can be integrated in a small way. But whether you go all out or run only the occasional test, A/B data is useless if you do not set measurable goals: decide before you send which metric you will track and what result will count as success.
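Part of setting a measurable goal is deciding how you’ll judge whether version B actually beat version A, rather than eyeballing the numbers. As a minimal sketch (the function name and the conversion figures are hypothetical, not from any real campaign), a standard two-proportion z-test can tell you whether a difference in click or conversion rates is likely real or just noise:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates,
    e.g. clicks out of sends for versions A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # combined rate under "no difference"
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120 clicks from 5,000 sends of A, 150 from 5,000 of B.
z, p = two_proportion_z(120, 5000, 150, 5000)
```

With these made-up numbers the p-value lands above the conventional 0.05 threshold, which is exactly the point: a difference that looks meaningful in a dashboard may not yet be statistically trustworthy, and your pre-set goal should say how much evidence is enough.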
If your email application does not support A/B testing, you can use a more manual approach. Simply create two versions of your marketing campaign and divide your list randomly in half — unless, of course, what you’re testing is a segment within your list, such as gender or locale.
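The random split above is easy to do yourself. A minimal sketch (the function name and example addresses are mine, not from any particular email tool) that shuffles a list and cuts it in half:

```python
import random

def split_list(recipients, seed=42):
    """Randomly split a recipient list into two halves for a manual A/B test."""
    shuffled = recipients[:]               # copy, so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # fixed seed makes the split reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # group A gets version A, group B version B

recipients = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_list(recipients)
```

Shuffling before cutting matters: lists exported from a CRM are often sorted by signup date or alphabetically, so taking the top half and bottom half without shuffling would quietly bias the test.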
I’m often in search of information well beyond opens, clicks, and visits, so I turn to Email on Acid for email heat maps and Crazy Egg for landing page and website heat maps. While these are effective on live pages and campaigns, you’re not required to deploy A/B testing to a live audience. Testing can be just as effective with a small focus group; just be sure it’s not a focus group of one.