

Celebrate Unsubscribes case study

by Cyndie Shaffstall

A case study documenting Spider Trainers’ effort to remove inactive leads to improve deliverability and level campaign analytics.

Introduction

Beth Hayden, senior staff writer at CopyBlogger, wrote, “A lot of email marketers take it very personally when people drop off their list. They fret and sweat over every lost reader; but I argue that there are many reasons why you want to celebrate — not mourn — when someone unsubscribes from your list.”

I think Beth is onto something. Not only should you celebrate the loss, you should encourage it.

Many email-automation systems charge you based upon either the number of leads in your list or the number of emails you send. In either case, when you send an email to a prospect who clearly has no interest in your message, you are wasting money or effort, or perhaps both. What’s more, this inactivity could well be affecting your sender reputation and spam scores.

If true, then shouldn’t you ask yourself: At what point are my inactive leads a liability? Is it time to clean house?

In this case study, I’ll discuss a recent reengagement campaign that we deployed at Spider Trainers. The test list was small, just 2,966 leads: 193 were suppressed as system addresses, 336 bounced, and 2,244 actual emails were sent for the first version. Still, I feel the process we implemented is worth a look and may give you ideas for deploying your own reengagement and archiving campaign, whether on a small or large scale.

The campaign

All lists have a percentage of inactive (disinterested) leads, and ours was no different. We have an active marketing list of exactly 7,000 leads. Of those, we found that 2,966 had not interacted with any of our messaging (online or offline) since their names were added to our marketing list, in some cases as long as three years ago. That was alarming to me: nearly half of all of our leads had not opened, clicked, unsubscribed, or engaged in any other manner with our messages. Obviously there was work to be done, but on the upside, a little more than half were moderately to very engaged. Now was the time to either cut the disengaged half loose, figure out how to reengage them, or add them to our drip campaign.

NOTE: Campaign engagement is measured using a beacon, a small graphic that, when an email is opened and its graphics are displayed, sends a notification back to the sending application. Each notification counts toward the open rate.
It is possible that some of these leads had opened a previous message but did not allow graphics to load; in that case, they would not have added to our open rate.
The effective open rate accounts for this by adding clicks made within emails where the beacon was not triggered to the count of opened emails.
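
To make that calculation concrete, here is a minimal sketch of the effective-open-rate arithmetic described above; the function name and the counts are placeholders for illustration, not the output of any particular email-automation application.

  def effective_open_rate(delivered, beacon_opens, clicks_without_beacon):
      # Effective open rate: beacon-tracked opens plus clicks from emails
      # where the tracking graphic never loaded, divided by emails delivered.
      if delivered == 0:
          return 0.0
      return (beacon_opens + clicks_without_beacon) / delivered

  # Hypothetical counts for illustration only.
  print(f"{effective_open_rate(1000, 50, 16):.1%}")  # -> 6.6%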

Features

In order to execute a reengagement campaign effectively, our software needed to support:

  • Segmentation
  • Email automation
  • In-depth analytics

These must-have features would enable us to identify inactive leads, automatically deploy messages, and track engagement, respectively.

As this particular campaign deployed, these same features were used to move newly engaged leads into our drip or nurture campaigns, flag still-inactive leads for archival, and capture any new engagement that did occur.
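
As a rough illustration of how segmentation, automation, and analytics work together here, the sketch below routes each lead by its most recent recorded activity; the record fields, segment names, and three-year threshold are assumptions for illustration, not our platform’s actual schema.

  from datetime import datetime, timedelta

  def route(lead, now=None, inactive_after=timedelta(days=365 * 3)):
      # Assign a lead to a segment based on its most recent recorded engagement.
      now = now or datetime.now()
      activity = [d for d in (lead.get("last_open"), lead.get("last_click")) if d]
      if not activity:
          return "reengagement"        # never opened or clicked: target of this campaign
      if now - max(activity) > inactive_after:
          return "archive"             # engaged once, but too long ago
      return "drip"                    # recent engagement: keep nurturing

  # Hypothetical lead records; a real platform exposes this through segmentation filters.
  leads = [
      {"email": "a@example.com"},
      {"email": "b@example.com", "last_open": datetime(2014, 5, 2)},
  ]
  for lead in leads:
      print(lead["email"], "->", route(lead))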

Message

Our message was simple and was presented in HTML format along with a text version for recipients who prefer that presentation style. While I acknowledge that a business email sent from my Google Apps for Business account likely would have received more opens, without the analytics tracking of our email-automation application I wouldn’t have been much the wiser. Still, it’s worth noting that if analytics aren’t your end goal, just reengagement, you might be better off sending your reengagement message through your company’s individual business accounts.

The email content was constructed in such a way that I felt any opens or clicks would provide a morsel of insight into interest. The text first acknowledged that we were tracking them and had noticed their inactivity. We then allayed concerns about the intent of the email with an empathetic statement, followed by two paragraphs reminding them of our services. To close the loop, we offered an unsubscribe link, a link to the most-popular page on our website (resources page), and finally, a suggestion that if hearing from us by email wasn’t preferable, perhaps they would enjoy our LinkedIn group.

I felt that these three links would effectively disclose their interest in continuing a relationship:

  • No interest (unsubscribe)
  • Renewed interest (resources page)
  • Moderated interest (LinkedIn group)

Depending upon your products or services, these types of links may not be on target and you might wish to consider other, more, or fewer options to better isolate interest levels and types.
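
For anyone curious how that translates into tracking logic, here is a minimal sketch that maps clicked links to the interest levels above; the link identifiers and the mapping are illustrative assumptions, not our application’s configuration.

  # Hypothetical mapping from each tracked link to the interest level it signals.
  INTEREST_BY_LINK = {
      "unsubscribe": "no interest",
      "resources_page": "renewed interest",
      "linkedin_group": "moderated interest",
  }

  def interest_signals(clicked_links):
      # Translate a recipient's clicks into interest signals (may be empty).
      return {INTEREST_BY_LINK[link] for link in clicked_links if link in INTEREST_BY_LINK}

  print(interest_signals(["resources_page"]))  # {'renewed interest'}
  print(interest_signals([]))                  # set(): opened (or not), but clicked nothing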

The design of the email was simple, had few graphics, provided contact information and social sharing, and was consistently branded.

Testing

Practicing what we preach, after six days we created a new segment of those on our inactive list who had still not engaged and resent the message with a new subject line. We use this process with nearly all internal and client emails because we have found that different subject lines resonate with different people and can net more opens.

We typically repeat the process every three days or so, up to five times before we call the campaign complete. This has the potential to convert a campaign such as this one (designed to remove dead leads) into a learning experience about subject lines.
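
If you want to automate that cadence, one way to express the loop is sketched below; the three-day wait and five-round limit mirror the process described above, but the send and engagement hooks are hypothetical rather than any specific platform’s API.

  import time

  def resend_until_engaged(leads, subject_lines, send, has_engaged,
                           wait_days=3, max_rounds=5):
      # Resend to leads who have not yet engaged, rotating subject lines each round.
      remaining = list(leads)
      for round_no in range(max_rounds):
          if not remaining:
              break
          send(remaining, subject_lines[round_no % len(subject_lines)])  # hypothetical send hook
          time.sleep(wait_days * 24 * 3600)      # wait a few days before re-segmenting
          remaining = [lead for lead in remaining if not has_engaged(lead)]
      return remaining                           # still inactive: candidates for archival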

In addition to the dwindling number of sends and the subject-line testing, we also tested the day and time of send. As you can see in the following table, the initial email went out on Wednesday at 3 AM, followed by Tuesday at 2 PM, with the final send on Friday at 6 AM.

The flaw in this approach is that for true A/B testing, we should have created an A/B/C split of the three subject lines and sent each at all three day/time combinations. With our less-structured approach, we don’t know whether the shrinking number of sends, the subject line, or the day/time affected the analytics most.
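
A more structured version of that test might look like the following sketch, which spreads leads across all nine subject-line and send-time combinations; the assignment logic and placeholder names are assumptions for illustration.

  import itertools
  import random

  SUBJECT_LINES = ["Subject A", "Subject B", "Subject C"]          # placeholders
  SEND_SLOTS = ["Wednesday 3 AM", "Tuesday 2 PM", "Friday 6 AM"]

  def assign_cells(leads, seed=1):
      # Randomly distribute leads across the nine subject-line x send-time cells.
      cells = list(itertools.product(SUBJECT_LINES, SEND_SLOTS))
      shuffled = list(leads)
      random.Random(seed).shuffle(shuffled)
      assignment = {cell: [] for cell in cells}
      for i, lead in enumerate(shuffled):
          assignment[cells[i % len(cells)]].append(lead)
      return assignment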

Analytics

Our analytics at about 24 hours after each send are shown in the following table.

At the close of the campaign, 66 people who had not opened a message since they were added to our list had opened one of the emails, and a little more than half of them unsubscribed.

It’s always difficult to draw meaningful conclusions from numbers this small, except to say that we outperformed my expectations across the board. I assumed that we would have zero engagement, given that the leads in this list had been completely inactive for as long as they had been in it. Opt-outs were higher than I expected, simply because I thought opens would be zero and there would therefore be no one to click the unsubscribe link.

The high level of unsubscribes on Friday/Saturday was also interesting. Does this mean that people have more time on their hands on Fridays and use it as an opportunity to clean out their inboxes and spam folders, or is this an indication that they have now received this message (under different subject lines) three times and simply want to stop the flow?

I reached out to Michael Mendoza, CEO at Lineup, for some insight. Michael was a dormant lead in this list, but one who opened the reengagement email (without any additional prompting by us). Michael skipped the first but opened the second email, so I surmised that he opened based upon subject line alone.

“Lineup provides a CRM solution, so I do keep an eye on campaigns launched by our business associates. I always open business emails that I receive from Cyndie since I know her personally, but I tend to skip commercial emails that come from Spider Trainers; these are more appropriate for my marketing department. This email, however, piqued my interest because of the subject line [Are You Feeling Cyberstalked?] and made me wonder about the message within. Once I opened the email, the text indicated they were tracking my lack of interest, and I found it was a marketing email. I did not click any links because, while it’s true that I have been inactive, I still prefer to keep an eye on what Spider Trainers sends out,” said Michael.

The question then becomes, is Michael representative of the names in my list? Well, as it turns out, he is. About 2,000 of our leads are CEOs of companies with whom I’ve developed relationships over the past three decades of my entrepreneurial pursuits. I know that there are many like Michael Mendoza: interested enough in my new endeavors to continue to receive messages from Spider Trainers, but not the appropriate contacts for our marketing efforts. To verify that these inactive names belong to this segment, I would need to vet each name individually, but I don’t think that’s necessary. My goal was to enable the archival of names that were not interested in our messages, and I think I’ve managed to effectively isolate those.

Unsubscribes

Looking back at the table of data, the most interesting statistic was the number of people who chose not to disengage, even though we made it particularly easy to do so. Like Michael, 31 other recipients opened the email, presumably read the message, and chose to do nothing, not even unsubscribe. This segment will be relegated to our drip campaign and continue to receive one message a month. If they choose to interact with any of those messages in any way, they will join the ranks of our nurture list.

When we started this campaign, we had 2,966 inactive leads. Today, after unsubscribes, bounces, and opt-outs, the segment is down to 2,832. We can now either archive these leads, under the assumption that our messages go straight to their spam folders (which affects our spam score), or leave them in the list, assuming we simply haven’t yet written a subject line compelling enough to cause them to engage.

We have chosen to archive the remaining leads in this list and, with that, lower our outlay to our service provider by $200 per month.

Spam and reputation

It has become easier than ever to report spammers. Most email clients include a one-click, report-spam button, which logs a complaint at the ISP level or relays it back to the sender.

Unfortunately, spam complaints do not track why a recipient thinks the message is spam. Our clients or subscribers might simply have forgotten about opting in to our list. We’ve also found spam complaints may increase if we send too often or if we send irrelevant messages. No matter what prompted the complaint, it contributes to a poor sender reputation, and that’s exactly what we are trying to avoid by encouraging unsubscribes. An unsubscribe is much preferable to a spam complaint.

Our sender reputation is also tied to the IP address of the mail server we use. Our email service provider scores that reputation by assigning values to different kinds of email activity; the total of those values provides a ranking for Spider Trainers as a sender and can influence our deliverability rate.
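
Providers don’t publish their formulas, but as a loose, assumed illustration of the kind of weighting involved, a score might combine engagement, bounce, and complaint rates along these lines:

  def reputation_score(sent, opens, hard_bounces, spam_complaints):
      # Toy score: rewards engagement, penalizes bounces and complaints.
      # The weights are invented for illustration; real providers keep theirs private.
      if sent == 0:
          return 0.0
      score = (50
               + 100 * (opens / sent)
               - 500 * (hard_bounces / sent)
               - 2000 * (spam_complaints / sent))
      return max(0.0, min(100.0, score))

  # Hypothetical activity for one send.
  print(reputation_score(sent=1000, opens=150, hard_bounces=5, spam_complaints=1))  # 60.5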

As a final consideration, hard bounces (invalid email addresses) can be nearly as damaging as a spam complaint, so it’s of the utmost importance that we remove those after every send — which we do.

Summary

Was all of this worth the effort? You bet. By removing disinterested parties from my list, I saved $2,400 a year in software expenses, improved our sender reputation by raising the proportion of sent emails that are opened, and reduced the likelihood of gaining a spammer moniker, because the emails I do send are making their way to the inboxes of people who want to receive them.

There are plenty of other reengagement case studies published, and I’ve read a fair number of them, but most are from large companies making grand efforts that I find difficult to replicate with such a small list. I hope that by sharing our efforts, other small companies will realize that list cleansing is not just for the Fortune few. It’s for everyone, on any kind of budget, and it’s a worthwhile exercise.

 
