Bounces, complaints and metrics

In the email delivery space there are a lot of numbers we talk about, including bounce rates, complaint rates, acceptance rates and inbox delivery rates. These are all useful numbers for describing a particular campaign or mailing list, and usually these metrics all track together: low bounce rates and low complaint rates correlate with high delivery rates and high inbox placement.
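For readers who want the concrete arithmetic, here’s a minimal sketch of how these rates are commonly computed. The numbers are made up for illustration, and the denominators are assumptions: providers differ on whether rates are calculated against messages sent or messages accepted.

```python
# Hypothetical campaign numbers, invented for illustration.
sent = 100_000
accepted = 97_000          # messages accepted by the receiving ISPs
bounced = sent - accepted  # rejected or returned messages
complaints = 150           # "this is spam" reports via feedback loops
inboxed = 90_000           # inbox placement estimated from seed accounts

bounce_rate = bounced / sent            # 3.00%
complaint_rate = complaints / accepted  # ~0.15%
acceptance_rate = accepted / sent       # 97.00%
inbox_rate = inboxed / accepted         # ~92.78%

print(f"bounce rate:     {bounce_rate:.2%}")
print(f"complaint rate:  {complaint_rate:.2%}")
print(f"acceptance rate: {acceptance_rate:.2%}")
print(f"inbox delivery:  {inbox_rate:.2%}")
```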

[A totally fake graph I just made up: an example of bounce and complaint metrics correlating with inbox delivery]
But sometimes the numbers lie and there isn’t a clear correlation between the metrics and inbox delivery. In the graph below, the bounce and complaint rates are exactly the same as above, but they no longer correlate with delivery.
[Another graph I just totally made up: inbox delivery is decreasing even as complaints and bounces stay the same]

Why does this happen?

There are a number of different reasons that mail with low complaint and bounce rates will have low inbox delivery rates. Some of them are signs of improper behaviour on the part of the sender; some are simply a consequence of how mail is currently filtered. Whatever the reason, it causes confusion for a lot of senders. To many people, low complaint rates and low bounce rates mean good inbox delivery.
It’s probably partially the fault of delivery experts that so many people draw such a tight mental connection between low complaints and bounces and good inbox delivery. When customers and clients approach us with delivery problems, many of us will look at complaint and bounce rates and advise that both should be lower. As we work with clients to lower those rates, their inbox delivery often improves. So, clearly, low bounce and complaint rates mean higher inbox rates.
The problem, though, is that complaint rates and bounce rates are proxy measurements. They’re used to measure how much a mail is wanted by recipients and how clean the mailing list is. A list with high complaints and bounces is usually a list without much permission behind it. Recipients generally don’t want mail from that sender.
It’s important to remember, though, that complaint and bounce rates don’t directly measure how wanted a particular mail is. We focus on them because they are easy to measure and because they correlate with how wanted an email is. We can use them to give us information, and delivery experts use that information to craft solutions to delivery problems. But we’re not actually fixing complaint rates or bounce rates. Often the things I tell clients don’t directly lower either number. Instead, I focus on fixing the policies and processes that are causing poor delivery. As those things get cleaned up, complaint and bounce rates decrease and inbox rates increase.
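To make the proxy problem concrete, here is a small made-up example in the same spirit as the fake graphs above: two campaigns with identical bounce and complaint rates, where only a direct placement measurement reveals that one list is unhealthy. The numbers are invented.

```python
# Two invented campaigns: the proxy metrics are identical,
# but inbox placement (measured directly, e.g. via seed accounts)
# tells a very different story.
campaigns = {
    "healthy list":   {"sent": 50_000, "bounced": 500, "complaints": 25, "inboxed": 47_000},
    "unhealthy list": {"sent": 50_000, "bounced": 500, "complaints": 25, "inboxed": 31_000},
}

for name, c in campaigns.items():
    accepted = c["sent"] - c["bounced"]
    print(f"{name}: "
          f"bounce {c['bounced'] / c['sent']:.1%}, "
          f"complaint {c['complaints'] / accepted:.1%}, "
          f"inbox {c['inboxed'] / accepted:.1%}")
```

Both campaigns report a 1.0% bounce rate and a 0.1% complaint rate; only the inbox figure (roughly 95% versus 63%) shows the difference.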

Correlation is not causation
The correlation is that good lists and good senders have low bounce rates and low complaint rates. It’s not that they do anything special to address complaints and bounces; rather, everything they focus on keeps their lists healthy. Healthy lists have low bounce and complaint rates.
It’s very possible to have low complaint rates and low bounce rates without having a healthy list. That’s often what’s going on when senders have “great stats” and “zero complaints” but are still seeing poor inbox rates. These senders focus on getting the great stats, because they think it’s the great stats that lead to the good inbox rates. But they have it backwards: good list management, hygiene and engagement lead to good inbox rates. Measuring complaint and bounce rates is just one way to measure list management, hygiene and engagement.
Not every list with good stats has those good stats because of good list management, hygiene and engagement. These rates are fairly easy to manipulate, and some senders spend a lot of money and time manipulating their stats. Manipulating delivery stats did, for a while, result in somewhat better inbox delivery, which is why so many companies spent so much time doing it. ISPs are adapting, though, and that is why we’re seeing senders with “great stats” get poor inbox delivery.

Related Posts

How do unengaged recipients hurt delivery?

In the comments Ulrik asks: “How can unengaged recipients hurt delivery if they aren’t complaining? What feedback mechanism is there to hurt the delivery rate besides that?”
There are a number of things that ISPs are monitoring besides complaint rates, although they are being cautious about revealing what and how they are measuring things. I expect that ISPs are measuring things like:

Read More

Spamtraps are not the problem

Often clients come to me looking for help “removing spamtraps from their list.” They approach me because they’ve found my blog posts, because their ISP or ESP referred them, or because they found my name on Spamhaus’ website. Generally, their first question is: can you tell us the spamtrap addresses on our lists so we can remove them?
My answer is always the same. I cannot provide a list of spamtrap addresses or tell you what addresses to remove. Instead, what I do is help clients work through their email address lists to identify addresses that do not and will not respond to offers. I will also help them identify how those bad addresses were added to the list in the first place.
Spamtraps on a list are not the problem, they’re simply a symptom of the underlying data hygiene problems. Spamtraps are a sign that somehow addresses are getting onto a list without the permission of the address owner. Removing the spamtrap addresses without addressing the underlying flaws in data handling may mean resolving immediate delivery issues, but won’t prevent future problems.
Improving data hygiene, particularly for senders who are having blocking problems due to spamtraps, fixes a lot of the delivery issues. Sure, cleaning out the traps removes the immediate blocking issue, but it does nothing to address any other addresses on the list that were added without permission. In fact, many of my clients have discovered an overall improvement in delivery after addressing the underlying issues that put spamtraps on their lists.
Focusing on removing spamtraps, rather than looking at improving the overall integrity of data, misses the signal that spamtraps are sending.

Read More

20% of email doesn't make it to the inbox

Return Path released their global delivery report for the second half of 2009. To put together the report, they looked at mail delivery to the Mailbox Monitor accounts at 131 different ISPs across 600,000+ sends. In the US, 20% of the email sent by Mailbox Monitor customers to Return Path seed accounts doesn’t make it to the inbox. In fact, 16% of the email just disappears.
I’ve blogged in the past about previous Return Path deliverability studies. The recommendations and comments in those previous posts still apply. Senders must pay attention to engagement, permission, complaints and other policy issues. But none of those things really explain why email is missing.
Why is so much mail disappearing? It doesn’t match the philosophy of the ISPs. Most ISPs do their best to deliver email that they accept, and I don’t really expect that ISPs are starting to hard block so many Return Path customers in the middle of a send. The real clue came from looking at the Yahoo numbers. Yahoo is one of those ISPs that does not delete mail it has accepted, but does slow down senders. Other ISPs are following Yahoo’s lead and using temporary failures as a way to regulate and limit email from senders with poor or inadequate reputations. They aren’t blocking the senders outright, but they are issuing lots of 4xx “come back later” messages.
What is supposed to happen when an ISP issues a 4xx message during the SMTP transaction is that the email should be queued and retried. Modern bulk MTAs (MessageSystems, Port25, Strongmail) allow senders to fine-tune bounce handling and designate how many times an email is retried, even allowing no retries on a temporary failure.
What if the missing mail is a result of senders aggressively handling 4xx messages? Some of the companies I’ve consulted for delete email addresses from mailing lists after two or three 4xx responses. Other companies only retry for 12–24 hours and then treat the email as hard bounced.
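As a rough illustration of where the mail could be going, here is a minimal sketch of the decision a sender’s bounce handling makes for each SMTP reply. The 72-hour retry window and the function itself are invented for this example, not taken from any particular MTA; a policy that gives up after two or three 4xx responses effectively collapses that window to minutes.

```python
import time

MAX_RETRY_WINDOW = 72 * 3600  # assumed policy: retry 4xx for up to 72 hours

def handle_smtp_reply(code: int, first_attempt: float) -> str:
    """Classify one delivery attempt as 'delivered', 'retry' or 'bounce'."""
    if 200 <= code < 300:
        return "delivered"
    if 400 <= code < 500:
        # Temporary failure: the ISP said "come back later".
        # Treating these as permanent is exactly how mail "disappears"
        # before it is ever handed off to the ISP.
        if time.time() - first_attempt < MAX_RETRY_WINDOW:
            return "retry"
        return "bounce"  # retry window exhausted; treat as hard bounce
    return "bounce"      # 5xx: permanent failure, bounce immediately

# A 421 "try again later" two hours into the send gets retried;
# a 550 is bounced right away.
print(handle_smtp_reply(421, time.time() - 2 * 3600))  # -> retry
print(handle_smtp_reply(550, time.time()))             # -> bounce
```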
Return Path is reporting this as a delivery failure, and the tone of the discussion I’m seeing seems to blame ISPs for overly aggressive spam filtering. I don’t think it’s entirely an ISP problem, though. I think it is indicative of poor practices on the part of senders: not just the obvious permission and engagement issues that many senders deal with, but also poor policy on handling bounces. Perhaps the policy is fine, but the implementation doesn’t reflect the stated policy. Maybe they’re relying on defaults from their MTA vendor.
In any case, this is yet another example of how senders are in control of their delivery problems. Better bounce handling for temporary failures would lower the amount of email that never makes it to the ISP. This isn’t sufficient for 100% inbox placement, but if the email is never handed off to the ISP it is impossible for that email to make it to the inbox.

Read More