The data are what they are

I’ve had a lot less opportunity to blog at the recent M3AAWG conference than I expected. Some of that is down to the great content and conversations. Another piece is lack of time and focus to edit and refine a longer post prompted by the conference. The final issue is the confidential nature of what we talk about.
With that said, I can talk about a discussion I had with different folks about the A/B testing blog post from Mailchimp. The whole post is worth a quick read, but the short version is that when you’re doing A/B testing, design the test so you’re testing the relevant outcome. If you’re looking for the best whatever to get engagement, then your outcome should be engagement. If you’re looking for the best thing to improve revenue, then test for revenue.
Of course, this makes perfect sense. If you do a test, the test should measure the outcome you want. Using a test that looks at engagement and hoping that translates to revenue is no better than just picking one option at random.
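To make the idea concrete, here is a minimal sketch of testing the outcome you actually care about. The revenue numbers, variant names, and sample sizes are all hypothetical; the point is simply that the metric being compared is revenue per recipient, not opens. A permutation test (stdlib only, no external libraries) works fine for this kind of comparison:

```python
import random

def permutation_test(a, b, n_permutations=10_000, seed=42):
    """Two-sample permutation test on the difference in means.

    Returns the observed difference (mean(a) - mean(b)) and an
    approximate two-sided p-value: the fraction of random relabelings
    that produce a difference at least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        perm_a = pooled[:len(a)]
        perm_b = pooled[len(a):]
        diff = sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b)
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_permutations

# Hypothetical revenue per recipient (in dollars) for two subject lines.
# Most recipients spend nothing, which is typical for email campaigns.
revenue_a = [0.0, 0.0, 1.2, 0.0, 3.5, 0.0, 0.0, 2.1, 0.0, 0.0]
revenue_b = [0.0, 0.9, 0.0, 0.0, 0.0, 1.1, 0.0, 0.0, 0.0, 0.0]

diff, p_value = permutation_test(revenue_a, revenue_b)
```

If the question is “which subject line makes more money,” this is the comparison to run; swapping in open counts for the revenue lists would answer a different question entirely.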
That particular blog post garnered a round of discussion in another forum where folks disagreed with the data. To listen to the posters, the data had to be wrong because it doesn’t conform to “common wisdom.” The fact that data doesn’t conform to common wisdom doesn’t make that data wrong. The data is the data. It may not answer the question the researcher thought they were asking. It may not conform to common wisdom. But barring fraud or massive collection error, the data are always the data. I give Mailchimp the benefit of the doubt when it comes to how they collect data, as I know they have a number of data scientists on staff. I’ve also talked with various employees about digging into their data.
At the same time the online discussion of the Mailchimp data was happening, there was a similar discussion happening at the conference. A group of researchers got together to ask a question. They did their literature review, they stated their hypothesis, they designed the tests, they ran the tests. Unfortunately, despite this all being done well, the data showed that their test condition had no effect. The data were negative. They asked the question a different way, still negative. They asked a third way and still saw no difference between the controls and the test.
They presented this data at the conference. Well, this data went against common wisdom, too, and many of the session participants challenged it. Not because it was collected badly (it wasn’t), but because they wanted it to say something else. It was the conference session equivalent of data dredging or p-hacking.
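A small simulation shows why hunting through results until something looks significant is so dangerous. All numbers here are made up for illustration: two “variants” with identical true conversion rates are compared 200 times, and even though there is no real difference, roughly 5% of comparisons will look “significant” at p < 0.05 purely by chance. Run enough comparisons and something will always appear to confirm the common wisdom:

```python
import math
import random

def two_proportion_z_pvalue(success_a, n_a, success_b, n_b):
    """Two-sided p-value for a difference in proportions (normal approximation)."""
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (success_a / n_a - success_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))

rng = random.Random(0)
n_tests, n_per_group, true_rate = 200, 1000, 0.2

# Both groups draw from the SAME conversion rate: any "significant"
# result below is a false positive.
false_positives = 0
for _ in range(n_tests):
    a = sum(rng.random() < true_rate for _ in range(n_per_group))
    b = sum(rng.random() < true_rate for _ in range(n_per_group))
    if two_proportion_z_pvalue(a, n_per_group, b, n_per_group) < 0.05:
        false_positives += 1
```

Around ten of the two hundred null comparisons will cross the significance threshold, which is exactly what challenging well-collected negative data until it “says something” amounts to.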

Overall, the data collected in any test, from a simple marketing A/B test through to a phase III clinical trial, is the answer to the question you asked. But just having the data doesn’t always make the next step clear. Sometimes the question you asked isn’t what you tested. That doesn’t mean you can retroactively find signal in the noise.
Mailchimp’s research shows that A/B testing for open rates doesn’t have any effect on revenue. If your final goal is to know which copy or subject line makes more revenue, then you need to test for revenue. No amount of arguing is going to change that data.

Related Posts

My panels from #EEC16

I had the privilege to be a part of two panels at EEC16, with some of the best folks in the business.
The first panel was “Everything You Ever Wanted to Know About Deliverability, but Were Afraid to Ask.”
We had a lot of great audience questions.
The first question, which was awesome (and I don’t think planted) was: “What is the most important thing we can do to improve our deliverability?”
All of us had really similar answers: pay attention to your data and your acquisition. Deliverability starts with your data: good data = good deliverability, poor data = poor deliverability. How you acquire addresses is vital to any email program.
I’ve had dozens of sales calls with potential clients over the years. Most of them tell me lots of stuff about their marketing program. I hear details of engagement, data hygiene, response rates, CTRs, bounce handling. But very, very few people spontaneously tell me how they’re acquiring addresses. That’s so backwards. Start with acquiring addresses the right way. Deliverability is all in the acquisition step. Of course, you need to nurture and care for those subscribers, send the right message at the right time, and all the good things we talk about. None of that matters if you don’t start with good data.
Another question was about spamtraps. The panel had me take this one. I’ve written extensively about spamtraps and what they do and what they mean. The important thing to remember, though, is that a spamtrap is a signal. If you have spamtraps on your list, then there is a problem with your data acquisition. Somehow, people are getting addresses that do not belong to them on the list.
Spamtraps are a problem, but not for the reasons many people think. Folks get upset when their mail is blocked because of spamtraps. Blocking isn’t the only damage, though. Every spamtrap on a list is one less responsive address. It’s one customer you are not reaching. If there are spamtraps on a list, it’s likely there are deliverable addresses that don’t belong to your customers, too. Those recipients are going to view that mail as spam. They didn’t sign up, they didn’t ask for it, they don’t want it. They’re going to complain, hurting your reputation. Too many of these recipients and delivery will suffer.
Spamtraps are a warning that something is wrong. That something is usually your data acquisition process.
Questions went on through the session and ranged from how to get mail to B2B inboxes to whether there is value in certification. We also had some insightful questions about authentication.
The second panel I was on was the closing keynote panel: “ISP Postmasters & Blacklist Operators: Defending Consumer Inboxes.” This was where I got to show my incoming mail chops a bit. I was a last-minute fill-in for the panel and I am honored that Dennis and Len thought I could represent the incoming mail folks. It’s not like I’m out there writing filters, but I do pay attention to what the filter operators are saying and doing.
I think it is important for marketers to get a feel for what’s really going on at the ISPs. They aren’t trying to stop real mail, they’re trying to stop malicious mail. Matt from Comcast talked a lot about how marketers and ISPs share customers, and how the ISPs are trying to keep those customers safe and happy. Jaren discussed some of the decision-making processes his company goes through when deciding whether to err on the side of letting spam through or filtering good mail. Tom discussed how his blocklist works with some brands to help stop phishing attacks against those brands.
Overall, I think the session was a great success. The conference was great and I am looking forward to going back next year.
Were you at either panel? What did you think?

Read More

Indictments in Yahoo data breach

Today the US government unsealed an indictment against two Russian agents and two hackers for breaking into Yahoo’s servers and stealing personal information. The information gathered during the hack was used to target government officials, security employees, and private individuals.
Email is so central to our online identity. Compromise an email account and you can get access to social media and other accounts. Email is the key to the kingdom.

Read More

Edison acquires part of Return Path

Today Matt Blumberg announced that Edison Software acquired Return Path’s Consumer Insight division, current customers and some Return Path staff.
Congrats to everyone involved.

Read More