Why does Klaviyo’s Smart Send Time optimize for open rates?

Christina Dedrick
Klaviyo Engineering
5 min read · Sep 18, 2019

Klaviyo is about making your business grow. We like to measure everything by revenue. But for Smart Send Time, we measure open rate. Why?

Conversions are rare events. A conversion rate of 0.1% for a campaign is fairly typical. Comparing the rates of rare events is extremely difficult because the sample-to-sample variation is huge relative to the rates themselves, so unless sample sizes are gigantic, it is practically impossible to distinguish one conversion rate from another.
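
To make that concrete, here is a rough back-of-the-envelope calculation (an illustration, not Klaviyo production code) using the standard two-proportion z-test sample-size formula. Even distinguishing a 0.10% conversion rate from a 0.15% one, a 50% relative improvement, requires tens of thousands of recipients per variation:

```python
# Rough sketch: recipients needed per variation to tell a 0.10%
# conversion rate apart from 0.15% (alpha = 0.05, power = 0.80),
# using the standard two-proportion z-test sample-size formula.
from scipy.stats import norm

p1, p2 = 0.0010, 0.0015
alpha, power = 0.05, 0.80
z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)

p_bar = (p1 + p2) / 2
n = ((z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
      + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
     / (p1 - p2) ** 2)
print(f"recipients per variation: {n:,.0f}")  # ~78,000
```

With eight send-time variations in an exploratory send, that works out to well over half a million recipients just to rank send times by conversion rate.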

Let’s look at an example from an experiment we performed during the beta for Smart Send Time. We worked with a health and fitness company that wanted to test multiple exploratory sends on different days of the week. They sent two exploratory sends, and we saw that open rate had a smooth, gradually varying trend throughout the day. The trend was nearly identical for both exploratory sends, which had different overall open rates because of their different subject lines. This means that open rates are very predictable between A/B tests, making them ideal for determining the best send time to maximize recipient engagement.
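
As a sketch of how you might verify this yourself (with illustrative file and column names, not Klaviyo’s actual schema), the snippet below builds an hourly open-rate curve per campaign and checks how strongly the two curves track each other:

```python
import pandas as pd

# Hypothetical per-recipient log with columns: campaign_id,
# send_hour (0-23), and opened (0/1). Names are illustrative only.
events = pd.read_csv("exploratory_sends.csv")

hourly = (events.groupby(["campaign_id", "send_hour"])["opened"]
                .mean()                   # open rate per campaign per hour
                .unstack("campaign_id"))  # one column per campaign

# If the daily trend is stable, the two hourly curves should be highly
# correlated even though their absolute levels differ (subject lines).
print(hourly.corr())
```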

Clicks on emails are less common than opens. For click rates, we still see consistency between the two campaigns; however, there is a lot more variability in the trend over time. Because clicks are rare, occurring at approximately 1–1.5% for this set of campaigns, some of the variation in click rate is just statistical noise.
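
You can see how much noise to expect from rates this low with a quick simulation (an illustration, not our production analysis). Here the true click rate is a perfectly flat 1.2%, yet the observed hourly “trend” still bounces around noticeably:

```python
import numpy as np

# Simulate 24 hourly buckets of ~500 recipients each with a flat
# 1.2% true click rate; any apparent trend is pure sampling noise.
rng = np.random.default_rng(seed=0)
clicks = rng.binomial(n=500, p=0.012, size=24)
print(np.round(clicks / 500, 4))
```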

Once we plot conversions, we see that the data is almost entirely noise. There is no trend or consistency between the two sends. To pick the best-converting time, we’d be picking a variation that showed four conversions over one that showed three, or two over one. That is not enough data to pick a send time that will consistently show the highest conversion rate. We might pick a send time that had a high number of conversions not because it was the best time, but because of randomness.
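
Confidence intervals make the point starkly. Here is a sketch, assuming a hypothetical 2,000 recipients per variation, of the 95% Clopper-Pearson intervals for variations that got four and three conversions:

```python
from statsmodels.stats.proportion import proportion_confint

# 95% Clopper-Pearson intervals for 4 vs. 3 conversions out of a
# hypothetical 2,000 recipients per send-time variation.
for conversions in (4, 3):
    lo, hi = proportion_confint(conversions, 2000, alpha=0.05, method="beta")
    print(f"{conversions} conversions: {lo:.2%} to {hi:.2%}")
```

The two intervals overlap almost completely, so “four beat three” tells us essentially nothing about which send time is actually better.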

We needed to prove that open rate could be used as a proxy for conversion rate before basing Smart Send Time on open rates. We did a few checks to make sure it could.

First, we checked that, in general across all campaigns, open rate and conversion rate were correlated. We found that they were. There is a lot of variability (hence the very low R-squared of 0.068), but the trend holds up: the t-statistic on the slope is 67, so the p-value is effectively zero.
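
A minimal sketch of this kind of check, assuming a table with one row per campaign and illustrative open_rate / conversion_rate columns:

```python
import pandas as pd
from scipy.stats import linregress

# Hypothetical campaign-level stats; file and column names illustrative.
campaigns = pd.read_csv("campaign_stats.csv")

res = linregress(campaigns["open_rate"], campaigns["conversion_rate"])
print(f"R^2 = {res.rvalue ** 2:.3f}, "
      f"t = {res.slope / res.stderr:.1f}, p = {res.pvalue:.2g}")
```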

However, checking the overall trend leaves a lot of information out. Since all emails are mixed together, better-performing brands will have both higher open and higher conversion rates. These brands typically send stronger content, have more segmented lists, and sell more desirable products, all of which drive higher interest that manifests as both a higher open rate and a higher conversion rate. To control for differences in content that could change the conversion rate, we’d need to compare emails whose content is identical once opened. Emails where the subject line or send time was changed but the body of the message was identical would be appropriate for assessing whether open rate and conversion rate are correlated. Our Smart Send Time beta emails were a perfect test set for this.

We checked whether all of our Smart Send Time beta testers saw higher conversion rates at their higher open rate times. For each 24-hour exploratory campaign, we plotted the open rate and conversion rate, shown below for our first Beauty + Cosmetics company.

We then found the linear regression of open rate against conversion rate for each company, shown in the table below, and evaluated whether the correlation was positive or negative and whether it was statistically significant. Since many of the companies we worked with sent to small lists with only 8 variations and a handful of conversions, the coefficient relating open rate and conversions was often not significant. Overall, 11 of 15 relationships were positive, with 6 of those 11 showing significance; 3 of 15 were negative, only one of which showed significance. Altogether, we determined that there was enough evidence to show a positive correlation between open rate and conversion rate.
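
A sketch of that per-company tally (again with illustrative file and column names, not our actual schema):

```python
import pandas as pd
from scipy.stats import linregress

# Hypothetical per-variation results from the beta exploratory sends,
# with columns: company_id, open_rate, conversion_rate.
beta = pd.read_csv("beta_exploratory_sends.csv")

rows = []
for company, grp in beta.groupby("company_id"):
    res = linregress(grp["open_rate"], grp["conversion_rate"])
    rows.append((company, res.slope > 0, res.pvalue < 0.05))

positive = [r for r in rows if r[1]]
print(f"{len(positive)}/{len(rows)} positive slopes, "
      f"{sum(r[2] for r in positive)} of those significant at p < 0.05")
```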

We did one additional check before releasing Smart Send Time. Since optimal send times are sometimes in the late afternoon or evening, which is a large change for a lot of brands, we wanted to check that late afternoon and evening hours did not historically have low conversion rates. Two common beliefs are that recipients who receive email in the morning have all day to act and convert, and that recipients are more willing to buy in the morning. However, the data showed that morning campaigns were no more successful in terms of conversion rate than campaigns sent at other times of day. Looking at historical campaigns, we plotted the conversion rate by hour of sending, shown below. Overall, the conversion rate is fairly constant throughout the day, with a slight increase in the evening. So when Smart Send Time shifts send times later in the day for certain companies, it’s guiding them to send at more lucrative times of day.
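
A sketch of that historical check, assuming one row per campaign with a send timestamp and a conversion_rate column (names illustrative):

```python
import pandas as pd

# Hypothetical historical campaign stats; names are illustrative.
history = pd.read_csv("historical_campaigns.csv", parse_dates=["sent_at"])

by_hour = (history.groupby(history["sent_at"].dt.hour)["conversion_rate"]
                  .mean())
print(by_hour)  # expect a fairly flat curve, slightly higher in the evening
```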

Smart Send Time provided customers with a +10% lift in open rates. Variation in conversion rates is too large to support reliable inferences. Had we based Smart Send Time on conversion rates, we’d have found optimal send times that were not consistent between sends, since much of the variation in conversion rates is statistical noise. We’d be overfitting to times with just a few more conversions instead of fitting a real trend. To base our decisions on the highest quality data, we base all the decisions in Smart Send Time on open rates. We are confident that higher conversion rates will be correlated with these higher open rates.
