Corvus: email redesign

In this case study, I reflect on taking a first-principles approach to redesigning an important user touchpoint, the policyholder email.

We were sending an email to policyholders once a month. What was it trying to accomplish?

A communication audit

This was an email sent to our policyholders after their first security scan, once they had a policy. I wanted to understand the intent behind every element in the email.

The email was trying to accomplish the following:

  • call to action to review 'security report' (primary CTA)

  • educate policyholder on what the security report is

  • educate policyholder on 'vCISO Center'

  • call to action to fill out the 'security questionnaire'

  • educate policyholder on 'Action Center'

  • call to action to speak to service team

As a whole, there were too many things being attempted at once. And visually, it felt like a wall of text. But most importantly, there was no information that the policyholder valued.

Looking at the numbers

Our email engagement was mediocre - a 15.6% open rate, below the industry average. I wanted to understand why, and after looking at the numbers my hypothesis was that it came down to two factors: content and cadence.

One might think, "When it comes to the open rate, doesn't the email subject line have more to do with it than anything else?" That's a valid question - but it would be more applicable if the subject line were always changing. Since we were sending the same automated email with the same subject line every time, we could look at things in a little more detail:

Open rate (first 3 of 11 emails)

As seen above, the evidence suggested that it wasn't so much the subject line that mattered, but rather a diminishing interest in the repeated email being sent to them. We hypothesized that this could be due to a few things:

  • policyholders not finding value in the emails (saw it once, wasn't useful)

  • policyholders becoming "email blind" due to frequency (seen as spam)

In subsequent user interviews, we were able to confirm both.

Adding value (a 'first principle' approach)

First principle: the policyholder email should add value for the policyholder.

When something in a system has lost its way or no longer makes sense, I think it's useful to get back to 'first principles' with the question "what are we trying to do here?"

After several user interviews and revisiting old feedback, it was clear that policyholders wanted to know a few key things during the term of their policy:

  1. What's my Corvus Score?

  2. Am I in good standing for policy renewal?

And ultimately, even the question about the Corvus Score had to do with renewal - policyholders saw it as an indicator of our evaluation of their cybersecurity health, and therefore their readiness for renewal.

So we felt that if we could help policyholders understand these two questions with our email, we would be adding meaningful, immediate value.

Not quite wireframes, not quite documentation - just getting the building blocks down on "paper".

Optimizing email frequency and cadence

The second order of business was to address the cadence and frequency of the policyholder emails.

I explored two factors as a function of frequency, using a Cartesian framework:

  • Impact as a function of frequency

  • Value as a function of frequency

Impact as a function of Frequency

The above relationship, while grossly simplified, illustrates the point that there is likely an inverse relationship, or negative correlation, between the impact an email has on a policyholder and the frequency at which it is sent. The more emails we send, the spammier they become. The clear takeaway: we should be optimizing by reduction.
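The intuition above can be sketched as a toy model. This is purely illustrative: the 1/frequency form is an assumption chosen for the sketch, not something we measured.

```python
# Purely illustrative toy model: per-email impact falling off as send
# frequency rises. The inverse-proportional form is an assumption.

def per_email_impact(emails_per_year: int, k: float = 1.0) -> float:
    """Hypothetical per-email impact, decaying as frequency grows."""
    return k / emails_per_year

# Our two cadences: roughly monthly (11 emails/year) vs. quarterly (3/year).
print(per_email_impact(11))  # lower impact per email
print(per_email_impact(3))   # higher impact per email
```

Under this (assumed) model, each quarterly email carries several times the impact of each monthly one - which is the qualitative point the chart was making.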

Value as a function of Frequency

The second function was interesting to mull over, in that it pointed to another factor or dimension which might matter more than just frequency - the cadence.

Value as a function of Cadence

In the revised version above, we were getting closer to the truth: when it comes to value for the policyholder, we need to consider both the frequency AND the timing of the emails.

Up to this point, we were sending an email once a month. We asked policyholders point-blank about this - they told us it was too much, and they were simply ignoring the emails.

Too much

email sent once a month (11 times in 12 months)

So how often did they want to hear from us? It turned out the sweet spot was about once a quarter.

Much better

email sent once a quarter (3 times in 12 months)

Immediate value, maximum impact

After several user tests and stakeholder reviews, we landed on a new version of the email that was streamlined and to-the-point.

The new email was now renewal-centric, which aligned with what we discovered was most important to policyholders.

And while it gently encouraged policyholders to sign in to the dashboard to see changes from the latest scan, the email was still valuable at a glance - providing all the context policyholders wanted around renewal, and any actions they might need to take to stay in good standing.

Making sure there was value for users who chose to engage with the email passively was an important shift in philosophy from the previous approach of gating valuable information behind a sign-in screen.

Results

Over a year after shipping this redesign, the results were clear:

Open rate comparison (12-month average)

'Old email': 02/2022 - 02/2023, 'New email': 03/2023 - 03/2024

We achieved a whopping 2.67x improvement in average open rates. More importantly, the opens were also translating into improved sign-ins to the dashboard:

Sign-in rate comparison (opened email-to-dashboard)

'Old sign-in rate': 02/2022 - 02/2023, 'New sign-in rate': 03/2023 - 03/2024
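As a quick sanity check on the numbers: the two inputs below (the 15.6% baseline open rate and the 2.67x lift) come from this write-up; the implied post-redesign rate is my own back-of-the-envelope multiplication.

```python
# Back-of-the-envelope: post-redesign open rate implied by the
# 15.6% baseline and the 2.67x lift reported above.
old_open_rate = 0.156   # pre-redesign 12-month average
lift = 2.67             # measured improvement factor
new_open_rate = old_open_rate * lift
print(f"implied new open rate: {new_open_rate:.1%}")  # ~41.7%
```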

Reflection

The results were great - but looking back, what I'm most happy about with this project is that we went back to first principles, asked what the email was actually for, and rebuilt it around the things policyholders valued.