Reporting on Support Driven Growth

According to Forrester research, companies that invest in and focus on customer experience have 5 times the revenue of companies that don’t. Everyone seems to know that investing in customer experience is critical to growing a healthy business, but very few leaders know how to invest in customer experience, and very few support leaders know how to make a case for it.

The biggest hurdle, expressed by the C-suite and support managers alike, is that it’s too challenging to quantify the success of investing in customer experience, making the ROI too nebulous to inspire action. But that’s just not the case. The problem isn’t that the ROI is hard to see; it’s that we aren’t used to looking at support metrics in a way that ties back to key business metrics.

Drawing the line from support to revenue means tying support to some standard high-level revenue metrics: trial to paid conversion, MRR/ARR, average revenue per customer, customer LTV, retention rate, expansion, and churn. Specifically, it means measuring how the variable of support engagement affects each of those key metrics.

Defining Support Engagement

“Support engaged” means a customer has interacted with your support team on any channel: email, chat, phone, social, or any other trendy channel that pops up. You can look at each channel individually, or you can bundle them into one group.

The benefit of looking at channels individually is that you can start to measure whether certain channels are more impactful to the business than others, but it’s also incredibly tedious. And if you don’t already have a BI tool like Looker that can mix and match support engagement metrics with other business metrics in beautiful charts, you’ll likely be starting this project by hand. With that in mind, bundling channels together into one “support engaged” metric is a fine place to start.
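
If you’re starting by hand, even a short script can do the bundling. Here’s a minimal sketch in Python with pandas, assuming you can export a conversation list (with customer_id and channel columns) from your help desk and a customer list from your billing system; every file and column name here is a hypothetical stand-in for your own data.

    import pandas as pd

    # Hypothetical exports: all file and column names are assumptions.
    conversations = pd.read_csv("conversations.csv")  # customer_id, channel, created_at
    customers = pd.read_csv("customers.csv")          # customer_id, signup_date, revenue columns...

    # Option A: one flag per channel, if you want to compare channels individually.
    per_channel = (
        conversations.assign(engaged=1)
        .pivot_table(index="customer_id", columns="channel",
                     values="engaged", aggfunc="max", fill_value=0)
        .astype(bool)
    )

    # Option B: bundle every channel into a single "support engaged" flag.
    customers["support_engaged"] = customers["customer_id"].isin(conversations["customer_id"])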

Breaking down metrics by Support Engaged

Reporting on support is similar to reporting on traditional revenue-driving roles like sales or success, but the focus shifts from individual performance to collective performance.

Once you’ve defined and/or bundled your “Support Engaged” segment, you’ll want to look at their engagement in a few different ways: 

  1. How many customers, prospects, or trials engaged with support (and what percentage is that of total customers, prospects, or trials)

  2. Of the customers who engaged with support, what is their:

    • Trial to paid conversion: This metric is specific to businesses with a self-serve trial. It could also manifest as freemium to premium conversion, or even time in checkout or cart before purchase. 

    • MRR and ARR: Monthly or annual recurring revenue, relevant for businesses with a recurring revenue model. If you don’t have recurring revenue, this metric is less relevant.

    • Average revenue per customer: How much is each customer spending, plain and simple.

    • LTV: The total amount of money the customer has spent with your business over their lifetime.

    • Retention: More specific to businesses with recurring revenue models: how long does a customer stay a customer? For businesses without a recurring revenue model, it can look like repeat customers instead.

    • Expansion: How much more is a customer spending with you over time, or do they stay the same size and value as when they started? For example, customers that move to a higher-tiered plan.

    • Churn: The inverse of retention. What customers are you losing?

You’ll want to look at all of these vs. customers who did not engage with support. Depending on your business, that other subset might be considered purely self-serve or transactional customers. If you really want to break down support’s relationship to revenue, you can also look at how much these self-serve and transactional customers consume support-generated resources (like knowledge base articles or videos), but that’s a huge project and not quite necessary for a starting point.
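
As a rough illustration of that side-by-side comparison, the sketch below continues the hypothetical customers table from the earlier snippet; the outcome columns (converted, mrr, ltv, churned, expansion_mrr) are assumptions standing in for whatever your billing system can export.

    # What share of customers engaged with support at all?
    print(customers["support_engaged"].mean())

    # The same revenue metrics, split by support engaged vs. not.
    comparison = customers.groupby("support_engaged").agg(
        customers=("customer_id", "count"),
        trial_to_paid=("converted", "mean"),   # conversion rate
        avg_mrr=("mrr", "mean"),
        avg_ltv=("ltv", "mean"),
        churn_rate=("churned", "mean"),
        avg_expansion=("expansion_mrr", "mean"),
    )
    print(comparison)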

Another way to slice this data is the inverse. Of the total customers:

  • that converted from trial to paid

  • with the highest lifetime value

  • that expanded the most

  • that churned

How many, and what percentage, of these customers engaged with support? Before long, a story starts to emerge about how much support touch intersects with revenue.
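
Sketched as code, the inverse slice looks something like this: within each outcome group, what share of customers engaged with support? The column names are the same assumptions carried over from the earlier snippets.

    # Of the customers with a given outcome, how many touched support?
    segments = {
        "converted from trial": customers["converted"],
        "top 10% lifetime value": customers["ltv"] >= customers["ltv"].quantile(0.9),
        "churned": customers["churned"],
    }
    for name, mask in segments.items():
        subset = customers[mask]
        print(f"{name}: {len(subset)} customers, "
              f"{subset['support_engaged'].mean():.0%} support engaged")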

Blending with Customer Experience KPIs

Now that you have started measuring how support engagement affects vital business metrics, you can measure how changes in support performance affect those same metrics. In other words, you can start blending revenue metrics with typical customer experience KPIs like response time, replies to resolve, and customer satisfaction to see how improvements or lapses in performance affect revenue metrics.

You can start asking (and answering) questions like this:

Do customers that give “Great” satisfaction ratings stay customers longer?

If so, it may be worth investing in improving customer satisfaction if you aren’t seeing a high volume of those great ratings.

Are customers more likely to convert if they receive faster replies during the trial phase?

If so, you may want to work on improving operational efficiency to make speed of service more attainable. 
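
Both questions can be answered from the same hypothetical table, assuming you’ve joined on per-customer experience numbers such as an average satisfaction rating, first response time during trial, and tenure; those column names are illustrative, not a standard export.

    engaged = customers[customers["support_engaged"]]

    # Do customers who rate us "Great" stay longer? (median tenure in days by rating)
    print(engaged.groupby("avg_csat")["tenure_days"].median())

    # Are faster replies during trial associated with higher conversion?
    speed = pd.cut(
        engaged["trial_first_response_minutes"],
        bins=[0, 60, 240, 1440, float("inf")],
        labels=["<1h", "1-4h", "4-24h", ">24h"],
    )
    print(engaged.groupby(speed, observed=True)["converted"].mean())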

Interpreting the data

When it comes to making this data actionable, you’re looking for discrepancies. Each revenue metric is tied to a different aspect of the customer experience, so areas that are performing poorly next to areas that are performing well are a good indicator of friction you can experiment with your team to iron out.

For example, maybe you’re finding that customers that engage with support during trial don’t buy, but those that engage with support later in their customer lifecycle stay happy and grow. It could indicate your team needs help with onboarding-type questions, and that not everyone is clear on the ins and outs of troubleshooting problems during initial setup.

Or the opposite! Maybe you find that customers that engage with your team during trial buy (wahoo!) but then churn out after a while. It could be that your team needs help driving further adoption and value of the product in their interactions, or that your existing customers aren’t getting the same treatment as those in the buying process.

There will always be some doubt, since correlation does not equal causation: of course more engaged customers are more invested in the product, and therefore more likely to buy or stick around. The most important thing about tying these metrics to support is that it gives you a baseline, both to experiment against and to monitor for changes. For example, if your trial to paid conversion drops after a team member leaves and your support team has been over-extended and understaffed, you can now point to how the longer reply times directly affect the business’s bottom line. Likewise, if you experiment with having a team member dedicated to more 1:1 onboarding and setup help and find that trial to paid conversion goes up as a result, you can make a case to staff that role. It’s a powerful lever for securing necessary resources. During my tenure at Help Scout, for example, we were able to see a dramatic rise in “support engaged” revenue connected to experiments with proactive engagement and practicing Support Driven Growth tactics in the queue.

Self-serve revenue vs. Support Engaged revenue at Help Scout over time.

Without teasing out support engaged revenue, that increase might have been attributed to marketing or product enhancements. But, if that were the case, there would have been an equal rise in self-serve revenue as well. It was a clear win that support engagement, and changes we made to how our support team delivers value to customers, drove the increase in revenue we saw over time.
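
If you want to build that kind of trend view yourself, a rough sketch (reusing the hypothetical columns from the earlier snippets, plus an assumed signup_date, with matplotlib installed for plotting) might look like this:

    # New MRR per signup month, split by support engaged vs. self-serve.
    customers["signup_month"] = pd.to_datetime(customers["signup_date"]).dt.to_period("M")
    monthly = (
        customers.groupby(["signup_month", "support_engaged"])["mrr"].sum()
        .unstack("support_engaged")
        .rename(columns={False: "self_serve_mrr", True: "support_engaged_mrr"})
    )
    print(monthly)
    monthly.plot()  # simple line chart of the two segments over time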

Beware the chicken and the egg…

A word of caution as you start to dig into these metrics. 

There’s a bit of a chicken-and-egg dilemma that happens when moving to a support driven growth and customer experience focused strategy. Where support sits in your organizational structure, and whether you have a high-touch/high-cost or low-touch/low-cost customer model (and how you define support as a result), will impact how much claim support has to revenue.

If you have a structure where support lives in a junk drawer, considered only reactionary damage control, you may find that support doesn’t actually impact revenue positively. If you treat customer support like a cost center, it becomes one.

But the good news is that if you start looking at support’s relationship to revenue with these metrics, you now have a baseline. You have a means to start measuring investments in customer experience and a Support Driven Growth strategy to see how much lift they can create on key business metrics. Once you can demonstrate lift with some small experiments, it’s easier to show C-suite stakeholders the value of going all in.

Mo McKibbin