
Why ‘containment rate’ is NOT the best way to measure your chatbot or voicebot

Kane Simms

Here’s why ‘containment rate’ isn’t the right way to measure the success of your chatbot or voicebot, and the three things you should measure instead.


What is ‘containment rate’?

The containment rate is the percentage of users who interact with an automated service and leave without speaking to a live human agent.

These are people who have a conversation with your chatbot and exit the conversation without being escalated to a live human. If they leave before speaking to a human, they’re deemed to have been ‘contained’.

Likewise, if someone calls your contact centre and interacts with your IVR, but they hang up before speaking to a call centre agent, they too have been ‘contained’.

Typically, in the contact centre world, a high containment rate is deemed to be a good thing. If you ‘contain’ a call, you prevent the customer from taking up valuable time and resources from your contact centre.

But that’s not how we should be thinking about customer experience, is it?

What’s wrong with using ‘containment rate’ as a measure of chatbot or voicebot success?

The containment rate doesn’t tell you anything about whether your customer got what they needed from you. It tells you nothing about how well you’re serving them and how effective your automated channels are.

It also uses language that makes it sound like your customers are an inconvenient virus that needs to be stopped dead in its tracks and isolated in your channels. Just think of the culture of avoidance that can create in an area of the business whose sole purpose is to build relationships and serve customers.

If your sole aim is to stop people contacting you at all costs, then containment rate is the perfect measure. But how many companies really want that? Self-service, absolutely. But actively preventing people who have a problem from speaking with you, and missing out on the chance to repair or build a relationship that could earn you thousands or millions over time? I don’t think so.


Containment goes against the ethos of what automated self-service agents like chatbots and voicebots are there to do. They’re there to help customers. To combat long wait times. To speed up service access. To give customers power. To open up 24-hour customer care.

Speaking of them as little automated quarantine police is completely the wrong way to think about them.

How to measure chatbot and voicebot success: what to use instead of ‘containment’

Instead, you should measure three things:

1. Percentage of successful automated conversations
2. Percentage of live agent handover
3. Customer satisfaction

All of those things together will give you a proper pulse check on how your chatbot or voicebot is performing in the areas that matter.

Percentage of successful automated conversations

This is the total percentage of interactions with your chatbot or voicebot that you can classify as ‘successful’, meaning that the user got what they needed from the conversation and went on their merry way.

Tools like Dashbot allow you to set goals within your bot to measure those ‘end states’ and to build a dashboard of successful conversations.

Once you’re able to do this, you know how successful your bots are. Then, you can focus on the conversations where users exit to learn how to improve that number.
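To make this concrete, here’s a minimal sketch of the calculation, assuming a hypothetical log of conversations where each record is tagged with whether the user reached a defined goal or ‘end state’ and whether they were escalated (an illustration only, not Dashbot’s API):

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    conversation_id: str
    reached_goal: bool   # did the user hit a defined 'end state'?
    escalated: bool      # was the conversation handed to a live agent?

def success_rate(conversations: list[Conversation]) -> float:
    """Percentage of conversations where the user got what they needed
    without being escalated to a live agent."""
    if not conversations:
        return 0.0
    successful = sum(
        1 for c in conversations if c.reached_goal and not c.escalated
    )
    return 100 * successful / len(conversations)

# Example: two of three conversations ended at a goal without escalation
logs = [
    Conversation("a1", reached_goal=True, escalated=False),
    Conversation("a2", reached_goal=False, escalated=True),
    Conversation("a3", reached_goal=True, escalated=False),
]
print(f"Successful automated conversations: {success_rate(logs):.1f}%")
```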

Percentage of live agent handover

This is the total percentage of interactions with your chatbot or voicebot that had to be escalated to a live human to resolve the issue.

Now, a percentage of conversations you have might be semi-automated, meaning that a live agent handover is part of the process. Perhaps you use the automated agent to gather information before handing it off to a human for decision making. That’s fine. The aim here is to separate those conversations from the conversations that should be automated from end-to-end.

Once you understand the percentage of conversations that should be automated from end-to-end, but that have a live agent handover, you then have an understanding of where you need to improve.
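As a rough illustration, here’s one way that separation might look, assuming each conversation record notes which flow it belongs to and that you keep a list of flows where handover is part of the design (the flow names and fields here are hypothetical):

```python
# Flows where a live agent handover is part of the design,
# e.g. the bot only gathers details before a human makes the decision.
SEMI_AUTOMATED_FLOWS = {"mortgage_application", "complaint"}

def handover_rate(conversations: list[dict]) -> float:
    """Percentage of conversations that should be fully automated
    but still ended up with a live agent."""
    fully_automatable = [
        c for c in conversations if c["flow"] not in SEMI_AUTOMATED_FLOWS
    ]
    if not fully_automatable:
        return 0.0
    escalated = sum(1 for c in fully_automatable if c["escalated"])
    return 100 * escalated / len(fully_automatable)

logs = [
    {"flow": "balance_check", "escalated": False},
    {"flow": "balance_check", "escalated": True},        # should have been automated
    {"flow": "mortgage_application", "escalated": True}, # handover by design, excluded
]
print(f"Unplanned live agent handover: {handover_rate(logs):.1f}%")
```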

Customer satisfaction

This is the percentage of users that rate their experience with your automated agent positively.

Different companies have different ways of measuring this. Some use NPS, which asks customers, on a 0-10 scale, how likely they’d be to recommend your brand to a friend or family member.

Others use more practical measures like asking ‘did you achieve everything you wanted to today?’ and measuring the percentage of people responding positively.

You can also use tools like OTO.ai to measure sentiment at the end of your calls. The higher the sentiment score at the end of the call, the more satisfied you can assume your caller is.

Lastly, you can use other channels to gather feedback, such as sending interactive text or RCS messages to customers post-call to understand how they felt about it.

You won’t always get high response rates from this data, and you could theorise that those with a positive experience are more likely to give their time to offer feedback, so this isn’t an exact science. But exact science isn’t what we need. We just need indicators that allow us to make decisions. And that’s what this’ll get you.
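Pulling the survey side together, here’s a minimal sketch of the calculation, assuming you ask something like ‘did you achieve everything you wanted to today?’ and record optional yes/no answers, with non-responses excluded from the denominator (the data shape is hypothetical):

```python
from typing import Optional

def csat_percentage(responses: list[Optional[bool]]) -> float:
    """Percentage of respondents who answered positively.
    Non-responses (None) are left out of the denominator."""
    answered = [r for r in responses if r is not None]
    if not answered:
        return 0.0
    return 100 * sum(answered) / len(answered)

# True = positive answer, False = negative, None = no response
responses = [True, True, None, False, True, None]
print(f"Customer satisfaction: {csat_percentage(responses):.1f}%")
```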

A shift to having successful conversations

Using these metrics will put the focus on understanding how to have successful conversations, rather than on how to ‘contain’ customer contact. It places the focus on how well you’re enabling customers, rather than how well you’re preventing them from accomplishing their goals.


If you enjoyed this, consider signing up for our weekly newsletter below. You’ll get all of these insights in your inbox every week, as well as invites to our weekly live podcast where we interview conversational AI practitioners about the details of how to implement conversational automation and industry trends.
