
AI at Marks and Spencer: 2 current use cases


How do you take a quintessential British brand, steeped in tradition and history, with a legion of customers (quite a few who are in the older demographic) and transform it for the digital age? Here’s how M&S did it.

Marks and Spencer (also known as M&S) is one of the best-known UK brands. It sells clothes, food, home furnishings and much more. It has around 600 stores in the UK, more around the world, and is 138 years old.

Steven Siddall, Marks and Spencer’s Contact Systems Lead, spoke with Kane Simms about how Marks and Spencer uses conversational AI in his VUX World podcast interview. Here are some of the key takeaways from the discussion.

Starting with a chatbot

Marks and Spencer’s first foray into conversational AI began five years ago when it introduced its first chatbot, using Nuance.

The problem statement

The chatbot aimed to solve a number of FAQ use cases and one particular recurring issue Steve and his team discovered. When customers bought goods through the M&S website, sometimes when they tried to apply a promo code to their shopping basket, they got stuck. Those customers often called the contact centre for advice, shifting from a one-to-many, scalable channel (the website) to a one-to-one conversation with an M&S agent.

This took time out of agents’ days: time away from serving customers with more pressing and complex needs. Those customers were left waiting on hold while agents troubleshot website issues!

The first chatbot solution

The resolution was simple. M&S focused its first chatbot on helping those customers: troubleshooting issues and answering basic queries. The sole aim was to save contact centre agents and customers time, to prevent the customer from having to switch channels, and to create space in the call centre for customers with more complex needs.
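To make the shape of that first bot concrete, here is a minimal, illustrative sketch of keyword-based FAQ matching in Python. It is not the Nuance implementation M&S used: the intents, keywords and answers are invented placeholders.

    # Illustrative only: a minimal keyword-based FAQ matcher, not the Nuance
    # implementation M&S used. Intents, keywords and answers are placeholders.
    FAQ_ANSWERS = {
        "promo_code_help": "To apply a promo code, open your basket and enter the "
                           "code in the promotional code box before checkout.",
        "delivery_times": "You can check delivery options and timescales at checkout.",
        "returns_policy": "You can return most items to a store or by post.",
    }

    KEYWORDS = {
        "promo_code_help": ["promo", "promotion", "voucher", "discount code"],
        "delivery_times": ["delivery", "when will it arrive", "shipping"],
        "returns_policy": ["return", "refund", "send back"],
    }

    def answer(utterance: str) -> str:
        """Return a canned answer if a keyword matches, otherwise hand off to an agent."""
        text = utterance.lower()
        for intent, words in KEYWORDS.items():
            if any(word in text for word in words):
                return FAQ_ANSWERS[intent]
        return "Let me connect you with one of our advisers."

    print(answer("My promo code won't apply to my basket"))

Even a simple matcher like this catches the high-volume, repetitive queries and keeps them out of the agent queue, which was exactly the point of the first bot.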

The results of the chatbot?

To date, this use case has proven to be one of the most effective automated journeys at M&S.

Since then, M&S has switched provider, from Nuance to another stack, to enable the chatbot to tackle deeper, more transactional automated journeys.

Moving into voice AI

Three years ago, M&S started to introduce its voice strategy.

Problems in the call centre

M&S had a system that was ripe for improvement. When a customer called one of its stores in the past, they were routed through a switchboard to a local expert. M&S had 13 rooms dotted around the country where agents sat and listened to customer queries, forwarding the customer on to another person, somewhere else in the business, who could answer. A real, live switchboard.

M&S saw an opportunity to automate this system. An opportunity to make the caller experience more consistent and efficient. An automated system could get customers the information they needed quickly, and route calls it couldn’t answer to the appropriate department or person.

This automation activity is in line with Marks and Spencer’s broader business plans. As Steve says, “part of the future strategy of M&S is digital first.”

The voice AI solution

The aim of the initial voice AI solution M&S implemented was simply to route calls to the right place. A customer would call, say what they were calling about, and the digital assistant would understand, then route them to the appropriate department or store.

Sounds simple, but there’s a lot going on under the hood, and it took a deliberate process to get there.
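As a rough illustration of that routing layer, here is a hedged sketch in Python. The intent labels, destinations and the trivial keyword classifier are assumptions for the example, not the production M&S system, where a trained language model would do the classification.

    # A hedged sketch of intent-based call routing, not the production M&S system.
    # Intent labels, destinations and the keyword classifier are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Destination:
        name: str
        endpoint: str  # placeholder identifiers, not real phone lines

    ROUTING_TABLE = {
        "store_opening_hours": Destination("Opening hours message", "ivr:opening-hours"),
        "stock_availability": Destination("Store team", "queue:store-team"),
        "order_tracking": Destination("Online order support", "queue:order-support"),
        "unknown": Destination("Switchboard", "queue:switchboard"),
    }

    def classify(utterance: str) -> str:
        """Stand-in for the trained language model: a trivial keyword lookup."""
        text = utterance.lower()
        if "open" in text or "opening" in text:
            return "store_opening_hours"
        if "stock" in text or "in store" in text:
            return "stock_availability"
        if "order" in text or "delivery" in text:
            return "order_tracking"
        return "unknown"  # anything unrecognised falls back to a human

    def route_call(utterance: str) -> Destination:
        return ROUTING_TABLE[classify(utterance)]

    print(route_call("What time do you open on Sunday?"))

The interesting work is in getting the classifier right, which is where M&S’s phased approach comes in.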

The approach to voice AI implementation

M&S started with a four-week test, focused on just a few stores. Callers would be asked “why are you calling?” and the assistant would simply listen. The aim was to capture intents – what were the common things customers asked for that could be automatically resolved? How were they asking for help?

Conversation design matters, even at the outset of a test. The test began with a question that was too open – customers were asked “what do you want to do today?” and M&S received such wonderfully useful responses as “we want to swim with dolphins”! It received responses better suited to its actual services when the assistant asked “why are you calling us today?”

So, all of the intents were captured (though the team didn’t pay too much attention to those related to dolphins). It was a research exercise.

Once the assistant had gathered enough data, the team went through it with a fine-tooth comb to build the first version of a language model, which would eventually be used to classify customer utterances and route the caller to the right place.

At this stage, rather than going live with the fully featured assistant, M&S put the new language model live and listened again. The assistant wasn’t acting on anything the customer said; it just captured an utterance and sent the user to the switchboard.

While capturing utterances the second time, M&S ran the newly created language model behind the scenes, checking its accuracy against real data. The model was then tuned regularly by checking that it both recognised the user’s utterance accurately and matched it to the right intent. The aim was to reach 80% accuracy before developing the assistant further and going live.
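Here is a minimal sketch of what that shadow-mode check might look like, assuming the team scores the model’s predictions against utterances it has reviewed and labelled by hand. The function names, data shapes and threshold handling are illustrative, not M&S’s actual tooling.

    # A minimal sketch of a shadow-mode evaluation: the model classifies real,
    # captured utterances without acting on them, and its predictions are scored
    # against intents the team has reviewed by hand. Names and data shapes are
    # assumptions for illustration.
    def evaluate_shadow_run(labelled_utterances, classify, target=0.80):
        """labelled_utterances: list of (utterance, gold_intent) pairs."""
        correct = 0
        misses = []
        for utterance, gold_intent in labelled_utterances:
            predicted = classify(utterance)
            if predicted == gold_intent:
                correct += 1
            else:
                misses.append((utterance, gold_intent, predicted))
        accuracy = correct / len(labelled_utterances)
        return accuracy, accuracy >= target, misses

    # Example run with a stand-in classifier; the misses feed the next tuning pass.
    sample = [
        ("what time does your Leeds store open", "store_opening_hours"),
        ("has my order been dispatched yet", "order_tracking"),
    ]
    accuracy, ready_for_live, misses = evaluate_shadow_run(
        sample, classify=lambda utterance: "order_tracking"
    )
    print(f"accuracy={accuracy:.0%}, ready_for_live={ready_for_live}")

The value of this step is that the model is judged on real caller language rather than scripted test phrases, so the 80% bar means something when the assistant finally goes live.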

Success in 6 months

Once the language model had proven itself, M&S connected the system to all of its UK stores to route customers appropriately.

Once this was up and running, M&S was able to capture further customer utterances and queries and use them to inform which customer questions the assistant could answer.

Some are simple: if a caller wants to know what stock is available in a store, they’re sent to a live agent who can discuss the best options with them; if they want opening hours, the IVR handles it.

Within 6 months, the automation strategy had achieved its initial goal – to automate the calls that were going to the live agents in the 13 switchboard rooms.

They hope to improve these services over time so that they’re not just providing generic information but providing full customer service. For example, callers should be able to track their order with the automated system, and easily be connected with a live agent if they need it.

Sharing the wisdom

Here are some great learnings from M&S’ automation experience:

  1. Consider whether you actually solved the customer’s query – do they hang up, and then reappear on a different channel soon after? If so, that’s a very good sign that something’s not working in your automated journey.
  2. Don’t over-react if they all hang up at first – M&S discovered that some customers hung up the first time they experienced the automated journey, but within a few minutes they called again and spoke to it freely.
  3. To really deliver value, implementation is key – now that the service is up and running, M&S is considering more complicated use cases. To achieve those, it will need a lot of help from its development teams to connect APIs and other backend systems.
  4. They uncovered value they hadn’t expected – they discovered that the intent model they’d created for automation could also be modified for use within the contact centre.
  5. Leverage the data you already have – if a customer just bought a jacket and contacts you soon after, why not begin the conversation with “hi, are you calling about the jacket?” That saves the customer a lot of steps (see the sketch after this list).
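On learning #5, here is a minimal sketch of what a context-aware opening prompt could look like, assuming the assistant can look up the caller’s recent orders. The order data shape and field names are hypothetical, not a real M&S API.

    # A hedged sketch of learning #5: open with the caller's most recent purchase.
    # The order data shape and lookup are hypothetical, not a real M&S API.
    from datetime import datetime, timedelta, timezone

    def opening_prompt(customer_orders, now=None):
        """If the caller bought something very recently, ask about it first."""
        now = now or datetime.now(timezone.utc)
        recent = [o for o in customer_orders
                  if now - o["purchased_at"] < timedelta(days=2)]
        if recent:
            latest = max(recent, key=lambda o: o["purchased_at"])
            return f"Hi, are you calling about the {latest['item']}?"
        return "Hi, how can we help you today?"

    orders = [{"item": "jacket",
               "purchased_at": datetime.now(timezone.utc) - timedelta(hours=3)}]
    print(opening_prompt(orders))  # -> Hi, are you calling about the jacket?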

Bots are better

Within 6 months, M&S’ automation strategy had achieved its initial goals. The company found the most efficient way to help customers who couldn’t apply a promo code to their shopping basket, and automated many more simple use cases besides, taking the burden off the contact centre so agents could focus on more complex calls. M&S even found that one of the processes for creating a bot – building an intent model – was a huge help for the contact centre, so it has started to innovate beyond the initial scope.

As Steve says, “conversational interfaces are just better at getting to the heart of the problem.”

Wouldn’t you say that’s plenty to celebrate?

Listen to Steve’s full interview here, on Apple Podcasts or Spotify, or search VUX World in your podcast player.


This article was written by Benjamin McCulloch. Ben is a freelance conversation designer and an expert in audio production. He has a decade of experience crafting natural sounding dialogue: recording, editing and directing voice talent in the studio. Some of his work includes dialogue editing for Philips’ ‘Breathless Choir’ series of commercials, a Cannes Pharma Grand-Prix winner; leading teams in localizing voices for Fortune 100 clients like Microsoft, as well as sound design and music composition for video games and film.
