Vonage hosted the ‘What’s Next for Conversational AI’ panel during VUX @ VOICE22, a full day of VUX World programming at VOICE22 in Arlington, VA.
Panelists Tim Holve (Vonage), Tarren Corbett-Drummond (Ericsson), Arte Merritt (AWS) and Kevin Fredrick (OneReach.ai) were joined by VUX World's Kane Simms.
So, what did this “mind blowing panel of talented professionals” think the future has in store for conversational AI?
It’s time for our 2023 predictions.
1. Focus will shift from technology to customer outcomes
One thing that was clear during the panel is just how much our industry has evolved already. IVR systems have advanced considerably. As Tim Holve said:
“We finally have the technologies to do all the things we imagined 10 years ago.”
According to Kevin Fredrick, development teams once needed to be experts in telephony and other fields to build IVR, but now, anyone can build a full omni-channel solution. This means that the challenge has evolved with the technology: the question is no longer "how do you build it?" but "how do you use the tech to create a great experience?"
It’s become a creative challenge rather than a technical one, and we’ll see more of this outcome and service focus take the lead in 2023.
2. Great conversational customer experiences will create brand differentiation
Some companies are leading the charge in providing next-generation experiences for customers. For example, the conversation- and AI-native bank Ant Group may get so far ahead in its field that it becomes incredibly hard for its competition to catch up.
Why might that happen? Companies that excel in conversational AI, and have the know-how to utilise it well, can create far better customer experiences. They can build better customer journeys and earn greater customer loyalty. This creates competitive advantage and differentiation, and it begins to change what it means to interact with brands.
As Kane Simms said:
“We’re seeing some examples of the technology being used in real world scenarios. High volume, complex use cases, delivering real business value and effective customer experience.”
Once you’ve experienced a well-designed conversational AI that can genuinely solve your problems, everything else seems clunky and less effective.
3. Multimodal interactions will begin to emerge in the mainstream
Conversations that can move from one modality to another create more immersive and engaging experiences by bending around the needs and intuition of users.
Arte gave an excellent example of multimodal conversational AI in action: an insurance company that lets people upload a photo of their driving licence instead of typing all their details into a webform. The AI extracts the personal information, which means the signup process takes 3 minutes instead of 3 hours.
Some interactions don’t have to be resolved within the time span of a user’s single visit to a website chatbot, either. They could continue over days or weeks for the right use case, if that improves the experience.
According to Arte Merritt:
“People want to have the option to communicate freely, rather than follow the rails of a pre-defined experience.”
As more brands reach the limits of pure conversational automation, we’ll begin to see additional modalities creep into the experience in 2023.
4. Easy-to-use tools will lead to more bad experiences
No code/low code tools are changing the game for conversational AI, putting the power of creation in the hands of less technically skilled people. This is allowing almost anyone to create conversational experiences.
WordPress enabled a new generation of writers, builders and entrepreneurs, but it also gave rise to a deluge of poor content. When you open the doors to all, you’ll inevitably see some degradation of quality in places. Low code/no code conversational AI platforms will have the same impact.
Of course, democratising access to creation also results in some stand-out experiences. We’ll definitely see more good experiences, too. We just need to hope that the poor-quality experiences are short-lived and improve quickly as the creators and builders learn more about conversation design and AI ops.
Arte Merritt said:
“With chat and voice experiences, you have to be clear, concise, succinct, unambiguous. You have to take into account things like empathy, inclusion, all that kind of stuff to create a proper experience. Because it’s not just about building something that understands the user and responds appropriately, but responds in a way that actually satisfies the user.”
5. Orchestrating behind-the-scenes business systems will become mandatory
Conversational AI works best when business systems are orchestrated together on the back-end to deliver a seamless experience on the front-end. Tim Holve gave a great example of how this could work from personal experience – when his first flight landed, he discovered the connecting flight had been cancelled:
“I know there’s more than one flight going to my destination. I’d love to be able to go into the app or call and say, ‘I want to change flights’ and have them say, ‘Hi, Tim, I see your flight was cancelled? Are you looking to find a different one to get home earlier?’”
The airline has data on the customer, their purchased flights and the current status of all planes. When that data is properly orchestrated, it enables the company to get halfway towards solving Tim’s problem before he even contacts them.
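This kind of orchestration is, at heart, a data join: bookings, live flight status and available alternatives are combined before the customer ever makes contact. A minimal sketch, assuming hypothetical in-memory lookups for each data source (a real system would call booking and operations APIs):

```python
def proactive_rebooking_offer(customer_id, bookings, flight_status, alternatives):
    """Join a customer's bookings with live flight status so the assistant
    can open the conversation with a rebooking offer, not a question.

    bookings:      customer_id -> list of {"flight": ..., "route": ...}
    flight_status: flight number -> status string (e.g. "cancelled")
    alternatives:  route -> list of alternative flights on that route
    """
    offers = []
    for booking in bookings.get(customer_id, []):
        if flight_status.get(booking["flight"]) == "cancelled":
            # Pre-compute the options before the customer gets in touch
            options = alternatives.get(booking["route"], [])
            offers.append({"cancelled": booking["flight"], "options": options})
    return offers
```

With this pre-computed, the assistant can open with "I see your flight was cancelled – are you looking for a different one?" rather than asking the customer to explain the situation from scratch.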
In 2023, businesses will realise that, in order to get out of FAQ Land, they need to synchronise business systems together to deliver personalised transactional experiences for customers.
6. Data will be used to pre-empt needs proactively
We’ll likely see use of data to solve customer problems going a step further in 2023, with brands creating proactive services that pre-empt customer needs before they even arise.
Most brands wait for a customer to contact them to discuss a problem. By using data sensibly, and funnelling it through your conversational assistant, brands can predict when a problem is likely to emerge (or flag when a problem has occurred) and can make the first move.
As Kevin Fredrick explained, it’s about brands going from a defensive position to an offensive position:
“Rather than focusing on how to be really good at answering a question, what if we reduce the propensity for the question to be asked?”
7. For outsourced contact centres, 2023 will be a rocky road
Currently, many big brands outsource their contact centres. Those call centres have a ‘bums on seats’ business model. They can turn the biggest profit by employing cheap labour and charging their clients for headcount. The brands who use those contact centres have very little control over customer experience.
Conversational AI will disrupt the outsourced contact centre industry: it gives brands more quality control, but it also relies on talented, focused live agents working in cooperation with AI.
In the interim, user experience may suffer while those contact centres try to protect their business. According to Kevin Fredrick:
“It’s the tension between contact centres that are being paid to provide bodies and knowing that sometimes the right thing to do for the customer experience is to allow a self-service capability.”
As customer experiences become more automated, it’s going to be a challenge for those contact centres to reconcile their business model with the widespread adoption of conversational AI.
In 2023, as these BPO (business process outsourcing) contracts begin to hit the renewal stage, we’ll see more brands invest in conversational AI to regain control and reduce costs. We’ll also see more BPOs build out conversational AI practices in an effort to hang on to that business.
8. Digital humans will emerge as useful brand ambassadors
As conversational AI has improved, genuinely useful digital humans have become possible. There’s no longer the danger that a bot with a human-ish appearance will be limited to saying “sorry, can you repeat that?”
As Tarren Corbett-Drummond says, users are more forgiving of digital humans than they are of chatbots and voicebots:
“They’ll stay on the line longer and repeat a few more times and have higher satisfaction after the interaction ends.”
While a digital human is essentially just the interface of the experience (and can have any back-end integrations), having one affects the conversation in profound ways. Tarren tells us that users are more engaged, and they stick with the experience longer. Danny Tomsett (CEO, UneeQ) also told the VUX World podcast that digital humans convert twice as many users as regular chatbots.
Perhaps this points to something in human psychology: we may simply prefer talking with something that looks human.
Kevin Fredrick added that, even with self-driving cars, users have been known to exit the car and say thank you to the driver, but there’s no driver! Our habits for social norms are hard to kick, and tech such as digital humans may tap into those habits better than a disembodied voice or chat window.
Be on the lookout for this in 2023.
9. Hyper-personalised on the front, hyper-specialised on the back
According to Tim Holve, we’re heading to the point where:
“… the digital human looks more like whatever demographic or culture that you’re serving.”
But at the same time, he thinks the back-end AI engines of these bots will become more specialised.
“You’re gonna see niche AI engines emerge, because they will be in that specific area – say the vernacular for clinical visits, or some industry that has its own lingo – and will be better at understanding it and processing it. And then speaking back that lingo through maybe a customised TTS engine.”
Kevin Fredrick also sees a more open playing field with a variety of AI engines jostling for market share.
“If there is no single best player for all things – they all have their own strengths and language and dialects and domains and these types of things – then architecturally, it makes sense to place a bet on an open system.”
Kevin sees this having obvious benefits for users, because a first-time user is going to have different needs from someone who has been a customer for years. The long-standing customer would expect a certain amount of personalisation, as they already have a relationship, whereas the first-time user might find it unnerving when the bot seems to know them too well already.
We could reach the point where every single user has a customised bot – for example, one user may prefer more detail, while another may want things to move at a brisk pace.
Machines are great at recognising patterns. There’s an opportunity to creatively tune these systems so that they act on the data they’ve amassed to decide how best to deal with a customer. For example, a customer might ask to be texted rather than phoned, but if there’s an urgent situation, and the customer’s ignoring texts, then the system could phone them instead. Although the customer explicitly chose text rather than phone, due to the urgency of the situation they may still appreciate the phone call.
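That urgency override is really just a small rule layered on top of the customer's stated preference. A minimal sketch, assuming an illustrative `Customer` record and an SMS escalation threshold (both hypothetical, not from the panel):

```python
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    preferred_channel: str  # e.g. "sms" or "voice"
    unanswered_sms: int     # recent texts the customer hasn't responded to

def choose_channel(customer: Customer, urgent: bool,
                   sms_escalation_threshold: int = 2) -> str:
    """Honour the stated channel preference, but escalate to a phone call
    when the situation is urgent and texts are going unanswered."""
    if (urgent
            and customer.preferred_channel == "sms"
            and customer.unanswered_sms >= sms_escalation_threshold):
        return "voice"
    return customer.preferred_channel
```

The design choice here is that the preference is the default, never discarded lightly: only the combination of urgency and evidence that the preferred channel is failing triggers the override.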
What we’re seeing here is that AI systems are developing the capability to treat each customer differently, depending on their situation, and we’ll see more brands explore this in 2023.
10. Large Language Models (LLMs) creep into production
According to Arte Merritt, there are a few interesting uses of Large Language Models (LLMs) emerging. One is to use LLMs to synthesise new training data to make the NLU more robust.
Another possible use is for when the NLU can’t match a user utterance to an intent – an LLM would be used to expand upon the user utterance to see if a match can be found in additional utterances generated by the language model. Kevin Fredrick imagines the system would ask the user to pick between the synthesised options which have highest system confidence:
“You can use the LLM to create additional utterances, then process all of those to see what intent might be revealed. And then say, ‘I think you’re asking about, X, Y, and Z?’”
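The flow Kevin describes can be sketched as a fallback loop: score the original utterance, and only if confidence is low, generate paraphrases, re-score each, and surface the highest-confidence candidate intents for the user to choose between. This is a minimal sketch under assumed interfaces – `nlu_match` and `generate_paraphrases` are hypothetical callables standing in for a real NLU engine and a real LLM call:

```python
def recover_intent(utterance, nlu_match, generate_paraphrases,
                   confidence_floor=0.6, max_options=3):
    """If the NLU can't match the utterance confidently, expand it with
    LLM paraphrases and return candidate intents to clarify with the user.

    nlu_match:            text -> (intent, confidence score)
    generate_paraphrases: text -> list of alternative phrasings
    """
    intent, score = nlu_match(utterance)
    if score >= confidence_floor:
        return {"intent": intent, "clarify": None}  # direct match, no LLM needed

    # Score each paraphrase, keeping the best confidence seen per intent
    candidates = {}
    for paraphrase in generate_paraphrases(utterance):
        intent, score = nlu_match(paraphrase)
        if score >= confidence_floor:
            candidates[intent] = max(score, candidates.get(intent, 0.0))

    # Offer the top candidates: "I think you're asking about X, Y or Z?"
    ranked = sorted(candidates, key=candidates.get, reverse=True)[:max_options]
    return {"intent": None, "clarify": ranked}
```

Note the asymmetry: the LLM never picks the intent itself; it only widens the net, and the existing NLU (plus the user's confirmation) stays the source of truth.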
So are LLMs worth the hype? Absolutely, according to Kevin Fredrick:
“They’re gonna completely change how we all think about conversational AI as an interface. And it’s going to create a really interesting career opportunity for people who learn to design using these new skills.”
In 2023, we’ll continue to see LLMs hit production use cases and prove themselves out in the wild.
Thanks very much to Vonage for hosting the panel, and to our panelists – Tim Holve, Tarren Corbett-Drummond, Arte Merritt and Kevin Fredrick.
And a shout-out to all the people who came along to VUX World day at VOICE22, including Maaike Coppens, who made those great summary tweets!
2023 is going to be a crucial year for conversational AI, and we’re certainly excited to see how it unfolds.
This article was written by Benjamin McCulloch. Ben is a freelance conversation designer and an expert in audio production. He has a decade of experience crafting natural sounding dialogue: recording, editing and directing voice talent in the studio. Some of his work includes dialogue editing for Philips’ ‘Breathless Choir’ series of commercials, a Cannes Pharma Grand-Prix winner; leading teams in localizing voices for Fortune 100 clients like Microsoft, as well as sound design and music composition for video games and film.