Human conversations don’t follow a map. Even if you try to plan a chat, it often goes off-topic quickly. That rarely matters, though: no matter what happens, humans are skilled at redirecting conversations and finding agreement on what they’re talking about.
Why does this matter in Conversational AI? Because conversations with bots are mapped out long before they actually happen. The design team must consider how users will behave before their first conversation with the bot. Psychology matters if you’re going to design a fruitful exchange.
Dawn Harpster, Senior Conversation Architect at TalkDesk, talked about this with Kane Simms in her VUX World interview.
Was it your turn or mine?
Turn-taking is core to conversations. Conversational partners don’t continually speak over the top of each other (although interruptions and overlaps do happen sometimes). We’ve learned to wait our turn to speak. While we wait, we’ll nod to show we’re listening, or give feedback (such as saying a quiet “mm-hmm”) to show we’re following along.
Those are just a few examples from the many visual and audible cues humans give during conversations, yet in human-bot conversations they can be problematic.
You can nod all you want while talking to a voice assistant, but it won’t see you. The same often goes for subtle ‘mm-hmm’ sounds and other verbal feedback such as ‘yeah’ or ‘I know’, known as backchanneling. Bots don’t listen for feedback while they speak. Instead, the bot may treat any user utterance as an interruption.
The user’s learned ability to signal that they’re listening (a good thing) can be misinterpreted by the bot and turn into a terrible experience.
Ok, it’s your turn then
Allowing the user to interrupt the bot is a design decision known as barge-in. Don’t take barge-ins lightly: you need to consider whether it’s the right moment for the bot to stop talking and start listening.
Sometimes the bot must give vital information, and should say it all without interruption. This would be the case if there was a legal requirement to convey specific information to a customer.
On the other hand, barge-ins are sometimes necessary – power users may know the conversational flows already and want to skip ahead. They don’t need to hear the same information every time they interact with the bot.
Learning which prompts should allow barge-in and knowing how and where to place pauses are must-have skills for conversation designers, according to Dawn. Then turn-taking can happen smoothly.
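To make that concrete, here is a minimal sketch of per-prompt barge-in settings. The class, field names, and prompt texts are all hypothetical illustrations, not the API of any real voice platform; the point is simply that interruptibility and pause placement are decided prompt by prompt.

```python
from dataclasses import dataclass

@dataclass
class Prompt:
    """One bot turn. Fields are illustrative, not a real platform's schema."""
    text: str
    barge_in: bool       # may the user interrupt this prompt?
    pause_after_ms: int  # pause before the bot starts listening, to smooth turn-taking

prompts = [
    # Vital or legally required information: must play in full, no barge-in.
    Prompt("This call may be recorded for quality and training purposes.",
           barge_in=False, pause_after_ms=500),
    # Routine menu: power users already know the flow and can speak early to skip ahead.
    Prompt("You can say 'balance', 'payments', or 'speak to an agent'.",
           barge_in=True, pause_after_ms=300),
]

for p in prompts:
    mode = "interruptible" if p.barge_in else "plays in full"
    print(f"[{mode}] {p.text}")
```

A real system would hand these flags to the speech platform; the sketch only shows the design decision Dawn describes, made explicit per prompt.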
Everyone loves lists
Dawn also gives the example of lists. If a list is too long, users are likely to remember only the first and last items. That’s down to the primacy and recency effects. She recommends keeping lists between three and five items.
It’s all about cognitive load. We don’t want to burden the user with too much information at any one time, or they might get fed up.
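One common way to apply the three-to-five-item guideline is to break a long option list into smaller turns, checking in with the user between chunks. This is a hedged sketch of that idea; the function, the item list, and the follow-up phrasing are my own illustrations, not from the interview.

```python
def chunk_options(options, max_items=5):
    """Split a long option list into chunks of at most max_items,
    so each bot turn stays within a three-to-five-item range."""
    return [options[i:i + max_items] for i in range(0, len(options), max_items)]

# Hypothetical example: eight options is too many to read in one turn.
toppings = ["cheese", "pepperoni", "mushroom", "onion",
            "olive", "ham", "pineapple", "spinach"]

for chunk in chunk_options(toppings):
    print("You can choose " + ", ".join(chunk) + ". Want to hear more?")
```

Each turn now carries a manageable amount of information, and the user decides whether to continue, which keeps cognitive load down.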
The way you make me feel
Dawn raised many great points, but this must be the biggest among them: consider how you make users feel.
If your bot makes the user feel like they’re dumb, or they’re failing, then they’ll walk away angry. It’s unlikely to be the user’s fault. They probably didn’t know what to say because the bot confused them. The bot probably didn’t give them clear guidance.
As Dawn says:
“Who are they going to be mad at? Are they going to be mad at the bot persona? Or are they going to be mad at the company for creating a bad experience?”
The company will get the blame. Users will remember that experience the next time they contact that brand. If you’ve read about Dawn’s brilliant approach to persona design, you’ll know she thinks personas should be designed to serve the user and the brand.
So you see, psychology matters a great deal when designing bots. We’ve spent our lives communicating with each other and continue to practise the art every day. If bots are going to help us, they need to be designed to talk in a way that feels intuitive to us. Users shouldn’t be forced to speak differently. That’s no small task.
Have a listen to Dawn’s full interview here.
This article was written by Benjamin McCulloch. Ben is a freelance conversation designer and an expert in audio production. He has a decade of experience crafting natural sounding dialogue: recording, editing and directing voice talent in the studio. Some of his work includes dialogue editing for Philips’ ‘Breathless Choir’ series of commercials, a Cannes Pharma Grand-Prix winner; leading teams in localizing voices for Fortune 100 clients like Microsoft, as well as sound design and music composition for video games and film.