Lily knows which street she is on but she is uncertain precisely where. The contact centre software hears this and displays a message on the agent's screen: Send customer photo of local landmark?
Without explicitly responding to the software's question, the agent tells Lily that she is sending her a text message with a photo of a local landmark and instructs Lily to let her know if she sees it. With that prompt, the AI sends Lily the photo and awaits her response. Lily sees the building, and the AI updates her location accordingly.
The agent says that she'll send a tow truck; the AI hears this, sends the job details directly to the tow truck driver, and displays an estimated wait time on the agent's screen.
In this case, Lily got to speak to a human who dealt with her sympathetically during a worrying situation; the agent played to her strengths as a human, reassuring Lily while gathering the necessary information. The AI acted on that information by listening for appropriate triggers.
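The "listening for triggers" step above can be sketched very simply: the AI watches the live transcript of the agent's speech for trigger phrases and maps each one to an action. This is a minimal illustration, not any real contact-centre product's API; the phrases and action names are assumptions drawn from the scenario.

```python
# Minimal sketch of trigger-based action dispatch. The AI is assumed to
# receive each agent utterance as text; trigger phrases and action names
# below are illustrative only.

def detect_triggers(utterance, triggers):
    """Return the action names whose trigger phrase appears in the utterance."""
    text = utterance.lower()
    return [action for phrase, action in triggers if phrase in text]

# Hypothetical trigger table for the Lily scenario.
TRIGGERS = [
    ("sending you a text message with a photo", "send_landmark_photo"),
    ("send a tow truck", "dispatch_tow_truck"),
]

actions = detect_triggers(
    "I'll send a tow truck to your location right away.", TRIGGERS
)
# actions is now ["dispatch_tow_truck"]
```

A production system would use intent classification rather than literal substring matching, but the control flow is the same: the agent speaks naturally, and the software acts without an explicit command.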
Automation Where Possible
Later that year Lily wants to change her payment details and texts the customer service number for her roadside assistance provider: I want to change my payment details.
Almost immediately, she receives a reply: No problem, Lily. We'll call you in a few moments to confirm this.
Lily's phone rings and a virtual assistant greets her, asking her to confirm her request. Using voice print analysis, the virtual assistant verifies Lily's identity and then updates her payment details as requested. It asks if there's anything else she wants help with. In fact, there is: Lily wants to know if she can get a discount on her annual fee. The virtual assistant puts Lily on hold, connecting her to someone who can help. Within a short while, Lily is speaking to a human agent in the customer retention team.
This is where the human-plus model really shines. For a routine change of payment method, a human agent wasn't necessary, and a virtual assistant handled the conversation seamlessly. When it came to a question it couldn't handle, or where data showed that a human interaction had better outcomes, it brought in a human agent.
Thanks to machine learning, the virtual assistant can listen in on Lily's call with the human agent and learn from that interaction. Software is already available that reviews chat transcripts and call recordings to analyse sentiment, allowing a machine learning tool to learn which vocabulary and vocal qualities show that someone is becoming dissatisfied, and which types of response disarm the caller.
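The transcript-review idea can be illustrated with a toy lexicon-based sentiment scorer, a much simplified stand-in for the commercial tools described above. The word lists and threshold here are assumptions for illustration; real systems learn these signals from data rather than using hand-written lists.

```python
# Toy lexicon-based sentiment scoring over a chat transcript.
# Word lists are illustrative, not derived from any real product.

NEGATIVE = {"ridiculous", "unacceptable", "waiting", "frustrated", "cancel"}
POSITIVE = {"thanks", "great", "perfect", "helpful", "appreciate"}

def sentiment_score(utterance):
    """Positive words add one, negative words subtract one."""
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def flag_dissatisfaction(transcript, threshold=-1):
    """Return customer turns whose score falls at or below the threshold."""
    return [u for u in transcript if sentiment_score(u) <= threshold]

transcript = [
    "Thanks for picking up so quickly",
    "This is ridiculous, I am frustrated and might cancel",
]
flagged = flag_dissatisfaction(transcript)
# flagged contains only the second, dissatisfied turn
```

A learned model would replace the fixed word lists with weights fitted to historical calls, and vocal qualities (pitch, pace) would come from audio features rather than text, but the output is the same: turns flagged for attention, and patterns of agent responses that turned the conversation around.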