The contact centre of the future will anticipate a customer's inquiry, predict what they want to discuss and even provide appropriate support throughout the interaction, all thanks to artificial intelligence (AI). AI will help people - be they customers or contact centre agents - to get more done in less time. Think of it as human-plus.
There are three roles AI plays in contact centres:
- Anticipating needs: using big data to predict customer needs
- Augmenting conversations: providing instant help through virtual assistants
- Automating where possible: freeing human agents to manage interactions where the human touch and expertise are needed
What Is AI?
The AI we refer to is soft AI. The tools seen in this type of AI give the impression of intelligence by drawing meaning from data. Such tools are already in use and apply the following techniques:
- Big data: finding patterns in large amounts of varied, fast-moving data
- Natural language processing: analysing language as spoken and written by humans (such as in Amazon's Alexa)
- Machine learning: self-programming by adapting to changing circumstances and data
When combined, these tools take resources that previously were of little value - such as hours of call recordings - and draw out knowledge that would otherwise be lost.
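As a toy illustration of drawing knowledge out of resources like call recordings, a first pass might simply surface the terms that recur across transcripts. The transcripts and stop-word list below are made-up assumptions, not real data:

```python
from collections import Counter

# Illustrative stop words and transcripts (assumptions, not real data)
STOP = {"i", "my", "the", "a", "is", "and", "to", "was", "will", "not", "on"}

transcripts = [
    "my battery is flat and the car will not start",
    "the battery died again",
    "flat tyre on the motorway",
]

# Count the remaining words across all calls to surface recurring issues
words = [w for t in transcripts for w in t.split() if w not in STOP]
common = Counter(words).most_common(2)  # "battery" surfaces as a frequent issue
```

Even this crude counting hints at why previously idle data becomes valuable: patterns invisible in any single call emerge across thousands of them.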
AI Anticipating Customer Needs
It's late on a Saturday night in 2022. Lily is having car trouble and calls her roadside assistance service. Even before the call is answered, the contact centre's AI judges it to be urgent. It made that determination, in a fraction of a second, by acting on the context of the call:
- Caller-ID was associated with Lily's account
- Lily is calling the rescue line for the first time, despite being a customer for 10 years
- Other customers with a similar profile to Lily call only when they really need help
The AI puts Lily at the top of the queue; it also estimates how long she'll have to wait for help, passing that information to the agent who answers her call.
The AI used the context of Lily's call to judge its purpose and urgency, and then routed her call appropriately. While it's not a leap to assume someone calling late at night might need urgent help, non-obvious patterns will be revealed in both public and private sources of data. Machine-learning tools will then anticipate how best to respond when they see those patterns unfolding.
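The routing described above can be sketched as a scoring step feeding a priority queue. The signal names and weights below are illustrative assumptions, not a real model - in practice the weights would be learned from data:

```python
import heapq

def urgency_score(context):
    """Combine context signals into an urgency score (weights are assumptions)."""
    score = 0.0
    if context.get("late_night"):
        score += 0.4   # off-hours calls skew urgent
    if context.get("first_rescue_call"):
        score += 0.3   # long-time customer's first call to this line
    score += 0.3 * context.get("similar_profile_urgency", 0.0)
    return score

queue = []  # min-heap; scores are negated so the most urgent call pops first

def enqueue_call(caller_id, context):
    heapq.heappush(queue, (-urgency_score(context), caller_id))

enqueue_call("lily", {"late_night": True, "first_rescue_call": True,
                      "similar_profile_urgency": 0.9})
enqueue_call("routine-caller", {"late_night": False})

_, next_caller = heapq.heappop(queue)  # "lily" is answered first
```

The point of the sketch is the shape of the system, not the numbers: context signals become a score, and the score decides queue position before anyone picks up.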
AI Augmenting Conversations
The contact centre agent answers Lily's call. Lily explains that she is downtown in her home city.
As Lily speaks, the agent's screen updates with a map of the area where Lily is stranded, along with live locations of roadside assistance trucks nearby. When Lily says she thinks she needs to be towed, the nearest tow truck is highlighted. All of this happens without explicit instruction from the agent; a virtual assistant is listening to the call and uses natural language processing to identify key terms.
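The trigger mechanism might look like a simple mapping from spotted terms to screen actions. A real system would use full natural language processing; the lexicon and action names here are illustrative assumptions:

```python
# Hypothetical mapping from key terms heard on the call to screen actions
ACTIONS = {
    "stranded": "show_map_of_area",
    "towed": "highlight_nearest_tow_truck",
    "tow": "highlight_nearest_tow_truck",
}

def actions_for(utterance):
    """Return the screen actions triggered by key terms in one utterance."""
    words = utterance.lower().replace(".", "").replace(",", "").split()
    return [ACTIONS[w] for w in words if w in ACTIONS]

actions_for("I think I need to be towed")  # -> ["highlight_nearest_tow_truck"]
```

The agent never issues a command; the assistant reacts to the conversation itself, which is what keeps the interaction feeling like an ordinary phone call.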
Lily knows which street she is on but is uncertain precisely where along it. The contact centre software hears this and displays a message on the agent's screen: Send customer photo of local landmark?
Without explicitly responding to the software's question, the agent tells Lily that she is sending her a text message with a photo of a local landmark and instructs Lily to let her know if she sees it. With that prompt, the AI sends Lily the photo and awaits her response - when Lily confirms she sees the building, the AI updates her location accordingly.
The agent says that she'll send a tow truck; AI hears this and sends the job details directly to the tow truck driver, then displays an estimated wait time on the agent's screen.
In this case, Lily got to speak to a human who dealt with her sympathetically during a worrying situation; the agent played to her strengths as a human, reassuring Lily while gathering necessary information. The AI acted on that information by listening for appropriate triggers.
AI Automating Where Possible
Later that year Lily wants to change her payment details and texts the customer service number for her roadside assistance provider: I want to change my payment details.
Almost immediately, she receives a reply: No problem, Lily. We'll call you in a few moments to confirm this.
Lily's phone rings and a virtual assistant greets her, asking her to confirm her request. Using voice print analysis, the virtual assistant verifies Lily's identity and then updates her payment details as requested. It asks if there's anything else she wants help with. In fact, there is: Lily wants to know if she can get a discount on her annual fee. The virtual assistant puts Lily on hold, connecting her to someone who can help. Within a short while, Lily is speaking to a human agent in the customer retention team.
This is where the human-plus model really shines. For a routine change of payment method a human agent wasn't necessary, and a virtual assistant seamlessly handled the conversation. When it came to a question it couldn't handle - or where data showed that a human interaction had better outcomes - it brought in a human agent.
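The hand-off logic can be sketched as a small dispatcher: routine intents are handled automatically, and anything else escalates to a human team. The intent names and team label below are assumptions for illustration:

```python
# Intents the virtual assistant is trusted to handle on its own (assumed set)
ROUTINE_INTENTS = {"change_payment_details", "update_address"}

def route(intent):
    """Handle routine intents automatically; escalate the rest to a human."""
    if intent in ROUTINE_INTENTS:
        return ("virtual_assistant", intent)
    return ("human_agent", "customer_retention")

route("change_payment_details")  # handled end-to-end by the assistant
route("discount_request")        # escalated to a human
```

In a deployed system the routine set wouldn't be hand-written: it would grow and shrink as outcome data shows which conversations automation genuinely handles well.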
Thanks to machine learning, the virtual assistant can listen in on Lily's call with the human agent and learn from that interaction. Software is already available that reviews chat transcripts and call recordings to analyse sentiment, allowing machine learning tools to learn what vocabulary and vocal qualities show that someone is becoming dissatisfied - and what types of response disarm the caller.
Human Plus AI in the Contact Centre
Perhaps in 20 years, we'll have natural, flowing voice conversations with AI agents in contact centres. In the meantime, AI will be crucial to the contact centre, but in a background role. It will draw on multiple data sources to anticipate customer and company needs, handle interactions on its own where possible, and provide in-call support where needed. Humans will still be there for when the data - or simple common sense - shows they do a better job. The future of AI in the contact centre is one where software tools make humans more efficient.