
How Artificial Intelligence in the Contact Center Will Work

This article was published on August 11, 2021

The contact center of the future will anticipate a customer’s inquiry and predict what they’ll want to talk about. It will even provide appropriate support throughout the interaction, all thanks to artificial intelligence (AI).


But this isn’t a story where computers replace humans. Instead, you can think of it as human-plus: AI will help humans—whether the customer or the contact center agent—to get more done in less time.

What will that mean in practice for the contact centers of the next 10 years? Let’s look at three of the roles AI will play:

  • anticipating needs: big data will help to predict customer needs
  • augmenting conversations: virtual assistants will provide instant help
  • automating where possible: valuable human agents will be saved for the interactions where they’re most needed

First, though, it’s worth diving beneath the hype and reminding ourselves of what we really mean by AI.

What Is AI, Anyway?

This isn’t the artificial intelligence of sci-fi: there won’t be conscious, thinking software in our contact centers anytime soon.

Instead, this is what computer scientists call soft AI. The tools we see in this type of AI give the impression of intelligence by drawing meaning from data, and they’re already in use:

  • big data: finding the patterns in large amounts of varied, fast-moving data
  • natural language processing: allowing computers to parse language as spoken and written by humans (as in Amazon’s Alexa)
  • machine learning: allowing computers to effectively program themselves by adapting to changing circumstances and data


Combined, these tools let us take resources that previously were of little value—such as hours of call recordings—and draw out knowledge that would otherwise be lost.

Let’s look at how they’ll make the future contact center more effective.

AI Anticipating Customer Needs

It’s late on a freezing Saturday night in 2022. Lily is having car trouble and calls her roadside assistance service.

Even before the call is answered, the contact center’s AI judges it to be urgent. It makes that determination in a few milliseconds by weighing the context of the call:

  • The caller ID matched her account.
  • She has been a customer for 10 years but hasn’t called the rescue line once.
  • Other customers with profiles similar to Lily’s call only when they really need help.
  • The weather in Lily’s home city is cold enough that she could risk hypothermia if she is stuck without heat for too long.


The AI puts Lily at the top of the queue; her call will be answered next. It then finds out how long she’ll have to wait for help (assuming she is in her home city) and shows that to the agent when they answer her call.

In this case, the AI used the context of Lily’s call to judge its purpose and urgency, then routed her call appropriately. While it’s not a leap to assume someone calling late at night from an icy city might need urgent help, non-obvious patterns will be revealed in both public and private sources of data. Machine-learning tools will then anticipate how best to respond when they see those patterns unfolding. Everything from staffing levels to the best promotions to run to the type of interaction a customer prefers will be set by software trained through machine learning.
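
To make that concrete, here is a minimal Python sketch of how such context signals might be combined into an urgency score. Every field name, weight, and threshold below is a hypothetical illustration, not a description of any vendor’s system; a production router would learn its weights from historical outcomes rather than hand-set them.

     from dataclasses import dataclass

     @dataclass
     class CallContext:
         """Context signals available before the call is answered (illustrative)."""
         is_known_customer: bool    # caller ID matched to an account
         tenure_years: float        # how long they have been a customer
         prior_rescue_calls: int    # how often they have used the rescue line
         peer_urgency_rate: float   # 0-1: how often similar profiles call with real emergencies
         local_temp_celsius: float  # current weather in the customer's home city

     def urgency_score(ctx: CallContext) -> float:
         """Combine context signals into a 0-1 urgency estimate."""
         score = 0.0
         if ctx.is_known_customer:
             score += 0.1
         # A long-tenured customer who has never called before is unlikely to call casually.
         if ctx.tenure_years >= 5 and ctx.prior_rescue_calls == 0:
             score += 0.3
         # Behavior of similar customer profiles, learned from past calls.
         score += 0.4 * ctx.peer_urgency_rate
         # Dangerous weather raises the stakes of being stranded.
         if ctx.local_temp_celsius <= 0:
             score += 0.2
         return min(score, 1.0)

     # Lily's call scores high enough to jump the queue.
     lily = CallContext(True, 10, 0, 0.9, -8.0)
     print(round(urgency_score(lily), 2))  # 0.96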

AI Augmenting Conversations

The contact center agent answers Lily’s call. Lily explains that she is downtown in her home city, where she hit a patch of ice, skidded into the curb, and buckled her wheel. She needs to be towed home.

As Lily speaks, the agent’s screen updates with a map of the area where Lily is stranded, along with live locations of roadside assistance trucks nearby. When Lily says she thinks she needs to be towed, the nearest tow truck is highlighted. All of this happens without explicit instruction from the agent. Instead, a virtual assistant is listening to the call and uses natural language processing to pull out key terms.
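
As a rough sketch of that trigger-spotting step, assume speech has already been transcribed to text. The keyword table and action names below are hypothetical; a production assistant would use a trained intent model rather than simple substring rules.

     # Map key phrases heard on the call to actions on the agent's screen.
     TRIGGERS = {
         "towed": "highlight_nearest_tow_truck",
         "tow truck": "highlight_nearest_tow_truck",
         "flat tire": "highlight_nearest_repair_van",
         "not sure where": "offer_landmark_photo",
     }

     def actions_for_utterance(utterance: str) -> list[str]:
         """Return the screen actions triggered by key terms in one transcribed utterance."""
         text = utterance.lower()
         return [action for phrase, action in TRIGGERS.items() if phrase in text]

     print(actions_for_utterance("I hit some ice and I think I need to be towed"))
     # ['highlight_nearest_tow_truck']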

Lily knows which street she is on, but she is uncertain precisely where on it. The contact center software hears this and displays a message on the agent’s screen:

     Send customer photo of local landmark?

Without explicitly responding to the software’s question, the agent tells Lily that she is sending her a text message with a photo of a local landmark and asks Lily to let her know if she sees it. With that prompt, the AI sends Lily the photo and waits to hear what she tells the agent. Lily sees the building, and the AI updates her location accordingly.

The agent says that she’ll send a tow truck. The virtual assistant hears this and sends details of the job directly to the tow truck driver, then displays an estimated wait time on the agent’s screen.


In this case, Lily got to speak to a reassuring human who dealt sympathetically with a potentially scary situation. The contact center agent played to her strengths as a human: she could reassure Lily and draw the right information out of her. The virtual assistant was able to act on that information by listening for appropriate triggers.

This technology is already available. Vonage’s communications API platform works with IBM Watson to provide just this type of virtual in-call assistant. In the contact center of the next few years, it will be commonplace.

Automation Where Possible

Later that year, Lily wants to change her payment details. She texts the customer service number for her roadside assistance provider:

     I want to change my payment details.

Almost immediately, she receives a reply:

     No problem, Lily. We’ll call you in a few moments to confirm this.

Lily’s phone rings and a virtual assistant greets her. It asks her to confirm that she wants to make the change. Using voiceprint analysis, the virtual assistant verifies Lily’s identity and then updates her payment details as requested. It asks if there’s anything else she wants help with. In fact, there is: Lily wants to know if she can get a discount on her annual fee. The virtual assistant asks Lily to wait while it connects her to someone who can help. Shortly after, Lily is speaking to a human agent on the customer retention team.

This is where the human-plus model really comes into its own. For a routine change of payment method—especially one initiated by text message—a human agent wasn’t necessary. Instead, a virtual assistant seamlessly handled the conversation via SMS and a voice call. However, when it came to a question it couldn’t handle—or where data showed that a human interaction had better outcomes—it brought in a human agent.
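
Sketched in Python, that routing decision might look like the following. The intent names, confidence threshold, and outcome statistics are all assumptions, standing in for values a real system would learn from its own interaction history.

     # Intents the virtual assistant is trusted to complete on its own (hypothetical).
     AUTOMATED_INTENTS = {"change_payment_details", "update_address", "check_cover_level"}

     def route(intent: str, confidence: float, human_outcome_lift: float) -> str:
         """Decide whether the virtual assistant keeps the conversation.

         human_outcome_lift: how much better (0-1) human agents have historically
         performed on this intent, measured from past interactions.
         """
         if intent not in AUTOMATED_INTENTS:
             return "escalate_to_human"   # e.g., a discount negotiation
         if confidence < 0.8:
             return "escalate_to_human"   # don't guess on a low-confidence parse
         if human_outcome_lift > 0.2:
             return "escalate_to_human"   # data shows a human does markedly better
         return "handle_automatically"

     print(route("change_payment_details", 0.95, 0.02))  # handle_automatically
     print(route("request_discount", 0.99, 0.40))        # escalate_to_human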

Thanks to machine learning, the virtual assistant can even listen in on Lily’s call with the human agent and learn from that interaction. Software is already available that reviews chat transcripts and call recordings to analyze sentiment and pinpoint the moment when, for example, a customer lost their temper. By analyzing thousands of calls and transcripts, a machine-learning tool could learn which vocabulary and vocal qualities show that someone is becoming dissatisfied, and which types of response disarm the caller.
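
As one illustration of that analysis, the short sketch below scores each customer turn with NLTK’s VADER sentiment analyzer (a freely available lexicon-based model) and flags the turns where frustration likely boiled over. The transcript and threshold are invented for the example.

     # Requires: pip install nltk, then nltk.download("vader_lexicon") once.
     from nltk.sentiment import SentimentIntensityAnalyzer

     def flag_frustration(turns: list[str], threshold: float = -0.5) -> list[int]:
         """Return indices of customer turns whose compound sentiment falls
         below the threshold -- candidate 'lost their temper' moments."""
         sia = SentimentIntensityAnalyzer()
         return [i for i, turn in enumerate(turns)
                 if sia.polarity_scores(turn)["compound"] < threshold]

     transcript = [
         "Hi, I'd like to check on my refund.",
         "I was told it would arrive last week.",
         "This is ridiculous, I've been waiting a month and nobody is helping!",
     ]
     print(flag_frustration(transcript))  # likely [2]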

Human Plus AI in the Contact Center

So, perhaps in 20 years, we’ll have natural, flowing voice conversations with AI agents in contact centers. The immediate future, though, is just as exciting; the difference is that the role of AI in the contact center during the coming decade will be far less visible to the end customer.

In the coming years, AI will be crucial to the contact center but in much more of a background role. It will draw on multiple data sources to anticipate customer and company needs, handle interactions on its own where possible, and provide in-call support where needed.

Humans will still be there for when the data, or simple common sense, shows they do a better job. So the future of AI in the contact center is one where software tools make humans more efficient, much as they have for the past 60 years.

Written by Vonage Staff