
Decoding Dialogflow: Routing to a Human, Testing, Going Live

Passing an interaction from an intelligent virtual agent to a live agent

Image: KelCor
Last week I had an unusual interaction with a voice-enabled contact center IVR while helping my mother sign up for some supplemental medical coverage. After we called the health insurance provider’s customer support line, a bot greeted us and began asking for her account number and other identifying information. However, because Mom is hard of hearing, it would take her a few seconds to process each question and begin a response. The delay caused the bot to time out and repeat the question, so Mom was effectively talking over the bot. This confused the bot’s speech-to-text capability, and it could never capture her account number. The bot then asked for other identifying information, all with the same result. Finally, its automated messaging asked us to press zero to talk to a human.
 
This example of a poor experience interfacing with a speech-enabled bot brings two questions to mind:
  1. When should a bot hand off a call to a human?
  2. What best practices should developers follow to ensure the bot works as designed?
In this article, the tenth in a series, we’ll examine these issues and provide some insights that are applicable not just for Google Dialogflow, but for any intelligent virtual agent (IVA) design and development effort.
 
Virtual Agent Handoff to a Live Agent
When should an IVA hand off a call to a live agent? Well, this question has many answers. One response would be to follow an IVR approach: If the caller doesn’t make a valid choice after a certain number of attempts, then a business rule kicks in. For some organizations, the business rule simply states that if a caller doesn’t make an applicable choice, disconnect. More customer-friendly systems tell the caller to wait on hold while they summon a live agent.
 
In the IVA world, there are similar rules. These rules usually involve trying to use different phrases to extract intent and entity information from the caller: Multiple ways of querying the user for information are coded into the system, and if the IVA cannot identify the intent and/or the entity information associated with that intent, it asks the question again in a different way.
 
But what if the system still cannot determine the intent or an entity? There are multiple ways of handling the situation:
  1. In an IVA that runs independently of live agents, the only option is to invoke a fallback intent, which essentially restarts the query. This is usually not a good outcome if the user is already having problems, but for the many Dialogflow customers that build virtual agents outside of a contact center environment with live agents, it is the only recourse.
  2. One IVA provider in our industry, Interactions, has a unique “human-assisted understanding” capability in which the system sends the snippet of conversation the IVA didn’t understand to a live agent. The live agent listens only to this short phrase, figures out the intent or entity information, and then returns control to the IVA. In this scenario, Interactions’ system then automatically feeds the phrase and its associated intent or entity data back into the machine learning algorithm for use in training the IVA. Because these agents hear only audio snippets a few seconds long, they can respond in real time.
  3. The IVA can route the call to a live agent. Just as in an IVR system, the IVA prompts the user for information, and once the maximum number of tries is reached, the call goes to a live agent (a minimal sketch of this logic follows the list).
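For teams building their own escalation logic, the sketch below shows one way a Dialogflow ES fulfillment webhook might count consecutive fallback hits and trigger a transfer. The context name, the three-attempt limit, and the ESCALATE_TO_AGENT follow-up event are illustrative assumptions; how the transfer signal is actually consumed depends on the telephony or contact center integration in use.

```python
# A minimal sketch of a Dialogflow ES fulfillment handler that escalates to a
# live agent after repeated fallbacks. The context name, attempt limit, and
# escalation event are illustrative assumptions, not part of Dialogflow itself.

MAX_FALLBACK_ATTEMPTS = 3  # assumed business rule

def handle_webhook(request_json: dict) -> dict:
    query = request_json.get("queryResult", {})
    intent_name = query.get("intent", {}).get("displayName", "")
    session = request_json.get("session", "")

    # Read the running fallback count from an output context, if present.
    count = 0
    for ctx in query.get("outputContexts", []):
        if ctx.get("name", "").endswith("/contexts/fallback-tracking"):
            count = int(ctx.get("parameters", {}).get("fallback_count", 0))

    if intent_name == "Default Fallback Intent":
        count += 1
        if count >= MAX_FALLBACK_ATTEMPTS:
            # Signal the surrounding telephony/contact center platform to
            # transfer the call; how that event is consumed is integration-specific.
            return {
                "fulfillmentText": ("I'm having trouble understanding. "
                                    "Let me connect you with an agent who can help."),
                "followupEventInput": {"name": "ESCALATE_TO_AGENT"},
            }
        # Re-prompt and carry the updated count forward in the context.
        return {
            "fulfillmentText": "Sorry, I didn't catch that. Could you rephrase?",
            "outputContexts": [{
                "name": f"{session}/contexts/fallback-tracking",
                "lifespanCount": 5,
                "parameters": {"fallback_count": count},
            }],
        }

    # A recognized intent lets the tracking context expire, resetting the counter.
    return {"fulfillmentText": query.get("fulfillmentText", "")}
```

The design choice worth noting is that the attempt count rides along in an output context, so it survives across turns without any external session store.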
Hybrid IVA-Human Systems
When an organization deploys an IVA alongside live agents, the entire solution can be thought of as a hybrid system, and a few guidelines help in deciding when and how to hand off a call from the IVA to a live agent. By the way, if you want callers to know that they have the option of being routed to a live agent during the call, it is good practice to tell them early in the dialog that this capability is available; the welcome message is a good place for this announcement.
 
The first of these guidelines concerns the IVA’s scope and the complexity of the intent. The idea is that the system needs to try to identify the user’s intent, and if that intent is outside the scope of what the IVA is programmed to handle, or if fulfilling it is complex, the call may be handed off to a live agent. A related consideration is how valuable the caller’s business is and/or how critical the task is to the organization that owns the IVA. If the system identifies the user as someone whose business justifies a live agent, or if the task to be completed is critical to the company’s operations, the call could be routed to a live agent. Clearly, as a default, if the IVA can’t figure out the person’s intent after repeated query attempts, the call may be routed to a live agent.
 
A few other reasons to hand off calls, per the organization’s business rules, are when the caller simply asks to talk to a live agent or when the system detects that the caller’s sentiment is negative or angry. People can express empathy and compassion far better than an IVA can, and a live agent may save a situation by intervening when a caller expresses negative sentiment.
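Taken together, these guidelines amount to a handful of business rules that can be evaluated on every turn. The sketch below is one hypothetical way to encode them; the thresholds, the out-of-scope intent names, and the notion of a “high-value” caller are assumptions that would be tuned for each deployment (the sentiment check assumes the -1 to 1 score Dialogflow can return when sentiment analysis is enabled).

```python
# A hypothetical set of handoff rules for a hybrid IVA-human deployment.
# Intent names, thresholds, and caller attributes are illustrative assumptions.

OUT_OF_SCOPE_INTENTS = {"billing.dispute", "legal.question"}  # assumed examples
MAX_RETRIES = 3
NEGATIVE_SENTIMENT_THRESHOLD = -0.5  # Dialogflow sentiment scores range from -1 to 1

def should_hand_off(intent: str,
                    confidence: float,
                    retries: int,
                    sentiment_score: float,
                    caller_is_high_value: bool,
                    caller_asked_for_agent: bool) -> bool:
    """Return True if the call should be routed to a live agent."""
    if caller_asked_for_agent:
        return True                      # the caller explicitly asked for a human
    if intent in OUT_OF_SCOPE_INTENTS:
        return True                      # the task is outside the IVA's scope
    if retries >= MAX_RETRIES or confidence < 0.4:
        return True                      # the IVA can't determine the intent
    if sentiment_score <= NEGATIVE_SENTIMENT_THRESHOLD:
        return True                      # the caller sounds frustrated or angry
    if caller_is_high_value:
        return True                      # business rule: this caller justifies an agent
    return False
```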
A simple example of a hybrid IVA-human customer engagement workflow

Image: botcore.ai with KelCor modifications
 
Next, when a call needs to be handed off from an IVA to a live agent, it is helpful to tell the user explicitly during the handoff that the call is being routed to an agent and why. It is also important to be able to trace back to the point in the conversation where the handoff became necessary. This allows the IVA design team to identify problems in the IVA workflow and redesign that part of the IVA so that people don’t need to be transferred to a live agent.
 
While a customer is waiting for the live agent to pick up the call, it is helpful to give the caller information about the length of the queue as well as other contact options, such as a callback later or switching to email or an SMS/chat exchange. Basically, keeping the customer informed about what is happening is good practice for reducing frustration and call abandonment.
 
Before picking up the call, the agent should be presented with the entire transcript of the interaction, and the agent needs to take the time to read it before responding. This eliminates the need to ask for information the caller has already provided, which shortens the interaction while reducing caller frustration.
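To illustrate the last two points, the hypothetical handoff payload below carries the reason for the transfer, the turn at which the handoff was triggered (so the design team can trace it back later), and the full transcript for the receiving agent. The structure is purely illustrative; the real schema is dictated by the contact center platform receiving the call.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Purely illustrative data structures for an IVA-to-agent handoff payload;
# the receiving contact center platform defines the real schema.

@dataclass
class Turn:
    speaker: str          # "caller" or "iva"
    text: str
    intent: str = ""      # intent detected for caller turns, if any

@dataclass
class HandoffPayload:
    session_id: str
    reason: str                      # e.g. "max_retries", "negative_sentiment"
    triggered_at_turn: int           # index into the transcript, for later analysis
    transcript: List[Turn] = field(default_factory=list)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Example: what the agent desktop would receive alongside the transferred call.
payload = HandoffPayload(
    session_id="session-1234",
    reason="max_retries",
    triggered_at_turn=6,
    transcript=[
        Turn("iva", "How can I help you today?"),
        Turn("caller", "I need to check my claim status", intent="claims.status"),
    ],
)
```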
 
In some customer engagement scenarios, the live agent will handle part of a call flow, and then the caller will be returned to the IVA. An example may be gathering Net Promoter Score data through a survey. If the user is transitioned back to the IVA, the system needs to tell the caller that the live agent is no longer connected, and that the caller is once again interacting with the IVA.
 
Following these conceptually simple steps will make transitioning between the IVA and the live agent easier while providing a better experience for both the caller and the live agent.
 
Testing the IVA
There are two main parts to building (and testing) an IVA:
  1. Developing the conversational AI portion, which includes the dialog, queries, text-to-speech, responses, speech-to-text, and persona associated with the virtual agent
  2. Developing the intent fulfillment portion of the IVA
Testing the conversational AI portion of the IVA can be divided between human testing and automated testing. Human, or live, testing is done to avoid bias by having people test the IVA who didn’t help develop the virtual agent and who don’t know the training phrases. Live testing will give you insight into how the IVA is performing and how naturally the conversation flows. It can also be helpful for determining how well the bot transitions to the live agent. Going through these flows and tests with real people unassociated with the project’s development is a best practice.
 
Testing the natural language understanding portion of the IVA can also be partly automated. Dialogflow has an API call that accepts a user query, either as an audio file or as a text string, and returns the intent and any entity values in a JSON object. Using this method to test the IVA assures developers that Dialogflow’s natural language understanding is interpreting the test queries properly.
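A minimal sketch of such a test using the google-cloud-dialogflow Python client (Dialogflow ES, text queries) appears below. The project ID, test phrases, expected intent names, and entity values are placeholders; in practice the test set would be version-controlled alongside the agent and run on every change.

```python
# A minimal sketch of an automated NLU regression test against Dialogflow ES.
# The project ID, test phrases, and expected intents below are placeholders.
import uuid
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

PROJECT_ID = "my-dialogflow-project"  # placeholder

# (query text, expected intent display name, expected entity values)
TEST_CASES = [
    ("I want to check my claim status", "claims.status", {}),
    ("My member number is 123456", "provide.member_number", {"member_number": "123456"}),
]

def detect_intent_text(text: str, session_id: str):
    client = dialogflow.SessionsClient()
    session = client.session_path(PROJECT_ID, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result

def run_tests():
    for text, expected_intent, expected_params in TEST_CASES:
        result = detect_intent_text(text, session_id=str(uuid.uuid4()))
        assert result.intent.display_name == expected_intent, (
            f"'{text}' matched '{result.intent.display_name}', "
            f"expected '{expected_intent}'"
        )
        for key, value in expected_params.items():
            assert str(result.parameters[key]) == value, f"entity '{key}' mismatch"
        print(f"OK: '{text}' -> {expected_intent} "
              f"(confidence {result.intent_detection_confidence:.2f})")

if __name__ == "__main__":
    run_tests()
```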
 
Testing the intent fulfillment portion of the IVA can also be automated, because it is like other software for which tests are developed and saved. Dialogflow allows developers to pass an intent and any associated entity values to the test engine, execute any webhook functions, and verify the return values.
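One way to do this is to post a canned Dialogflow webhook request directly to the fulfillment endpoint and assert on the response, independent of the NLU layer. In the sketch below, the endpoint URL, intent name, and parameters are assumptions for illustration.

```python
# A sketch of a fulfillment test that posts a canned Dialogflow ES webhook
# request to the fulfillment endpoint and checks the response. The URL,
# intent name, and parameters are illustrative assumptions.
import requests

WEBHOOK_URL = "https://example.com/dialogflow-webhook"  # placeholder

def test_claim_status_fulfillment():
    fake_request = {
        "session": "projects/my-project/agent/sessions/test-session",
        "queryResult": {
            "queryText": "What's the status of claim 98765?",
            "intent": {"displayName": "claims.status"},
            "parameters": {"claim_number": "98765"},
            "outputContexts": [],
        },
    }
    resp = requests.post(WEBHOOK_URL, json=fake_request, timeout=10)
    resp.raise_for_status()
    body = resp.json()

    # The webhook should answer with text that mentions the claim number.
    assert "98765" in body.get("fulfillmentText", ""), body

if __name__ == "__main__":
    test_claim_status_fulfillment()
    print("fulfillment test passed")
```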
 
Google also suggests doing load testing and spike testing to ensure the entire IVA can handle continuous heavy loads as well as load spikes. This is less about the Google Cloud portions of the IVA and more about the items related to the limited compute and memory of the webhook functions, quota restrictions from providers, slow data reads/writes, and concurrency issues in the code.
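A rough spike test can be improvised by firing a burst of concurrent requests at the fulfillment endpoint and watching error rates and latency percentiles; a dedicated load testing tool is the better long-term choice, but even a sketch like the one below (with a placeholder URL and payload) will surface webhook timeouts, quota errors, and concurrency bugs.

```python
# A rough spike-test sketch: send a burst of concurrent requests to the
# fulfillment webhook and report failures and latency. URL and payload are
# placeholders; use a dedicated load testing tool for real runs.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

WEBHOOK_URL = "https://example.com/dialogflow-webhook"  # placeholder
PAYLOAD = {"queryResult": {"intent": {"displayName": "claims.status"},
                           "parameters": {"claim_number": "98765"}}}
CONCURRENCY = 50
REQUESTS_TOTAL = 500

def one_request(_):
    start = time.perf_counter()
    try:
        resp = requests.post(WEBHOOK_URL, json=PAYLOAD, timeout=10)
        ok = resp.status_code == 200
    except requests.RequestException:
        ok = False
    return ok, time.perf_counter() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        results = list(pool.map(one_request, range(REQUESTS_TOTAL)))
    latencies = sorted(t for _, t in results)
    failures = sum(1 for ok, _ in results if not ok)
    print(f"failures: {failures}/{REQUESTS_TOTAL}")
    print(f"p50 latency: {latencies[len(latencies) // 2] * 1000:.0f} ms")
    print(f"p95 latency: {latencies[int(len(latencies) * 0.95)] * 1000:.0f} ms")
```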
 
You will also want to make sure you test the IVA on all platforms you’re targeting. If you’re using rich messaging in your bot responses (images, speech, cards), test it in the contexts and on the devices where the IVA is expected to operate.
 
What’s Next
The next article in this series will appear in early December, with a focus on internationalizing IVAs. It will also include a description of Avaya’s CCAI partner interface.


To learn more about AI's role in the contact center, join us at Enterprise Connect 2020, where we'll be featuring sessions on Contact Center & Customer Experience as well as Practical AI. Our Advance Rate is still open! Register today using the code NOJITTER to save an additional $200 off this rate.