What if your chatbot can’t read minds?

2021.09.08

Wisia Neo

PERSPECTIVES

5 features to improve your chatbot fallback function when it can’t answer a query.

 

If you are a chatbot builder or administrator, you will have realised that anticipating your customers’ intentions is about as difficult as reading your spouse’s mind. In fact, misunderstanding humans is often cited as the most common problem a chatbot will face.

While you can pre-train your chatbot with a long list of answers to anticipated questions, its coverage only goes as far as the time you are willing to spend. Trying to predict and incorporate every possible answer, or dialogue flow, simply isn’t realistic.

At the same time, chatbots are rapidly becoming the baseline standard in customer service and even in the modern workplace, as customers, employees and partners expect increasingly personalised and prompt assistance. It’s no longer a ‘good to have’, but a ‘must have’, to establish oneself within competitive markets.

 

“Sorry, I don’t understand your question,” and then what?

There’s a definite urgency to tackle these challenges. Leaving the conversation with an apologetic “sorry” simply won’t cut it. It not only leaves a lasting negative impression on customers, but also closes doors to possible lead generation, e-commerce transactions, enquiry resolution, even assistance during an emergency… and the list continues.

Chatbot: Sorry I couldn't catch that.

More often than not, a chatbot’s ability to answer questions effectively involves a higher order of intelligence that can disambiguate intentions and decipher nuances. This would require a strong semantic artificial intelligence (AI) core to successfully extract meaning from textual intents and concepts, through technologies like natural language understanding services, ontologies, contextual memory and more.

Though conceptually straightforward, these are difficult to deliver well. The lower-hanging fruit for many would be to develop a thorough chatbot fallback strategy that catches intents which fall through the cracks. So how can we teach chatbots to continue the conversation towards a meaningful end for the user with a better fallback strategy?

5 features to bulletproof your chatbot’s fallback function

Fallback features for your chatbot

1. Connect your chatbot to a search engine

When met with a question the chatbot has yet to be trained to answer, or a question that is simply too complex for the chatbot to answer, a fallback to a search engine becomes a tight safety net. This allows users to retrieve answers from the far larger content repository that the search engine is integrated with, such as SharePoint.

An example here shows how fluid and natural it can be when using Omnitive Converse and Omnitive Search in conjunction. Minimal to no effort is required on the user’s part, as the mechanism is configured on the back end.

Chatbot search fallback

As seen above, this ‘search fallback’ could also provide a variety of pre-curated or open-source rich media that can potentially address the queries a user has. This means that information in the form of documents, PDFs, and images can be retrieved and surfaced for the user to review.
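The routing logic behind such a search fallback can be sketched in a few lines. This is a minimal illustration, not Omnitive’s API: the intent classifier and document index below are toy stand-ins, and in production the search call would hit an engine integrated with a repository such as SharePoint.

```python
# Sketch of a search-engine fallback: when intent confidence is below a
# threshold, route the query to a search backend instead of the dialogue
# engine. All names here are illustrative stand-ins.

CONFIDENCE_THRESHOLD = 0.6

def classify_intent(query):
    """Toy intent classifier: returns (intent, confidence)."""
    known = {"opening hours": ("faq.hours", 0.92)}
    for phrase, result in known.items():
        if phrase in query.lower():
            return result
    return (None, 0.0)

# Toy document index standing in for a search-engine-backed repository.
DOCUMENTS = {
    "leave-policy.pdf": "annual leave entitlement and application steps",
    "expense-guide.docx": "how to file expense claims",
}

def search_fallback(query):
    """Return documents sharing at least one term with the query."""
    terms = set(query.lower().split())
    return [doc for doc, text in DOCUMENTS.items()
            if terms & set(text.split())]

def handle(query):
    intent, confidence = classify_intent(query)
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("intent", intent)
    return ("search", search_fallback(query))
```

A query the bot recognises resolves to an intent; anything below the confidence bar falls through to search instead of a dead-end apology.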

 

2. Redirect your user to an external site

A close second to the fallback mechanism via a search engine is to simply redirect users to another source of information. While the former feature has its merits, it requires an additional layer of intelligence through the integration of a semantic search engine that can parse human language in content like documents.

An easy fix for enterprises starting out on their chatbot fallback strategy is to implement a navigation button that guides users to relevant external sources of information. This is a trusted and well-accepted way to continue the conversation, and it requires no additional chatbot training.
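The redirect itself is usually just a message payload carrying a link button. The field names below follow the generic shape many chatbot platforms use, but are illustrative rather than any specific platform’s schema.

```python
# A minimal fallback response carrying a redirect button. Field names
# are illustrative, not a specific chatbot platform's schema.

def redirect_fallback(help_url):
    return {
        "text": "I couldn't find an answer, but this page may help:",
        "buttons": [
            {"title": "Open help centre", "type": "web_url", "url": help_url},
        ],
    }
```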

Chatbot redirect to external site
Source: HubSpot
3. Suggest possible queries similar to your user’s

When met with queries that are too vague or broad, another well-used option is for your chatbot to offer close suggestions that fit the query. User intents come in all shapes and sizes, despite being fundamentally similar. This is often the main challenge in getting your chatbot to understand queries.

For example, your chatbot may understand this:
“What are your opening hours on the weekends?”

But not this:
“Are you open at 6pm this coming Saturday evening?”

Based on the knowledge your chatbot is trained with and its interactions with past users, it can suggest intents it actually has answers to. Embedding these suggestions as clickable items means your user doesn’t have to spend extra seconds rehashing their initial query, friction that can lead to customer attrition.

The example below illustrates how to suggest answers without scrimping on the user experience. A quick read, and the user can simply click the question that best fits their query and retrieve the right response.

Chatbot suggest queries
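A simple version of this “did you mean” mechanism ranks the trained questions by word overlap with the user’s query and surfaces the closest matches as clickable options. A real system would use semantic similarity rather than token overlap; the plain version below just keeps the sketch self-contained.

```python
# Sketch of query suggestions: rank trained questions by word overlap
# with the user's query. The question list is illustrative.

TRAINED_QUESTIONS = [
    "What are your opening hours on the weekends?",
    "How do I reset my password?",
    "Where is your nearest branch?",
]

def suggest(query, limit=2):
    """Return up to `limit` trained questions closest to the query."""
    query_terms = set(query.lower().rstrip("?").split())
    scored = []
    for question in TRAINED_QUESTIONS:
        overlap = len(query_terms & set(question.lower().rstrip("?").split()))
        if overlap:
            scored.append((overlap, question))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [question for _, question in scored[:limit]]
```

With this in place, the weekend-opening query from the example above would surface the trained opening-hours question as a clickable suggestion instead of a dead end.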
4. Make it easy to contact a human customer service representative

No chatbot should be left without a hand-off to a human agent, especially if your chatbot is customer facing. Handling these unhandled requests or utterances can make or break a deal. This is particularly essential and well-used in service-intensive or time-critical industries like hospitality and e-commerce.

The key here is to make the hand-off as smooth as possible for your customer. Your customer should be able to easily trigger this hand-off option, and stay within the same interface if continuing the conversation with an agent via chat. At the same time, the agent should be provided with the full conversation history and context to pick up the conversation seamlessly.

Chatbot live handoff

Live chat and help desk software can be easily integrated with your chatbot provider as a partner solution. Often available at a low subscription cost, this simple remedy gives your chatbot fallback strategy a reliable human layer in the interim, while the chatbot gradually improves through learning and training over time.
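The crucial detail in a hand-off is passing the transcript along so the customer never repeats themselves. The sketch below builds the payload an agent might receive; the structure is hypothetical, and in production the final call would go to your live-chat provider’s API.

```python
# Sketch of a human hand-off that preserves conversation context.
# The payload shape is illustrative, not a specific provider's API.

class Conversation:
    def __init__(self, user_id):
        self.user_id = user_id
        self.history = []  # (speaker, message) pairs

    def add(self, speaker, message):
        self.history.append((speaker, message))

def hand_off(conversation):
    """Build the context package a human agent would receive."""
    return {
        "user_id": conversation.user_id,
        "transcript": list(conversation.history),
        "context_note": "Escalated after chatbot fallback",
    }
```

Because the transcript travels with the escalation, the agent can read the failed exchange and respond in the same chat window without asking the customer to start over.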

 

5. Collect information for offline follow ups

The final safety net, if enterprises are unable to offer a live follow-up, is to capture the customer’s contact details before redirecting them to a hotline or email address.

This chatbot fallback strategy uses the chatbot as a lead qualifier, helping to answer routine questions without fatiguing your agents or sales reps. Automating this critical lead-capturing phase helps your team rapidly collect prospects’ details without sacrificing productivity.

Chatbot collect information
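At its simplest, this lead capture is a slot-filling loop: ask for each missing detail in turn, then hand the completed record to a follow-up queue. The slot names below are illustrative assumptions, not a prescribed schema.

```python
# Sketch of offline lead capture via slot filling. Slot names are
# illustrative; a real bot would also validate answers (e.g. email).

LEAD_SLOTS = ["name", "email", "enquiry"]

def next_prompt(lead):
    """Return the next question to ask, or None when the lead is complete."""
    for slot in LEAD_SLOTS:
        if slot not in lead:
            return f"Could I have your {slot}, please?"
    return None

def capture(answers):
    """Fill slots in order from the user's answers."""
    lead = {}
    for slot, answer in zip(LEAD_SLOTS, answers):
        lead[slot] = answer
    return lead
```

Once `next_prompt` returns `None`, the completed lead can be forwarded to the sales team for an offline follow-up.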

 

The bottom line for chatbot fallbacks: an opportunity to add value in the user journey

While reading a customer’s mind is near impossible, being authentic, acknowledging shortcomings and providing alternatives are the best ways to cushion a conversation into a proper closure. Having a defined chatbot fallback strategy in place does this and turns conversations into an opportunity to connect and add value to the customer.

About TAIGER’s Omnitive Converse and Omnitive Search

Omnitive Converse combined with Omnitive Search is TAIGER’s answer to the industry’s chatbot fallback requirements. Today, Omnitive Converse is deployed in multiple core government projects across verticals, a testament to TAIGER’s capability to deal with increasing information complexity and variability of user requirements.

Speak with one of our solutions managers today on how Omnitive Converse with Search capabilities can enhance your internal and external customer experiences.
