The hype surrounding chatbots has led many businesses to deploy bots without considering the fundamental requirements for success.
We talked to Susannah Richardson, marketing director at IFS | mplsystems, to establish why, just like humans, bots need proper training to excel at their job.
Like it or not, when presenting a support query to a brand, most consumers don’t put the value of human interaction above their need to get a problem solved.
According to a PwC report last year, 64% of consumers said they would rather have instant access to quality customer service than preserve the jobs of customer service reps.
This simple fact is precisely why the interest in chatbots has increased 10-fold in the last five years alone.
Yet the reality is that, to date, most bot deployments have failed to meet this critical need. Despite a consumer survey by Survata finding that 67% of people had come into direct contact with a bot during a support query in 2017, further research by Nuance revealed that fewer than one in three (30%) consumers were confident in a chatbot’s ability to successfully help them when it comes to customer service.
“Reduce the pressure on the contact centre and offer a great experience – if a chatbot can’t do this then both of those objectives are instantly eradicated and you end up looking stupid,” says Richardson.
“Businesses understand they need to introduce better self-service and artificial intelligence such as chatbots. One of the problems is businesses are saying ‘we need to do a pilot’ in customer service, but it’s an isolated thing rather than looking at customer support, the contact centre and how artificial intelligence could be best deployed to give them quick wins. They’re not looking at it as an iterative process.”
Marketing getting in the way?
One key dilemma is that many chatbot projects are currently assigned to marketing departments for deployment, due to their online presence and positioning. Yet, as Richardson states, their delivery is generally grounded in contact centre process.
“Typically, one of the problems is that marketing have the budget. Marketing look after the website and they have the budget for digital self-service, so if you’re trying to put AI into a self-service component it would be marketing that own it rather than customer service.
“The issue here is they’re being conceived by the marketing team and put out on their website, but if customers get stuck there’s no connection up with the service and support team. If positioned with the customer service team, when a request is too complex for the chatbot, it should seamlessly transfer the conversation to an appropriately skilled human agent, blurring the boundary between automated and agent-assisted care. The experience is good for the customer because they’ll still get their question answered.”
If a chatbot doesn’t know the answer to a question, it should seamlessly transfer the conversation to a real agent
And as most research highlights, it’s predominantly customer service where consumers see chatbots providing the most benefit.
Of course, at present, a chatbot doesn’t usually know the answer to a question on its own.
Just last year, stats revealed that chatbots on Facebook Messenger failed to answer queries 70% of the time. The result has been a massive scaling back in brands using Messenger as a platform for chatbots.
The issue is entirely tied up in linguistics and natural language processing, something scientists and AI specialists have grappled with since the Turing test and, from 1991 onwards, through the Loebner Prize.
Much like Malcolm Gladwell’s assertion that it takes 10,000 hours of practice to become an expert in something, Richardson believes a chatbot needs to recognise around 300 different ways of asking any given question. In many cases, that amounts to the chatbot’s own 10,000 hours of training and development.
“If you look at manufacturing and service companies as examples – many already have chatbots. Vaillant, the boiler company, for instance, has a virtual assistant on its website. You can type in a question and it will try to answer it. The problem is, in a substantial number of cases, it can’t answer your question. It can answer standard questions, but if you word them differently, it isn’t trained to understand you. The customer gets frustrated and has to find a telephone support number to ring up to get the question resolved.
“Chatbots and voice assistants fail because they need to be fed with data. Many organisations don’t have the data to train the chatbot, so the chatbot is trying to learn on the job, but offers a poor experience whilst it learns.”
Having sufficient data to train bots and AI, whether text- or voice-based, is an ongoing problem. It’s something Amazon is committing millions of dollars to overcoming via its Alexa Prize (indeed, its 2017 competition prize stood at exactly one million dollars).
It is also something that most businesses have been surprisingly willing to forgo under the pressure to get their bot or virtual assistant out into a live environment.
Richardson has sympathy for brands in this respect. Training a bot can be a chicken-and-egg scenario: it needs a live environment in order to learn, but its early incompetence can leave consumers incensed. Yet there are some straightforward rules to follow to ease the pressure.
“You must start small,” Richardson adds. “Nothing good comes of over-complicating a bot’s requirements.
“Start with a narrow subset,” she adds. “It might be that you know 60% of the questions that come into your contact centre are of a certain subject. Focus on those and let the agents deal with the more complex stuff. But make sure the bot is connected up properly with the contact centre, otherwise you’ll run into problems. And make sure a customer knows when they’re dealing with a bot and when they’re being transferred to a human.”
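Richardson's "start small" rule can be illustrated with a minimal sketch: a bot that handles only a narrow set of known topics and explicitly hands everything else to a human agent. The topics, answers and keyword matching below are illustrative assumptions, not any vendor's actual implementation.

```python
# Sketch of a narrow-scope bot with explicit human handoff.
# Topics and canned answers are hypothetical examples.
KNOWN_TOPICS = {
    "billing": "You can view and pay invoices under Account > Billing.",
    "delivery": "Orders are dispatched within 2 working days.",
}

def handle_query(query: str) -> tuple[str, bool]:
    """Return (reply, handled_by_bot)."""
    text = query.lower()
    for topic, answer in KNOWN_TOPICS.items():
        if topic in text:
            return answer, True
    # Unknown question: transfer, and tell the customer they are
    # leaving the bot - per Richardson's advice on transparency.
    return ("I'm a chatbot and can't answer that yet - "
            "transferring you to a human agent.", False)
```

In a real deployment the keyword check would be replaced by an intent classifier, and the handoff would pass the full conversation transcript to the contact centre so the customer doesn't repeat themselves.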
Richardson also believes that the route to chatbot success may not involve deploying a chatbot at all, to begin with.
“One quick win is to use artificial intelligence at the front-end of calls – when a customer first calls, use voice recognition to establish what the enquiry is. At the moment if you call up, all enquiries go to an agent who then hands you off to a different department to deal with. But if you say ‘billing’ to voice recognition, you can get calls routed to the best expert to deal with your call and prioritised depending on the urgency.
“You can do the same for emails, webchat and social messaging. If a text comes in, you can interpret it with natural language processing and be routed accordingly. At the moment if you email a business your email will probably sit in an inbox waiting to be dealt with. With AI, you can establish its importance, who to route it to and how to resolve the problem.”
“It’s the intermediate goal before you do automated chat. You can capture all of that data which can be used to train a chatbot for later down the line. Even FAQs are an underused source of capturing data – it’s the easiest thing to start with prior to any potential bot deployment later down the line.”
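The intermediate step Richardson describes can be sketched in a few lines: classify an inbound message's intent, route it to the right queue, and log the (message, intent) pair as training data for a future chatbot. The intents, keywords and queue names here are illustrative assumptions standing in for a real NLP classifier.

```python
# Sketch of intent-based routing for inbound emails/webchat/messages.
# Intents and keywords are hypothetical; a production system would use
# a trained natural language classifier rather than keyword matching.
ROUTES = {
    "billing": ["invoice", "bill", "payment", "refund"],
    "technical": ["error", "broken", "crash", "not working"],
}

training_log = []  # (message, intent) pairs to train a chatbot later

def route_message(message: str) -> str:
    """Return the queue an inbound message should be routed to."""
    text = message.lower()
    for intent, keywords in ROUTES.items():
        if any(keyword in text for keyword in keywords):
            training_log.append((message, intent))
            return intent
    # Nothing matched: fall back to a human-triaged general queue,
    # still logging the message as future training data.
    training_log.append((message, "general"))
    return "general"
```

The logging side is the point Richardson makes: even before any bot goes live, routed messages accumulate into exactly the labelled data a chatbot needs.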
Given their very nature, chatbots and AI will continue to be an exercise of trial and error. But as highlighted by Facebook Messenger's bot failures, customers will quickly disengage from any experience that doesn't lead to the ultimate goal: getting their query resolved.
About Chris Ward
Chris is Editor of MyCustomer. He is a practised editor, having worked as a copywriter for creative agency Stranger Collective from 2009 to 2011, and subsequently as a journalist covering technology, marketing and customer service from 2011 to 2014 as editor of Business Cloud News. He joined MyCustomer in 2014.