Voice recognition + contextual data: The formula for fully automated interactions?

18th Aug 2014

Voice recognition has a bad rap when it comes to contact centres. Many of us have had the frustrating experience of attempting to book a cinema ticket by talking to a robot that seems determined to book us into the family musical when we want to see the latest Bruce Willis flick.

It’s fair enough; historically the technology used for voice recognition in contact centres has been terrible. But what is it that customers find so objectionable? Is it the experience of talking to a robot, or is it just the frustration when the technology doesn’t work as well as it should?

As the world’s most powerful technology brands invest in systems that overcome some of the traditional problems of voice recognition, the experience of talking to a device can be a delight. With Apple’s Siri, Google Now and Microsoft’s Kinect on the Xbox One we see voice recognition delivering fantastic customer experiences, a far cry from the traditional voice recognition systems that consumers encounter in the contact centre.   

If technology is the problem

Assuming, then, that it is the effectiveness of the technology that has always been consumers’ gripe with contact centre voice recognition, it seems we are now at a point where this can be largely overcome. The examples we see on our smartphones, tablets, games consoles, laptops and even in watches and eyewear show that sophisticated voice recognition that actually works is possible. Deploy this level of intelligent voice recognition in a contact centre environment, and the problem is solved.

Talking to a machine

But what if it is simply the experience of talking to a robot that consumers take issue with, regardless of how well the technology works? Will we be happy to forsake the human element of customer service interaction, even if the machine on the other end of the line now understands what we’re saying?

The growth of voice recognition in consumer technology may offer an answer here too. As your customers become more used to the idea of talking to a machine through their personal technology, it follows that they should become more open to interacting with an organisation in the same way.

Considering we now live in a world where a chatbot has, reportedly, passed the Turing test, we may not even need to worry about whether customers are happy to talk to a machine. They might not even realise they are.

The ‘miscellaneous’ queries

However, surely there are some interactions that a machine may not be equipped to deal with. We know that self-servers tend to pick up the phone only as a last resort: they have already attempted to tackle the issue they’re facing by searching online and going through the FAQs on your website, without reaching a satisfying conclusion. When they go through call routing, they need to speak to a customer service representative because their problem does not fit into any of the predetermined boxes.

In the age of self-service, a fundamental function of the contact centre is to tackle the ‘other’ and the ‘miscellaneous’. Surely it takes a real customer service agent, with a human being’s full understanding, to tackle these queries satisfactorily?

Data driven contextualisation

Perhaps, but consider for a moment how Google Now is able to process queries. If I pick up my phone and ask Google a question, the answer it serves me is not likely to be a one-size-fits-all response that it would have given to anyone; it’s a personalised response based on the multitude of data points Google holds about me: previous searches, online browsing behaviour, phone contacts, where I live, where I work, what I bought from Amazon yesterday and when the delivery’s coming. If I ask about football scores, it will tell me about my favourite team; if I ask about flight times, it will give me the status of the flight it has seen the confirmation for in my Gmail account.

With this kind of data, combined with sophisticated machine learning, it is possible to answer queries with the kind of personal and contextualised response that we would normally only expect from a human being.
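The idea above can be sketched in a few lines of code. This is a toy illustration only (the customer names, data points and queries are invented for the example, and no real vendor’s API is involved): a generic answer is served when nothing is known about the caller, but the same query gets a personalised, contextual response when customer data points are available.

```python
# Toy sketch of data-driven contextualisation (all data is invented):
# prefer an answer built from known customer data points, and fall back
# to a one-size-fits-all response when no context applies.

CONTEXT = {
    "alice": {"favourite_team": "Arsenal", "last_order": "headphones"},
    "bob": {"favourite_team": "Leeds United", "last_order": "kettle"},
}

GENERIC_ANSWERS = {
    "football scores": "Here are today's top-flight results.",
    "my delivery": "Please provide your order number.",
}

def answer(customer: str, query: str) -> str:
    """Return a contextual response where data allows, else a generic one."""
    ctx = CONTEXT.get(customer, {})
    if query == "football scores" and "favourite_team" in ctx:
        return f"Here is the latest score for {ctx['favourite_team']}."
    if query == "my delivery" and "last_order" in ctx:
        return f"Your {ctx['last_order']} order is out for delivery."
    # No usable context: the 'miscellaneous' fallback, or a human agent.
    return GENERIC_ANSWERS.get(query, "Let me put you through to an agent.")

print(answer("alice", "football scores"))  # personalised
print(answer("unknown", "my delivery"))    # generic fallback
```

A real system would of course replace the hand-written rules with machine learning over far richer data, but the shape of the interaction is the same: the more data points available, the fewer queries fall through to the generic (or human) fallback.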

It is no wonder, then, that it was Google’s chairman Eric Schmidt who warned earlier this year that many jobs we currently consider possible only for humans could in future be taken over by machines. Customer service professionals included.

Not everyone is Google

The problem here, of course, is that most organisations do not have the kind of data that the likes of Google, Facebook, Apple or Microsoft have about their customers. But it does pose an interesting idea about a plausible future.

Most businesses are focussing on gathering richer customer data; could a significant enough portion reach a level of sophistication whereby voice recognition combined with contextual data can allow most customer interactions to be automated?

Maybe. It is worth remembering that while the more complex customer issues might prove a sticking point, if the voice recognition technology works, the more straightforward queries that still make up the bulk of interactions should not be beyond the robots.

David Ford is managing director at Magnetic North.
