
Three chatbot fails and what they teach us about customer service

1st Feb 2019

Though they’ve come a long way in recent years, it’s fair to say that chatbots, like their human counterparts, are still not perfect. Sometimes technology gets it wrong.

Usually when this happens, it makes for an amusing anecdote, but as we become more accustomed to interacting with technology, our tolerance for poor exchanges is decreasing.

This is especially true when it comes to customer service chatbots. Companies have had more than enough time to test the waters and find out what works. If, in 2019, they’re still offering a bot that doesn’t truly help their customers, it can actually damage their reputation.

A recent Digitas survey found that 73% of customers would never use a chatbot again after one bad experience – so in today’s digital age, companies truly have one chance to get it right. But with more and more organisations deploying chatbots – Business Insider estimates that by 2020, 80% of companies will have a chatbot – the potential for things to go wrong is very real.

Let’s take a look at some entertaining examples of previous chatbot deployments that didn’t quite perform the way they were supposed to.

1. Siri

Starting with Siri, the chatbot we all know and love. The Apple invention that first launched in 2011 was designed to make our lives easier, but she doesn’t always do so.

Though she’s mainly used for entertainment purposes and is reliable enough when tasked with simple questions or basic tasks, she still struggles to derive meaning from what we’re asking if we’re not specific (sometimes, even when we are specific).
Conversational technology has advanced enough today that we’re able to interact with it the same way we’d interact with friends or family, so it’s frustrating when we can’t. Not to say it isn’t fun to spend time interacting with Siri, but with Amazon Alexa and Google Home gaining traction every day in the virtual personal assistant market, Siri will have to find a way to be a little more helpful if she wants to compete.

Want to experience the silly side of Siri? Ask her:

  • How old are you?
  • Is winter coming?
  • What is Inception about?
  • Why are fire trucks red?

2. Poncho

Another popular example of a chatbot not exactly living up to expectations is Poncho, the now defunct weather app. Poncho was packed with personality and made weather forecasting fun – until it didn’t.

Users eventually found that Poncho’s answers and attention span left a lot to be desired and that perhaps they were better off getting their weekend weather from a more traditional source.

In an interview with Gizmodo, Sam Mandel, CEO of Poncho, said that less than 24 hours after the bot launched, they ran into unanticipated problems and that fixes were needed fast, “because tolerance for a mediocre bot is much less than for a mediocre app”.

He was right, and Poncho was pulled after a few years of frustrating forecasts.


3. Tay

Perhaps the most famous incident of a chatbot gone bad is Microsoft’s Tay (though humanity might be more to blame than poor chatbot technology). Microsoft launched Tay in an effort to learn about natural language and artificial intelligence. The intent was to allow Tay to interact with Twitter users so she could develop a unique personality based on those interactions.

The more people chatted with Tay, the smarter she’d get – or so they thought. Of course, Twitter was very… Twitter about things, and within 24 hours Tay was pulled after becoming incredibly racist.

You can find some of Tay’s more colourful exchanges online if you’re so inclined, but be warned: it isn’t pretty.


Chatbot fails don’t have to be this grand to be considered problematic. Any time your chatbot can’t help a customer find the information they’re after, it’s failing to do its job.

This can happen for a number of reasons, but often it’s because organisations severely underestimate the complexity of enterprise-grade chatbots and what they need to succeed.

Replies (1)


Louis Smith
26th Feb 2019 12:32

That's for sure XD
