Designing a Chatbot: Making a Chatbot Sound Stupid (How to Test a Chatbot)

Is your chatbot designed for those curveballs that some notorious users may throw at it? Humans are… well… humans, and you can’t impose an HR policy on them saying:

“Abusing chatbots is a punishable offence.
You might hurt their self-confidence.”

There are usually 3 ways in which you can converse with a chatbot. Once you know which type of conversation a chatbot can handle, you know how to break the flow and make it sound stupid (haha).

One thing you need to know about the chat flow is that any bot will start by asking you a set of questions whose answers it needs as inputs to continue the conversation. Read more on how these inputs are usually designed.

Let’s start with the simplest type of conversation that a chatbot can handle:
Chatbot takes inputs in one sentence at a time
This is the simplest case, and most chatbots today are designed with this type of chat flow. Everything is expected and predictable; frankly, it is just like filling up a web form.

So, conversation with a hotel chatbot would go like this:

Chatbot: Hi. I can help you book a room at our hotel. When would you like to check in?

User: Hey. I will be checking in on 8th of March 2018
Chatbot: Cool. When would you like to check out?
User: 10th of March
Chatbot: This room would be for how many guests?
User: 2
Chatbot: Here are the 3 types of rooms we have available on those dates
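Under the hood, this flow is nothing more than a fixed list of questions asked in order. Here is a minimal sketch of that loop in Python; the slot names and prompts are made up for illustration, and a real platform would also validate each answer:

```python
# A minimal sketch of the "one input per sentence" flow, assuming the bot
# fully controls the order of questions. Slot names and prompts are
# invented for illustration.

SLOTS = [
    ("check_in",  "When would you like to check in?"),
    ("check_out", "When would you like to check out?"),
    ("guests",    "This room would be for how many guests?"),
]

def run_booking_flow():
    answers = {}
    print("Hi. I can help you book a room at our hotel.")
    for slot, prompt in SLOTS:
        # Each reply is assumed to answer exactly the question asked --
        # this is the assumption that breaks when users go off-script.
        answers[slot] = input(prompt + " ")
    print(f"Here are the room types available from "
          f"{answers['check_in']} to {answers['check_out']} "
          f"for {answers['guests']} guest(s).")

if __name__ == "__main__":
    run_booking_flow()
```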

How to break this chatbot:

Note that the chatbot has the privilege of starting the chat here and forcing the user into a predetermined chat flow, which eases a lot of things and makes the conversation predictable. But here are the cases in which the chatbot would break:

1. The user breaks the flow in the first sentence and says something like: “I just wish to know if you allow pets.”

2. The user changes the form of the answer. Instead of a number, the user says “me and my wife with son and daughter”, which simply means 4 guests but is something the chatbot may never understand (see the sketch after this list).

3. The user answers a question other than the one asked, going against the flow.
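For illustration, here is one way case 2 above could be handled: turning a phrase like “me and my wife with son and daughter” into a guest count. The keyword list is an assumption made for this sketch; a production bot would use a trained NLU model rather than keyword counting.

```python
# A hedged sketch of mapping people-phrases to a guest count.
# PERSON_WORDS is an invented, incomplete list for illustration only.
import re

PERSON_WORDS = {"me", "i", "wife", "husband", "son", "daughter",
                "kid", "child", "friend", "mother", "father"}

def guess_guest_count(text: str):
    # Prefer an explicit number if the user gave one, e.g. "2" or "4 guests".
    match = re.search(r"\b(\d+)\b", text)
    if match:
        return int(match.group(1))
    # Otherwise count words that refer to people.
    words = re.findall(r"[a-z]+", text.lower())
    count = sum(1 for w in words if w in PERSON_WORDS)
    return count or None  # None signals the bot should ask again

print(guess_guest_count("2"))                                     # 2
print(guess_guest_count("me and my wife with son and daughter"))  # 4
```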

Chatbot can accept all inputs in one single sentence

So, the user is a little impatient and crams everything into one sentence. This is an ideal conversation for a human but a complicated one for a chatbot to process. Here is how the chat would look:

Chatbot: Hi. I can help you book a room at our hotel. When would you like to check in?
User: Hey. I wish to book a room from 8th March till 10th March for 2 guests.
Chatbot: Cool. Here are the 3 types of rooms we have available on those dates
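One way a platform could pull all three inputs out of a single sentence is sketched below. This is only a happy-path illustration: the date and guest patterns are assumptions, and real platforms rely on trained entity extractors rather than regexes.

```python
# A minimal sketch of parsing all three inputs out of one sentence,
# assuming dates look like "8th March" or "10th of March" and the guest
# count appears as a bare number before "guest(s)".
import re

DATE_RE = r"(\d{1,2})(?:st|nd|rd|th)?\s+(?:of\s+)?([A-Z][a-z]+)"

def parse_booking(text: str):
    dates = re.findall(DATE_RE, text)
    guests = re.search(r"\b(\d+)\s+guests?\b", text)
    return {
        "check_in":  " ".join(dates[0]) if len(dates) > 0 else None,
        "check_out": " ".join(dates[1]) if len(dates) > 1 else None,
        "guests":    int(guests.group(1)) if guests else None,
    }

print(parse_booking("Hey. I wish to book a room from 8th March "
                    "till 10th March for 2 guests."))
# {'check_in': '8 March', 'check_out': '10 March', 'guests': 2}
```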

How to break this chatbot:

The above conversation is itself a curveball for a lot of chatbot development platforms, and most of them cannot make sense of that sentence at all. Still, here are conversations it may not be able to handle:

1. The user breaks the flow in the first sentence and says something like: “I just wish to know if you allow pets.”

2. The user changes the form of the answer. Instead of a number, the user says “me and my wife with son and daughter”, which simply means 4 guests but is something the chatbot may never understand.

That said, I am yet to see a chatbot that can deliver such a chat experience.

Chatbot asks for an input that the user missed

Here, the user does a good job of stating the requirement but misses something that is needed to perform the task. In such a case, the bot needs to follow up with a question to complete the transaction.

Here is how this would look:

Chatbot: Hi. I can help you book a room at our hotel. When would you like to check in?
User: Hey. Me and my wife will arrive on 8th March.
Chatbot: Cool. When would you be checking out?
User: 10th of March
Chatbot: Here are the 3 types of rooms we have available on those dates
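The pattern behind this flow is simple: extract whatever slots the opening message contains, then ask only for the ones that are still empty. Here is a rough sketch that assumes the slots have already been extracted, say by something like parse_booking() from the earlier snippet; the per-slot follow-up handling is deliberately simplified.

```python
# A sketch of the follow-up pattern: ask only for slots still missing.
# PROMPTS and the slot names are invented for illustration.

PROMPTS = {
    "check_in":  "When would you like to check in?",
    "check_out": "When would you be checking out?",
    "guests":    "This room would be for how many guests?",
}

def complete_booking(slots: dict):
    # slots maps slot name -> extracted value, or None if missing
    for name, prompt in PROMPTS.items():
        while slots.get(name) is None:
            reply = input(prompt + " ")
            # Simplified: store the raw reply. In practice you would run
            # the same NLU extraction on each follow-up answer.
            slots[name] = reply.strip() or None
    return slots

# e.g. the user opened with "Me and my wife will arrive on 8th March":
booking = complete_booking({"check_in": "8 March",
                            "check_out": None,
                            "guests": 2})
print(booking)
```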

How to break this chatbot:

Like the second conversation, this one is hard for many chat platforms to manage, but it is still prone to these disasters:

1. The user breaks the flow in the first sentence and says something like: “I just wish to know if you allow pets.”
2. The user changes the form of the answer. Instead of a number, the user says “me and my wife with son and daughter”, which simply means 4 guests but is something the chatbot may never understand.

Do you know a bot that did something stupid? Feel free to share the conversations or screenshots below.