In this video of Deconstructing Chatbots with Priyanka Vergadia, learn all the important terms you need to know to start building your own chatbot with DialogFlow.
What's important to note is that Tellephant and Aiyo Labs are part of the Google Cloud for Startups program and can help your business get up and running on Google Business Messages in no time, integrated with your own personalized chatbot.
A full transcript of the video is below. Enjoy!
PRIYANKA VERGADIA: Welcome to "Deconstructing Chatbots." I am Priyanka Vergadia. And in this episode, we will dissect the architecture of chatbots to better understand the building blocks.
In the last episode, we introduced DialogFlow, which is an end-to-end tool powered by natural language understanding to facilitate rich and natural conversations. Today let's look at a high-level architecture. DialogFlow sits in the middle of the stack. A user can interface with it via all the common channels, including text, websites, apps, messengers, and smart voice devices like Google Home.
DialogFlow handles the job of translating natural language into machine-readable data using machine learning models trained by your examples. Once it identifies what the user is talking about, it can hand this data to your back-end, where you can use it to make stuff happen. At the back-end, you can fulfil the request by integrating with your other services, databases, or even third-party tools like your CRM.
Now let's dig one layer deeper into the DialogFlow piece itself. We first create an agent within DialogFlow. An agent is basically your entire chatbot application-- the experience of collecting what the user is saying, mapping it to an intent, taking an action on it, and then providing the user with a response. And in your agent, it all starts with a trigger event called an utterance. This is how our users invoke the chatbot.
So if I say, "Hey Google, play some music," the whole sentence is the utterance, while the phrase "Hey Google" is the trigger. Let's take another example: "Hey Google, talk to Smart Scheduler." The phrase "talk to Smart Scheduler" is the invocation phrase for our chatbot, and Smart Scheduler here is the invocation name.
Once the bot is activated and has collected the user utterance, we need to understand what the user's intent is. Why do they want to talk to our bot? So when you say, "I want to set an appointment," setting an appointment is the intent. Or if you ask, "What are your hours of operation?" then hours of operation is the intent. To enable this, you provide DialogFlow with different examples of user intents, like "set an appointment," "hours of operation," and others. DialogFlow then trains a machine learning model with many more similar phrases and finally maps the user's phrase to the right intent. This process is called intent matching.
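To make the intent-matching idea concrete, here is a toy sketch in Python. It scores an utterance against each intent's training phrases by simple word overlap. The intent names and phrases are made up for illustration, and Dialogflow's actual ML model is far more sophisticated than this; the sketch only shows the shape of the mapping from phrase to intent.

```python
# Toy intent matcher: score a user phrase against each intent's
# training phrases by counting shared words, then pick the best intent.
# This stands in for (and greatly simplifies) Dialogflow's trained model.
TRAINING_PHRASES = {
    "set_appointment": [
        "set an appointment",
        "book an appointment",
        "schedule a meeting",
    ],
    "hours_of_operation": [
        "what are your hours of operation",
        "when are you open",
    ],
}

def match_intent(utterance: str) -> str:
    words = set(utterance.lower().split())
    def best_score(intent: str) -> int:
        # Best word-overlap score across that intent's training phrases.
        return max(len(words & set(p.split())) for p in TRAINING_PHRASES[intent])
    return max(TRAINING_PHRASES, key=best_score)

print(match_intent("I want to set an appointment"))  # -> set_appointment
print(match_intent("when are you open"))             # -> hours_of_operation
```

In real Dialogflow you would simply add training phrases to an intent in the console, and the service generalizes from them; no matching code is written by hand.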
Now that we know our user's intent, we need to know what to do to give them a response. For that, you configure actions and parameters to define the variables you want to collect and store. Let's look at an example.
"Set an appointment for 5:00 AM tomorrow." When a user says that, 5:00 AM and tomorrow are the two critical pieces of information in that statement that we would actually need to book an appointment. Those variables are defined as entities. DialogFlow offers different types of entities, and we will cover those in detail in an upcoming episode. Once we have the variables, we may use them to provide a static response to the user. Or, in most cases, we may want to send the variables to our back-end, take some action on them, and then provide the user with a dynamic response. We will look at that in a second.
To summarize, an intent includes training phrases, actions and parameters, and a response. Depending on what services your bot offers, you might typically have anywhere from a few to thousands of intents. They could also be in different languages. While we're looking at the architecture, it's worth mentioning context. Context is the method for your chatbot to store and access variables so it can exchange information from one intent to another in a conversation. We will have more on context in an upcoming episode.
Fulfilment is the last piece of the puzzle. It's the code that you write to interface with back-end services in order to respond to a dynamic request. We will play with this a lot more soon. But for now, note that DialogFlow has built-in integration with Google Cloud Functions to interface with your back-end, and you can also provide any other HTTPS endpoint and DialogFlow will connect to it.
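To show what a fulfilment handler actually receives and returns, here is a minimal Python sketch. A Dialogflow ES webhook request carries the matched intent under `queryResult.intent.displayName` and the extracted parameters under `queryResult.parameters`, and the webhook replies with JSON containing `fulfillmentText`. The intent name `set_appointment` and its parameters are assumptions for illustration; in production this function would run behind an HTTPS endpoint such as a Cloud Function.

```python
import json

# Minimal fulfilment sketch: read the matched intent and parameters
# from a Dialogflow ES webhook request body and return a dynamic response.
def handle_webhook(request_body: str) -> str:
    req = json.loads(request_body)
    intent = req["queryResult"]["intent"]["displayName"]
    params = req["queryResult"]["parameters"]
    if intent == "set_appointment":
        # A real handler would call a calendar API or database here.
        text = f"Booked your appointment for {params['date']} at {params['time']}."
    else:
        text = "Sorry, I can't help with that yet."
    return json.dumps({"fulfillmentText": text})

# Simulate the JSON body Dialogflow would POST to the webhook.
sample = json.dumps({
    "queryResult": {
        "intent": {"displayName": "set_appointment"},
        "parameters": {"date": "tomorrow", "time": "5:00 AM"},
    }
})
print(handle_webhook(sample))
```

Note that `fulfillmentText` uses the API's American spelling even though the feature is often written "fulfilment" in prose.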
All right. So today we looked at a simple high-level architecture of setting up an agent within DialogFlow and learned about intents, entities, context, and fulfilment at a high level.
Next we are going to go hands-on and build a simple appointment scheduler chatbot. If that sounds exciting to you, join me in the next episode of "Deconstructing Chatbots."
Thinking about integrating DialogFlow with Google Cloud ML APIs? Google Cloud shows us how to use DialogFlow to create a reliable, conversational chatbot.
In this blog, the Google Cloud team builds their own chatbot with DialogFlow and Google Cloud Platform. Read the blog to learn more.