Real-world Walkthrough

Description

This course is the first in a two-part series that explores how to create a chatbot using Google Dialogflow. We'll take an introductory look at what the Google Dialogflow tool is used for and look at the basic steps and components required to make a chatbot. We'll cover the concepts and technical aspects to consider when building a chatbot, and then put these into context by applying them to a real-world scenario.

Learning Objectives

  • Understand the fundamentals of creating chatbots using Dialogflow
  • Learn how chatbots interact with users
  • Understand the technical aspects of developing a Dialogflow application
  • Learn how the concepts covered in the course can be applied to a real-world scenario

Intended Audience

  • Anyone looking to build chatbots using Google Dialogflow

Prerequisites

In order to get the most out of this course, you should have at least a basic understanding of:

  • Computer science techniques
  • REST APIs and SQL
  • Google Cloud Platform
Transcript

So let's take all of these concepts and show them in a real-world scenario. Now, what we're about to show you is something that's been anonymized and slightly edited in order to protect privacy, but this is an actual production use case that Calculated Systems engineered for a client.

A lot of the decisions we're about to show actually reflect the practicality and reality of the situation, and shouldn't necessarily be viewed as textbook academic best practice in every case. Oftentimes in the real world we need to take specific approaches in order to get things to work. In the best case, we're able to use serverless, modern cloud applications. However, in many cases you're stuck. Well, I don't wanna say stuck. You're going to continue to use traditional, server-based databases.

Now, what we're working with is a small to medium-sized business. This client wants a chatbot that's able to receive incoming text messages via SMS in order to interact with an underlying eCommerce store. Basically, we need a good way for the user to understand how their business is performing. The nuance here is that this chatbot is aimed at the business owner, not the business's customers.

Now, this actually presents a lot of unique challenges for this chatbot, because the data needs to be served entirely over text messages, which means we can't use any rich text message functionality or other services such as integrated web pages. We also need to interact with what is fundamentally a complex and potentially sensitive set of data.

There are going to be multiple time periods, multiple data sets, and multiple sources. There are going to be differences in things such as real time versus business-quarter time. So how can we begin to handle this, and what are the nuances we had to consider when building this out? Let's revisit our favorite person, Edith.

Now, when Edith isn't managing her personal bank accounts, she's actually running an eCommerce business. You'll notice that the first step we're showing is Edith typing into her phone. When creating something that handles human conversation, it's very important to understand the hardware interface that somebody is going to use.

Somebody texting on their phone is going to expect something very different from somebody using Facebook Messenger, and something entirely different from somebody using a voice service like Google Assistant. So understanding who's creating the message, and in what interface, is key. And then that phone is going to send the SMS message, and there's a slew of ways to receive this.

In my personal opinion, however, Twilio is one of the best ways to handle it. Twilio is a third-party service that Google used to have a first-party integration with, where you could just toggle it on. They now provide an open-source GitHub repository that contains all of the integration code and some very easy-to-follow instructions.

There's a little bit of effort here in standing it up, but it shouldn't take terribly long, and if you need to send text messages, this is a really good option. And then Twilio passes the text message to Dialogflow. It actually passes a ton of parameters, based on things like phone number and account, which are in turn accessible by any fulfillment calls that Dialogflow makes.
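
To make that leg concrete, here's a minimal sketch of what an SMS-to-Dialogflow bridge could look like, assuming a Flask endpoint registered as the Twilio SMS webhook, the twilio and google-cloud-dialogflow packages, and a placeholder GCP project ID. The official open-source integration does considerably more; this just shows the shape of the hand-off.

```python
# Sketch only: forward an incoming Twilio SMS to Dialogflow and reply with
# the fulfillment text. PROJECT_ID and the /sms route are placeholders.
from flask import Flask, Response, request
from twilio.twiml.messaging_response import MessagingResponse
from google.cloud import dialogflow

app = Flask(__name__)
PROJECT_ID = "my-gcp-project"  # placeholder

@app.route("/sms", methods=["POST"])
def sms_webhook():
    body = request.form["Body"]    # the text Edith typed
    sender = request.form["From"]  # her phone number, reused as the session ID

    sessions = dialogflow.SessionsClient()
    session = sessions.session_path(PROJECT_ID, sender)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=body, language_code="en")
    )
    result = sessions.detect_intent(
        request={"session": session, "query_input": query_input}
    )

    # Send Dialogflow's reply straight back over SMS as TwiML.
    reply = MessagingResponse()
    reply.message(result.query_result.fulfillment_text)
    return Response(str(reply), mimetype="application/xml")
```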

So not only are we able to build out the entities and parameters within Dialogflow, we're also able to use some of the extra metadata that Twilio has collected. And of course, Dialogflow does its thing and ensures that intents and parameters exist before invoking a backend service.
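
For reference, the fulfillment call Dialogflow ES makes to that backend is a JSON POST roughly shaped like the following; the intent name and parameters here are made up for illustration.

```python
# Rough shape of the webhook (fulfillment) request Dialogflow ES sends.
# Intent name and parameters are invented for illustration.
webhook_request = {
    "responseId": "...",
    "session": "projects/my-gcp-project/agent/sessions/+15551234567",
    "queryResult": {
        "queryText": "how were sales last quarter",
        "parameters": {"time-period": "last quarter"},
        "intent": {"displayName": "sales.report"},
        # Extra metadata from the integration (e.g. the sender's number)
        # rides along in originalDetectIntentRequest when it's passed through.
    },
}
```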

Now, there are a few different ways we could have done this, but we used a combination of Google Cloud Functions and Firebase. There are a few other options we used for some other cases that we'll discuss in a moment, but for now, Dialogflow can invoke a Google Cloud Function via a REST API, like we did. This function is then able to fetch the data from Firebase.

So basically, the thought when designing a system like this is that we want to go as serverless and as modern as possible. Cloud Functions are a great way to store snippets of code and run them with a high level of agility without a server, and Firebase is a NoSQL way of storing and retrieving data. I'm sure that somebody will slap me with specifics of why this isn't entirely true, but as a quick shortcut, you can think of Firebase as very similar to MongoDB in terms of core functionality, even though it's a little different. It's basically just a NoSQL database.
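
Here's a rough sketch of what such a fulfillment Cloud Function could look like in Python, assuming the functions-framework runtime and a Cloud Firestore database. The collection, field, and intent names are invented for illustration; the production schema isn't shown here.

```python
# Sketch of a fulfillment webhook as a Python Cloud Function.
# Assumes functions-framework and google-cloud-firestore; the
# "sales_summaries" collection and "sales.report" intent are made up.
import functions_framework
from google.cloud import firestore

db = firestore.Client()

@functions_framework.http
def fulfillment(request):
    req = request.get_json(silent=True) or {}
    query_result = req.get("queryResult", {})
    intent = query_result.get("intent", {}).get("displayName", "")
    params = query_result.get("parameters", {})

    if intent == "sales.report":
        period = params.get("time-period", "today")
        doc = db.collection("sales_summaries").document(period).get()
        total = doc.to_dict().get("total", 0) if doc.exists else 0
        text = f"Sales for {period}: ${total:,.2f}"
    else:
        text = "Sorry, I don't have a report for that yet."

    # Dialogflow only needs a small JSON payload back (see below).
    return {"fulfillmentText": text}
```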

So this actually takes us all the way from Edith typing her intent into her phone to the source of truth and system of record. We now have to go back to the end user. So after Firebase gets a well-formatted query from the Google Cloud Function, we actually have to start to respond.

So going back, Firebase provides a raw JSON response to the Google Cloud Function, and the function then starts interpreting it. Now, the function actually carries a little bit of the burden here. We need to look at the response and format it into a fulfillment-style message that Dialogflow understands. At its core, though, don't worry too much about this; it's a simple JSON payload that basically contains the human-readable text and then some basic instructions to Dialogflow about where it's going to be used and things like what the default language is.
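
Spelled out, the minimal payload the function hands back can be as small as this; richer per-platform messages are optional extras, and the example text here is made up.

```python
# Minimal fulfillment response Dialogflow ES understands: just the
# human-readable text. Optional fields add per-platform rich responses.
fulfillment_response = {
    "fulfillmentText": "Sales for last quarter: $12,345.67",
    # "fulfillmentMessages": [...],  # optional rich/platform-specific replies
}
```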

Dialogflow now has the intelligence to route it right back along the same phone number. The session concept, which we're not gonna cover very much here, and the fact that all of this was done within the same POST request, are key, but just know that Dialogflow has the ability to handle which user is going to get a response from each fulfillment call.

So then Twilio receives the message forwarded from Dialogflow and sends it back out as an SMS. It then goes all the way across the network to Edith, who gets the message displayed on her phone. So this full round trip for Dialogflow is a full architecture that starts with the user creating a message and goes all the way through handling the data and displaying a meaningful result back to the person.

We actually faced a lot of challenges in putting this into production, because there's a lot of nuance in the intents and how to differentiate them for the end user. Creating the functions and systems of record to retrieve the data was actually the relatively easier part of the challenge for us, once we understood the intents and questions the user was going to ask.

I really wanna highlight a very important nuance, though. All this stuff on the right, the function and system of record, can really be anything. We've also used things such as Apache NiFi, which is able to run a REST API, route the messages, and transform them. All you really need for this backend service is anything that can run a REST API and respond via the same.

There are a lot of simplified streaming solutions that can do that. You could do this with a Python server. You could do this with a Java server. If you're feeling fancy, you could even do it with something such as an Apache Spark server. There are a lot of open-source systems for handling this connection, and the same goes for your system of record.

Don't think that you're limited to just Firebase and NoSQL. Just like the previous layer, this can be anything that the business functions can talk to. So this could be something such as Postgres, MongoDB, or another SQL database, and odds are you'll be able to find a combination of systems of record that your company uses and business functions that you can use in order to tie it all together.
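
To make that "anything that can run a REST API" point concrete, here's a rough sketch of the same fulfillment hook written as a plain Flask server backed by Postgres instead of Cloud Functions and Firebase. The connection string, table, and column names are invented; it's an illustration of the pattern, not the production system.

```python
# Sketch: the same fulfillment webhook as a plain Flask server over Postgres.
# Assumes flask and psycopg2; DSN, table, and column names are placeholders.
from flask import Flask, request, jsonify
import psycopg2

app = Flask(__name__)
conn = psycopg2.connect("dbname=ecommerce user=bot")  # placeholder DSN

@app.route("/fulfillment", methods=["POST"])
def fulfillment():
    req = request.get_json(force=True)
    params = req["queryResult"]["parameters"]
    period = params.get("time-period", "today")

    with conn.cursor() as cur:
        cur.execute(
            "SELECT COALESCE(SUM(total), 0) FROM orders WHERE period = %s",
            (period,),
        )
        total = cur.fetchone()[0]

    return jsonify({"fulfillmentText": f"Sales for {period}: ${total:,.2f}"})

if __name__ == "__main__":
    app.run(port=8080)
```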

So to summarize all of the components really quickly and list them out: we used Twilio to serve as the integration interface to the outside world. This handled all of the routing, receiving, and sending of text messages. We of course used Dialogflow, and we also used Cloud Functions and NiFi as the fulfillment service.

For the backend source of truth and system of record, we used Postgres originally and started migrating towards Firebase to store and retrieve data. We had to build an agent as well, of course. All of the intents, specific entities, and webhooks needed to be designed and integrated, and the real challenge was actually differentiating the intents so that we could have good, clean webhooks to provide data.

About the Author

Calculated Systems was founded by experts in Hadoop, Google Cloud and AWS. Calculated Systems enables code-free capture, mapping and transformation of data in the cloud based on Apache NiFi, an open source project originally developed within the NSA. Calculated Systems accelerates time to market for new innovations while maintaining data integrity. With cloud automation tools, deep industry expertise, and experience productionalizing workloads, development cycles are cut down to a fraction of their normal time. The ability to quickly develop large-scale data ingestion and processing decreases the risk companies face in long development cycles. Calculated Systems is one of the industry leaders in Big Data transformation and education of these complex technologies.