In this lab you will build a simple action using Dialogflow and learn how to integrate it with the Google Assistant.

The exercises are ordered to reflect a common cloud developer experience:

  1. Create a Dialogflow v2 agent
  2. Create entities
  3. Create intents
  4. Set up a webhook with GCP Cloud Functions
  5. Use the knowledge base to import FAQs
  6. Test the chatbot
  7. Enable the Google Assistant integration

What you will build

We will build a Google Assistant app for the Women in Voice meetup group. Users will be able to ask when the next meetup is, request article and book tips, or ask general questions about the meetup group.

What you'll learn

Prerequisites

Enable Web Activity in your browser

1. Click: http://myaccount.google.com/activitycontrols

Google Activity Controls

2. Make sure Web & App Activity is enabled:

Create a Dialogflow agent

1. Open: https://console.dialogflow.com

Dialogflow Console

2. In the left bar, right under the logo, select Create New Agent. If you already have existing agents, click the dropdown first.

3. Specify an agent name: yourname-wiv (use your own name)

4. As the default language choose: English - en.

5. As the default time zone, choose the time zone that's the closest to you.

6. Do not select Mega Agent. (With this feature you can create an overarching agent, which can orchestrate between "sub" agents. We do not need this now.)

7. Click Create

Configure Dialogflow

1. Click on the gear icon, in the left menu, next to your project name.

2. Enter the following agent description: Women in Voice agent

3. Scroll down to Beta Features and flip the switch, to enable beta features.

4. Scroll down to Log Settings and flip both switches to Log the interactions of Dialogflow and to log all interactions in Google Cloud Stackdriver. We will need this later, in case we want to debug our action.

5. Click Save

6. Click Done

Configure Actions on Google

1. Click on the Google Assistant link under See how it works in Google Assistant, in the right-hand panel.

This will open: http://console.actions.google.com

NOTE: Make sure you are logged in with the same Google account as in Dialogflow.

When you are new to Actions on Google, you will need to go through this form first:

2. Try to open your action in the simulator by clicking on the project name.

3. Select Develop in the menu bar.

4. Uncheck Match user's language setting, to make sure the text-to-speech synthesizer won't be overruled by the Assistant default language.

5. Click Save

6. Select Test in the menu bar

7. Make sure the simulator is set to English and click Talk to my test app.

The action will greet you with the basic Dialogflow default intent. That means setting up the integration with Actions on Google worked!

Configure Google Cloud

For this tutorial you will need a GCP account with a billing account. If you don't have one yet, you can create one with these steps.

Normally a billing account requires a payment method such as a credit card. For this workshop, we can make use of workshop credits, which let you skip this step.

  1. Navigate to this URL and log in:

gcpcredits.com/wivnl

  2. Click: Click here and access your credits
  3. Click Accept & Continue

You are all set. You've created a billing account with $25 in credits, which should be more than enough to use Cloud Functions for a long time.

Enable Google Sheets API

If your agent needs more than static intent responses (for example to fetch data from a web service, database or Sheet), you need to use fulfillment to connect your web service to your agent. Connecting your web service allows you to take actions based on user expressions and send dynamic responses back to the user.

For example, if a user wants to receive a blog or book tip, your web service can check in your database and respond to the user with an article to read.

In this tutorial we won't make use of a database, instead we will make use of a Google Sheet. Once the sheet gets updated, the Google Assistant action will be updated as well. Neat!

  1. Open this Google Sheet in a new browser tab, if you haven't done so already: https://docs.google.com/spreadsheets/d/1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY/edit#gid=1240329448
  2. IMPORTANT: Make a copy of this sheet. Click File > Make a Copy
  3. Once the sheet has been copied, click Share
  4. We will need to give the Dialogflow service account edit rights. To do this, open Dialogflow > Settings (cog wheel), scroll down to Google Project, and copy the service account (email) address. It should look something like this: dialogflow-<someid>@<my-gcp-project>.iam.gserviceaccount.com
  5. Paste this service account in the Share popup of Google Sheets, and give it Edit rights.
  6. Next, we will need to remember the Sheet ID of the sheet we are currently working in.

The Sheets URL will look something like this:

https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448

But we are only interested in the Sheet id, which is the part between:

https://docs.google.com/spreadsheets/d/ and /edit#gid=1240329448 (without the slashes).

So it will look something like this: 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o

Write this Sheet ID down, or copy it to Notepad; we will use it again in the webhook steps.
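If you'd rather extract the ID programmatically (for instance in a helper script), a sketch like the following works; the URL below is just the example from above:

```javascript
// Extract the Sheet ID from a Google Sheets URL.
// The ID is the path segment between "/spreadsheets/d/" and the next "/".
function getSheetId(url) {
  const match = url.match(/\/spreadsheets\/d\/([a-zA-Z0-9_-]+)/);
  return match ? match[1] : null;
}

const url = 'https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448';
console.log(getSheetId(url)); // 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o
```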

  1. Open http://console.cloud.google.com in another browser tab. (In case you have more Google Cloud projects, activate the new Dialogflow project: yourname-wiv.) In the search bar, search for: Google Sheets API

  2. Click the result, and click the Enable Google Sheets API button at the top.

Entities are objects your app or device takes action on. Think of them as parameters or variables. In our action we will ask:

"I want a reading tip about chatbots / I want a reading tip about voice"

Whether you say Chatbots, Voice or Both, the value will be captured by a custom entity and used as a parameter in the request to a web service.

Here's more information on Dialogflow Entities.

Creating the tech Entity

1. Click in the Dialogflow Console on the menu item: Entities

2. Click Create Entity

3. Entity name: tech (make sure it's all lowercase)

4. Specify the options with the synonyms. (You can tab through the interface.)

5. Switch to Raw Edit mode by clicking on the menu button next to the blue save button.

6. Notice that you could have entered all the entity entries in CSV format as well. This can be handy when you have a lot of entries to create.

"Chatbots","Chatbots","Chat","Web"
"Voice","Voice","Voicebots","Voice Assistants"
"Both","Both","All"

7. Hit Save
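To illustrate how this raw format maps to entity entries: the first quoted value on each line is the reference value, and the remaining values are its synonyms. A hypothetical parser (not part of the workshop code, just a sketch of the mapping):

```javascript
// Hypothetical illustration of how the raw CSV entity format maps to
// reference values and synonyms (first cell = value, rest = synonyms).
function parseEntityCsv(csv) {
  const entries = {};
  for (const line of csv.trim().split('\n')) {
    // Strip the surrounding quotes from each comma-separated cell.
    const cells = line.split(',').map(c => c.replace(/^"|"$/g, ''));
    entries[cells[0]] = cells.slice(1);
  }
  return entries;
}

const raw = `"Chatbots","Chatbots","Chat","Web"
"Voice","Voice","Voicebots","Voice Assistants"
"Both","Both","All"`;

console.log(parseEntityCsv(raw).Voice); // [ 'Voice', 'Voicebots', 'Voice Assistants' ]
```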

Dialogflow uses intents to categorize a user's intentions. Intents have Training Phrases, which are examples of what a user might say to your agent.

For instance, a user who wants to know when the next event is might ask:

"When is the next meetup?"

When a user writes or says something, referred to as a user expression, Dialogflow matches the user expression to the best intent in your agent. Matching an intent is also known as intent classification.

Here's more information on Dialogflow Intents.

Modifying the Default Welcome Intent

When you create a new Dialogflow agent, two default intents are created automatically. The Default Welcome Intent is the first flow you reach when you start a conversation with the agent. The Default Fallback Intent is the flow you get when the agent can't understand you or can't match an intent to what you just said.

  1. Click Intents > Default Welcome Intent

In the case of the Google Assistant, it will auto-start with the Default Welcome Intent. This is because Dialogflow is listening to the Welcome event. However, you can also invoke the intent by saying one of the entered training phrases.

Here's the welcome message for the Default Welcome Intent:

User: "Ok Google, talk to <yourname>-WIV"

Agent: "Hey there, I'm Anna, the virtual agent of Women in Voice. You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?"

  1. Scroll down to Responses.
  2. Clear all Text Responses.
  3. In the default tab, create the following 3 responses. (Click Add Responses > Text or SSML Response for each new line:)

The configuration should be similar to this screenshot.

  4. The previous output is used for chatbots. We can tailor the output specifically for the Google Assistant, using SSML (Speech Synthesis Markup Language) to build pauses into our sentences. Click the Google Assistant tab.

Hey there, I'm Anna, the virtual agent of Women in Voice.

You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?

<speak><p><s>Hey there, I'm Anna, the virtual agent of Women in Voice.</s><s>You can ask me for information about meetups, Women in Voice or a reading tip.</s></p><break time="500ms"/><p><s>What would you like to know?</s></p></speak>

The configuration should be similar to this screenshot.

  5. Click Save

Here you can find more information about SSML for Actions on Google.
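The SSML pattern used above (sentences in <s> elements, paragraphs in <p>, a <break> between paragraphs) can be generated by a small helper. This is just a sketch, not part of the workshop code:

```javascript
// Sketch: wrap plain sentences in SSML for the Google Assistant.
// Each inner array is one paragraph; sentences become <s> elements,
// and a <break> is inserted between paragraphs.
function toSsml(paragraphs, pauseMs = 500) {
  const body = paragraphs
    .map(p => `<p>${p.map(s => `<s>${s}</s>`).join('')}</p>`)
    .join(`<break time="${pauseMs}ms"/>`);
  return `<speak>${body}</speak>`;
}

console.log(toSsml([
  ["Hey there, I'm Anna, the virtual agent of Women in Voice."],
  ['What would you like to know?'],
]));
```

For the two-paragraph welcome message this produces the same structure as the SSML shown earlier.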

  6. Let's test this intent. First we can test it in the Dialogflow Simulator.

Type: Hello. It should return this message.

  7. Now, switch back to the Actions on Google console.

(You might want to keep this in another tab.)

Actions on Google Console

Click "Talk to my test app" and listen to the new welcome message.

Modifying the Default Fallback Intent

  1. Click Intents > Default Fallback Intent
  2. Scroll down to Responses.
  3. Clear all Text Responses.
  4. In the default tab, create the following responses, each on a new line, so it alternates between these options:

  5. Click Save

Note: when you don't enter a Google Assistant-specific response, the default response will be used.

Create the Stop Intent

1. Click on the Intents menu item.

2. Click Create Intent

3. Enter the Intent Name: Stop Intent

4. Click Add Training phrases

5. Scroll down to Responses > Add Response

6. Add the following text options:

7. Flip the switch Set this intent as the end of conversation. This makes sure that once this intent gets matched, the Google Assistant action is closed.

8. Click Save.

Create the Meetup Intent

The Meetup Intent will contain this part of the conversation:

User: "When is the next meetup?"

Agent: "The next meetup will be <date> at <time> in <location>. The topic will be <topic>. And the speakers are: <speakers>. You can register via our newsletter."

1. Click on the Intents menu item.

2. Click Create Intent

3. Enter the Intent Name: Meetup Intent (make sure you use a capital M and a capital I; if you spell the intent name differently, the back-end service won't work!)

4. Click Add Training phrases

5. Click Fulfillment > Enable Fulfillment

6. Flip the Enable Webhook call for this intent switch.

7. Hit Save

Create the Tip Intent

The Tip Intent will contain this part of the conversation:

User: "I want a reading tip."

Agent: "Do you want to read more about Chatbots, Voice or Both?"

User: "Voice"

Agent: "Alright, here's the tip of the day! The <type> <title> of <author>. Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

1. Click on the Intents menu item again.

2. Click Create Intent

3. Enter the Intent Name: Tip Intent (make sure you use a capital T and a capital I; if you spell the intent name differently, the back-end service won't work!)

4. Click Add Training phrases and add the following:

5. Scroll down to Action and parameters

6. Mark tech as required

7. Click Define Prompt and enter:

8. Click Fulfillment > Enable Fulfillment

This time we are not hardcoding a response. The response will come from the cloud function! So flip the Enable Webhook call for this intent switch.

9. Hit Save

Knowledge connectors complement defined intents. They parse knowledge documents to find automated responses. (for example, FAQs or articles from CSV files, online websites or even PDF files!) To configure them, you define one or more knowledge bases, which are collections of knowledge documents.

Read more about Knowledge Connectors.

Let's try this out.

  1. Select the en tag in the top menu, to select the English language.
  2. Select Knowledge (beta) in the menu.

  3. Click the blue button on the right: Create Knowledge Base
  4. Enter Women in Voice as the knowledge base name and hit Save.
  5. Click the Create the first one link

This will open a window.

Use the following config:

Document Name: Women in Voice FAQ Sheet

Knowledge Type: FAQ

Mime Type: CSV

  6. We will need the data from the sheet: make sure the data sheet is opened, and select the FAQ tab.
  7. Select File > Download > CSV

  8. Back in Dialogflow, click Upload File from Computer and select the CSV file you just downloaded. Click Create

A knowledge base has been created:

  9. Click Add Response

Create the following answers and hit save:

$Knowledge.Answer[1]

  10. Click View Detail

This will display all the FAQs you have implemented in Dialogflow.

That's easy!

Note that you could also point to an online HTML page with FAQs to import them into your agent. It's even possible to upload a PDF with a block of text, and Dialogflow will come up with questions itself.

  11. Click on Knowledge (beta) in the Dialogflow menu to go back to all the knowledge base connectors.
  12. It's possible to adjust the strength of the knowledge base. This makes sense when your FAQs are outranking, or being outranked by, your own intents. Since we don't have many intents, let's make our knowledge base a bit stronger. Change the scale to -0.2. After dragging the slider, the value is saved automatically.

FAQs should be seen as 'extras' added to your agent next to your intent flows. Knowledge base FAQs don't train the model, so asking a question in a completely different way might not produce a match, because the feature doesn't use the Natural Language Understanding (machine learning) models. This is why it's sometimes worth converting your FAQs to intents.

Create a Google Cloud Function

  1. Navigate to http://console.cloud.google.com in another browser tab.
  2. Select Cloud Functions in the left menu
  3. Click Create Function

  4. Specify the following configuration:
  5. Make sure this authentication checkbox is checked:

  6. Here's the contents for package.json. Copy and paste this in the package.json tab of the editor.

This piece of code loads the correct npm libraries into Google Cloud:

{
 "name": "dialogflow",
 "description": "Cloud Functions",
 "engines": {
   "node": "8"
 },
 "dependencies": {
   "request": "^2.85.0",
   "request-promise": "^4.2.5",
   "dialogflow-fulfillment": "^0.6.1",
   "actions-on-google": "^2.2.0",
   "googleapis": "^48.0.0",
   "moment": "^2.24.0"
 },
 "devDependencies": {
   "eslint": "^5.12.0",
   "eslint-plugin-promise": "^4.0.1",
   "ngrok": "^3.2.7"
 },
 "private": true
}
  7. Here's the contents for index.js. Copy and paste this in the index.js tab of the editor.

This code integrates with the googleapis library to fetch data from a Google Sheet. It makes use of the actions-on-google library to display cards on a Google Assistant device, of the dialogflow-fulfillment library to map Dialogflow intents to handler functions, and of the moment library to handle date and time objects.

/* jshint esversion: 8 */
'use strict';

process.env.DEBUG = 'dialogflow:debug';

const ACCOUNTS_SHEET_ID = '1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY';

const {
 BasicCard,
 Button,
} = require('actions-on-google');

const {google} = require('googleapis');
const moment = require('moment');
moment.locale('nl');
const { WebhookClient } = require('dialogflow-fulfillment');
var books;
var meetups;

const SHEETS_SCOPE = 'https://www.googleapis.com/auth/spreadsheets.readonly';

/**
* Authenticates the Sheets API client for read-only access.
*
* @return {Object} sheets client
*/
async function getSheetsClient() {
   // Should change this to file.only probably
   const auth = await google.auth.getClient({
       scopes: [SHEETS_SCOPE],
   });
   return google.sheets({version: 'v4', auth});
}

/**
* Return a natural spoken date
* @param {string} date in 'YYYY-MM-DD' format
* @returns {string}
*/
var getSpokenDate = function(date){
   let datetime = moment(date, 'YYYY-MM-DD');
   return `${datetime.format('D MMMM')}`;
};


/* When the tipIntent Intent gets invoked.  */
function tipIntent(agent) {
 var par = agent.parameters.tech;
   var selection = [];
   //console.log(par);
   //console.log(books);
    for(var i = 0; i<books.length; i++){
     if(books[i][2].toLowerCase() == par.toLowerCase()) {
         selection.push(books[i]);
       }
   }
    var random = Math.floor(Math.random() * selection.length);
   var booktip = selection[random];
   //console.log(selection[random]);

   let spokenText = `<p><s>Alright, here's the tip of the day!</s></p><p>The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.</p>`;
   let writtenText = `Alright, here's the tip of the day! The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`;
   //console.log(booktip[8]);
    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
       let conv = agent.conv();
       conv.ask(`<speak>${spokenText}</speak>`);
       conv.ask(new BasicCard({
           title: `Tip of the day!`,
           subtitle: `${par}`,
           text: `The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`,
           buttons: new Button({
               title: 'Read',
               url: `${booktip[8]}`,
           })
       }));
       conv.ask(`<speak><p><s>Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?</s></p></speak>`);
       // Add Actions on Google library responses to your agent's response
       agent.add(conv);
   } else {
       agent.add(writtenText + ' Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?');
   }
}


function meetupIntent(agent) {
 let conv = agent.conv();
 let record;
  console.log(meetups);
  for(var i = 0; i<meetups.length; i++){
   let d = moment(meetups[i][0], 'YYYY-MM-DD');
   let today = moment(new Date());

   if(moment(d).isSameOrAfter(today)) {
     // the i event is not in the past
     record = meetups[i];
     console.log(record);
     break; 
   }
 }
  let date = getSpokenDate(record[0]);
 let spokenText1 = `The next meetup will be ${date} at ${record[1]} in ${record[3]}.`;
 let spokenText2 = `The topic will be <emphasis level="moderate">${record[2]}.</emphasis>`;
 let spokenText3 = `You can register via our newsletter.`;
  let writtenText = `${spokenText1} The topic will be ${record[2]}. ${spokenText3}`;

 if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
   conv.ask(`<speak>${spokenText1} ${spokenText2} ${spokenText3}</speak>`);
   conv.ask(new BasicCard({
     title: `Meetup`,
     subtitle: `${record[2]}`,
     text: `${record[0]} ${record[1]} - ${record[3]}`,
     buttons: new Button({
       title: 'Register',
       url: `http://www.meetup.com`
     })
   }));
   conv.ask('<speak><p><s>Is there anything else I can help you with?</s></p></speak>');
   agent.add(conv);
 } else {
   agent.add(`${writtenText} Is there anything else I can help you with?`);
 }
}

exports.dialogflow = async (request, response) => {
 var agent = new WebhookClient({ request, response });

 console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
 console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
  const client = await getSheetsClient();   
 const allBooks = await client.spreadsheets.values.get({
   spreadsheetId: ACCOUNTS_SHEET_ID,
   range: 'Books&Blogs!A:I',
 });
 const allEvents = await client.spreadsheets.values.get({
   spreadsheetId: ACCOUNTS_SHEET_ID,
   range: 'Meetups!A:D',
 });
  books = allBooks.data.values;
 meetups = allEvents.data.values;
 books.shift();
 meetups.shift();

 var intentMap = new Map();
 intentMap.set('Tip Intent', tipIntent);
 intentMap.set('Meetup Intent', meetupIntent);
 agent.handleRequest(intentMap);
};
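As a quick illustration of what tipIntent does with the sheet rows: it filters the rows on the tech column and picks one match at random. A standalone sketch with made-up rows (the column indices mirror the ones used in the code above; the real sheet has more columns):

```javascript
// Standalone sketch of the row-filtering logic used in tipIntent.
// Made-up rows; only the columns the filter touches are shown:
// index 0 = title, index 1 = author, index 2 = tech category.
function pickTip(rows, tech) {
  const selection = rows.filter(r => r[2].toLowerCase() === tech.toLowerCase());
  if (selection.length === 0) return null;
  // Pick a random match, like the Math.random() call in tipIntent.
  return selection[Math.floor(Math.random() * selection.length)];
}

const rows = [
  ['Designing Voice User Interfaces', 'Cathy Pearl', 'Voice'],
  ['Conversational Design', 'Erika Hall', 'Chatbots'],
];

console.log(pickTip(rows, 'voice')[0]); // Designing Voice User Interfaces
```

With only one matching row the pick is deterministic; with several matches, each call may return a different tip.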

  8. Click the Environment variables, networking, timeouts and more link

  9. Select the Dialogflow Integrations service account.

(By default the App Engine default service account is selected, but it should be the same service account as the one you shared your Google Sheet with, in the first steps of this tutorial.)

  10. Before we deploy the cloud function, we will change one line of code in the index.js tab, near the top:

const ACCOUNTS_SHEET_ID = '1Yo_E8KONgSiUm00ZmTOqtjXCwULmc2JuI3sjxRyvrkE';

In one of the first steps, we wrote the Sheet ID down in Notepad. Copy and paste that ID into your code.

  11. Now we are ready. Click the Create button. It will take a moment, because it's deploying your serverless function.

Enable fulfillments in Dialogflow

  1. Switch back to Dialogflow
  2. Click Fulfillment in the main menu
  3. Enable the Webhook switch.
  4. Enter the trigger URL of the cloud function (you can find it on the Trigger tab of the function in the Cloud Console).

For example: https://us-central1-leeboonstra-wiv-uhtefa.cloudfunctions.net/dialogflow

  5. Click Save.

  6. Let's test the webhook, to see if the code works, and try the flows directly in the Dialogflow simulator.
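For context when debugging: Dialogflow POSTs a JSON WebhookRequest to the fulfillment URL, and the dialogflow-fulfillment library routes on queryResult.intent.displayName, which is why the intent names in the intent map have to match exactly. A trimmed-down sketch of the relevant fields (real requests contain many more):

```javascript
// Minimal sketch of the parts of a Dialogflow v2 webhook request that
// the intent map routing depends on. Field names follow the v2 API.
const webhookRequest = {
  queryResult: {
    queryText: 'I want a reading tip about voice',
    parameters: { tech: 'Voice' },
    intent: { displayName: 'Tip Intent' },
  },
};

// Routing works by exact displayName match, as in agent.handleRequest(intentMap):
const intentMap = new Map([
  ['Tip Intent', () => 'tipIntent'],
  ['Meetup Intent', () => 'meetupIntent'],
]);

const handler = intentMap.get(webhookRequest.queryResult.intent.displayName);
console.log(handler()); // tipIntent
```

If the intent name in Dialogflow doesn't exactly match the Map key, intentMap.get() returns undefined and the webhook fails, which is the misspelling warning from the intent steps above.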

Actions on Google is a development platform for the Google Assistant. It allows the third-party development of "actions"—applets for the Google Assistant that provide extended functionality.

You will need to invoke a Google Action, by asking Google to open or talk to an app.

This will open your action, change the voice, and leave the 'native' Google Assistant scope. This means everything you ask from this point on needs to be handled by functionality you created. Within your own action you can't suddenly ask the Google Assistant for weather information; if that's what you want, you have to leave (close) the scope of your current action (your app) first.

Testing your action in the Google Assistant simulator

Let's test the following conversation:

User: "Hey Google, talk to my test app."

Google Assistant: "Getting the test version of my app."

Your-Name-WIV: "Hey there, I'm Anna, the virtual agent of Women in Voice. You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?"

User: "Can I have a tip for a blog?"

Your-Name-WIV: "Do you want to read more about Chatbots, Voice or Both?"

User: "Chatbots"

Your-Name-WIV: "Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling. Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

User: "When is the next meetup?"

Your-Name-WIV: "The next meetup will be 29 april at 6:00 PM in a virtual Hangout. The topic will be Online Workshop: Create Google Actions with Dialogflow! You can register via our newsletter. Is there anything else I can help you with?"

User: "What's the url for your website?"

Your-Name-WIV: "You can find us at www.womeninvoice.com. Is there something else I can do for you?"

User: "No that's it, thanks"

Your-Name-WIV: "Alright! Hopefully we will see you at one of our meetups!"

  1. Switch back to the Google Assistant simulator

Open: https://console.actions.google.com

Google Assistant Console

  2. Click on the microphone icon and ask the following:

  3. Now let's ask:

This should return:

"Do you want to read more about Chatbots, Voice or Both?"

"Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling.

Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

  4. Let's try a different version of the same question:

"Alright, here's the tip of the day! The Book Designing Voice User Interfaces: Principles of Conversational Experiences. ... of Cathy Pearl.

Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

Notice that you have never used this training phrase in Dialogflow before. It just matched the right intent.

Also notice that you didn't get a followup question, because you provided enough information for Dialogflow to continue.

  5. Continue the dialog with the following phrases:

Errors? Check the logs!

Every time you use console.log() in your Cloud Function code, data is written to your GCP logs (Stackdriver). You can access these logs by opening Cloud Console > Logging.

In the first dropdown, you can select Cloud Function > dialogflow to filter for your logs.

You have created your first Google Assistant action with Dialogflow, well done!

As you might have noticed, your action was running in test mode, which is tied to your Google account. If you log in on your Nest device or the Google Assistant app on your iOS or Android phone with the same account, you can test your action there as well.

Now, this was a workshop demo. But when you are building applications for the Google Assistant for real, you can submit your action for approval. Read this guide for more information.

What we've covered

What's next?

Enjoyed this codelab? Have a look at these great labs!