Hands-on: Women in Voice Workshop

1. Introduction

In this lab you will build a simple action using Dialogflow and learn how to integrate it with the Google Assistant.

The exercises are ordered to reflect a common cloud developer experience:

  1. Create a Dialogflow v2 agent
  2. Create entities
  3. Create intents
  4. Set up a webhook with GCP Cloud Functions
  5. Use the knowledge base to import FAQs
  6. Test the chatbot
  7. Enable the Google Assistant integration

What you will build

We will build a Google Assistant app for the Women in Voice meetup group. Users will be able to ask when the next meetup is, request article and book tips, or ask general questions about the meetup group.

What you'll learn

  • How to create a chatbot with Dialogflow v2
  • How to create a linear conversation with Dialogflow
  • How to make use of entities
  • How to make use of the knowledge base
  • How to set up webhook fulfillments with Dialogflow and Google Cloud Functions
  • How to bring your application to the Google Assistant with Actions on Google

Prerequisites

2. Getting set up

Enable Web & App Activity

  1. Click: http://myaccount.google.com/activitycontrols
  2. Make sure Web & App Activity is enabled:

bf8d16b828d6f79a.png

Create a Dialogflow agent

  1. Open: https://console.dialogflow.com

  2. In the left bar, right under the logo, select "Create New Agent". If you have existing agents, click the dropdown first.

1d7c2b56a1ab95b8.png

  3. Specify an agent name: yourname-wiv (use your own name)
  4. As the default language, choose: English - en.
  5. As the default time zone, choose the time zone closest to you.
  6. Do not select Mega Agent. (With this feature you can create an overarching agent that orchestrates between "sub" agents. We don't need this now.)
  7. Click Create

382af956cbf308a4.png

Configure Dialogflow

  1. Click on the gear icon, in the left menu, next to your project name.

1d7c2b56a1ab95b8.png

  2. Enter the following agent description: Women in Voice agent
  3. Scroll down to Beta Features and flip the switch to enable beta features.
  4. Scroll down to Log Settings and flip both switches, to log interactions in Dialogflow and to log all interactions in Google Cloud Stackdriver. We will need this later, in case we want to debug our action.

e80c17acc3cce993.png

  5. Click Save
  6. Click Done

Configure Actions on Google

  1. In the right-hand panel, under See how it works in Google Assistant, click the Google Assistant link.

5a4940338fc351e3.png

This will open: http://console.actions.google.com

NOTE: Make sure you are logged in with the same Google account as in Dialogflow.

If you are new to Actions on Google, you will need to go through this form first:

3fd4e594fa169072.png

  2. Open your action in the simulator by clicking on the project name.
  3. Select Develop in the menu bar
  4. Uncheck Match user's language setting, to make sure the text-to-speech synthesizer won't be overruled by the Assistant's default language.

3b6bc284050571f5.png

  5. Click Save
  6. Select Test in the menu bar

dd55b141677932fe.png

  7. Make sure the simulator is set to English and click Talk to my test app

The action will greet you with the basic Dialogflow Default Welcome Intent. That means that setting up the integration with Actions on Google worked!

Configure Google Cloud

For this tutorial you will need a GCP account with a billing account. If you don't have one yet, you can create one with these steps.

Normally a billing account requires a payment method such as a credit card. For this workshop, we can make use of workshop credits, which lets you skip this step.

  1. Navigate to this URL and log in:

gcpcredits.com/wivnl

  2. Click: Click here and access your credits
  3. Click Accept & Continue

You are all set. You've created a billing account with 25 dollars in credits, which should be more than enough to use Cloud Functions for a long time.

Enable Google Sheets API

If your agent needs more than static intent responses (for example to fetch data from a web service, database or Sheet), you need to use fulfillment to connect your web service to your agent. Connecting your web service allows you to take actions based on user expressions and send dynamic responses back to the user.

For example, if a user wants to receive a blog or book tip, your web service can check in your database and respond to the user with an article to read.

In this tutorial we won't make use of a database, instead we will make use of a Google Sheet. Once the sheet gets updated, the Google Assistant action will be updated as well. Neat!
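
Before we wire everything up in the Webhook Fulfillment section, here is a minimal sketch of what a Dialogflow v2 fulfillment webhook looks like as a Node.js Cloud Function. This is illustrative only; the full function we deploy later replaces it.

// Minimal sketch of a Dialogflow v2 fulfillment webhook (illustrative only;
// the real function is built in the Webhook Fulfillment section).
exports.dialogflow = (request, response) => {
  // Dialogflow sends the matched intent in queryResult.intent.displayName.
  const intent = request.body.queryResult.intent.displayName;

  // Returning fulfillmentText is enough for a simple dynamic text response.
  response.json({
    fulfillmentText: `You reached the "${intent}" intent via the webhook.`
  });
};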

  1. Open this Google Sheet in a new browser tab, if you haven't done so already: https://docs.google.com/spreadsheets/d/1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY/edit#gid=1240329448
  2. IMPORTANT: Make a copy of this sheet. Click File > Make a Copy
  3. Once the sheet has been copied, click Share
  4. We will need to give the Dialogflow service account edit rights. To do this, open Dialogflow > Settings (cog wheel).
  5. Scroll down to Google Project
  6. Copy the service account (email) address. It should look something like this: dialogflow-<someid>@<my-gcp-project>.iam.gserviceaccount.com

8bc778a04efb3dd2.png

  7. Paste this service account in the Share popup of Google Sheets, and give it Edit rights.

e296b9c069c2028e.png

  8. Next, we will need to remember the Sheet ID of the copy we are currently working in.

f9061a3724086bf7.png

    The Sheets URL will look something like this:

https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448

    We are only interested in the Sheet ID, which is the part between https://docs.google.com/spreadsheets/d/ and /edit#gid=1240329448 (without the slashes).

    So it will look something like this: 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o

    Write this Sheet ID down, or copy it to Notepad. In the Webhook steps we will use it again. (See the short snippet at the end of this section for how to pull the ID out of a Sheets URL programmatically.)
  9. Open http://console.cloud.google.com in another browser tab. (In case you have more Google Cloud projects, make sure the new Dialogflow project, yourname-wiv, is active.)
  10. In the search bar, search for: Google Sheets API

8b42de259eb40547.png

  11. Click it, and click the Enable button at the top to enable the Google Sheets API.

4b41a64a6cd5a37e.png
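
As an aside, here is a tiny, purely illustrative JavaScript helper (the function name is made up for this example) that pulls the Sheet ID out of a full Sheets URL:

// Hypothetical helper: extract the Sheet ID from a full Google Sheets URL.
// The ID is the path segment between '/d/' and the next '/'.
function extractSheetId(url) {
  const match = url.match(/\/d\/([a-zA-Z0-9-_]+)/);
  return match ? match[1] : null;
}

console.log(extractSheetId(
  'https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448'
));
// -> 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o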

3. Custom Entities

Entities are objects your app or device takes actions on. Think of them as parameters / variables. In our action we will ask:

"I want a reading tip about chatbots" / "I want a reading tip about voice"

Whether you say Chatbots, Voice or Both, the value will be captured by a custom entity and used as a parameter in the request to our web service.

Here's more information on Dialogflow Entities.
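
For example, once the custom entity we create below is matched in a phrase, its resolved value shows up in the parameters of the Dialogflow v2 queryResult that our webhook will later receive. A rough, illustrative fragment (values are examples, not real output):

// Illustrative fragment of a Dialogflow v2 queryResult after matching
// "I want a reading tip about voice".
const exampleQueryResult = {
  queryText: 'I want a reading tip about voice',
  parameters: {
    tech: 'Voice'   // resolved value of the custom @tech entity
  },
  allRequiredParamsPresent: true
};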

Creating the Tech Entity

  1. Click in the Dialogflow Console on the menu item: Entities
  2. Click Create Entity
  3. Entity name: tech (make sure it's all lowercase)
  4. Specify the options with the synonyms. (You can tab through the interface.)
  • Chatbots - Chatbots, Chat, Web
  • Voice - Voice, Voicebots, Voice Assistants
  • Both - Both, All

f9b213472a75915b.png

  5. Switch to Raw Edit mode by clicking on the menu button next to the blue Save button.

e294b49b123e034f.png

  6. Notice that you could have entered all the entity values in CSV format as well. This can be handy when you have a lot of entities to create.
"Chatbots","Chatbots","Chat","Web"
"Voice","Voice","Voicebots","Voice Assistants"
"Both","Both","All"

6cfaa328bcd2bad3.png

  7. Hit Save

4. Intents

Dialogflow uses intents to categorize a user's intentions. Intents have Training Phrases, which are examples of what a user might say to your agent.

For instance, a user who wants to know when the next event is might ask:

"When is the next meetup?"

When a user writes or says something, referred to as a user expression, Dialogflow matches the user expression to the best intent in your agent. Matching an intent is also known as intent classification.

Here's more information on Dialogflow Intents.
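
To make intent classification a bit more concrete: after matching, the Dialogflow v2 response exposes the matched intent and a confidence score in the queryResult. A rough, illustrative fragment (values are examples):

// Illustrative fragment of a Dialogflow v2 queryResult after intent matching.
const exampleQueryResult = {
  queryText: 'When is the next meetup?',
  intent: { displayName: 'Meetup Intent' },  // the matched intent
  intentDetectionConfidence: 0.87            // example confidence score
};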

Modifying the Default Welcome Intent

When you create a new Dialogflow agent, two default intents are created automatically. The Default Welcome Intent is the first flow you get when you start a conversation with the agent. The Default Fallback Intent is the flow you get when the agent can't understand you or can't match an intent to what you just said.

  1. Click Intents > Default Welcome Intent

In the case of the Google Assistant, it will auto-start with the Default Welcome Intent. This is because Dialogflow is listening to the Welcome event. However, you can also invoke the intent by saying one of the entered training phrases.

17610dbd5450e53.png

Here's the welcome message for the Default Welcome Intent:

User: "Ok Google, talk to <yourname>-WIV"

Agent: "Hey there, I'm Anna, the virtual agent of Women in Voice. You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?"

  2. Scroll down to Responses.
  3. Clear all Text Responses.
  4. In the default tab, create the following 3 responses. (Click Add Responses > Text or SSML Response for each new line:)
  • Hey there, I'm Anna, the virtual agent of Women in Voice.
  • You can ask me for information about meetups, Women in Voice or a reading tip.
  • What would you like to know?

The configuration should be similar to this screenshot.

a0078ea79188dcb3.png

  5. The previous output is used for chatbots. We can modify the output a bit specifically for the Google Assistant, using SSML (Speech Synthesis Markup Language) to build pauses into our sentences. Click the Google Assistant tab.
  • Do not enable the Default toggle, as we won't re-use the chatbot message.
  • Click Add Responses > Simple Response
  • Add the following text version:

Hey there, I'm Anna, the virtual agent of Women in Voice.

You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?

  • Then click Customize audio output
  • And add the following SSML version:

<speak><p><s>Hey there, I'm Anna, the virtual agent of Women in Voice.</s><s>You can ask me for information about meetups, Women in Voice or a reading tip.</s></p><break time="500ms"/><p><s>What would you like to know?</s></p></speak>

The configuration should be similar to this screenshot.

62f0f58753463fbe.png

  6. Click Save

Here you can find more information about SSML for Actions on Google.

  7. Let's test this intent. First we can test it in the Dialogflow Simulator.

Type: Hello. It should return this message.

12d40056fbd25dfe.png

  8. Now, switch back to the Actions on Google console.

(You might want to keep this in another tab.)

Click: "Talk to my test app." And listen to the new welcome message.

Modifying the Default Fallback Intent

  1. Click Intents > Default Fallback Intent
  2. Scroll down to Responses.
  3. Clear all Text Responses.
  4. In the default tab, create the following responses, each on a new line, so the agent can vary between these options:
  • Sorry, can you repeat this?
  • I didn't understand you. You can ask me questions about Women in Voice, a book or article tip or when the next meetup will be.

bdecc217bafff97b.png

  5. Click Save

Note: when you don't enter a Google Assistant-specific response, it will take the default text response.

Create the Stop Intent

  1. Click on the Intents menu item.
  2. Click Create Intent
  3. Enter the Intent Name: Stop Intent
  4. Click Add Training phrases
  • No
  • That's it
  • Bye
  • I don't want that
  • Goodbye
  • It's ok for now
  • Quit
  • I want to stop
  • Close this
  • End the conversation

7ec6455cabdf7e36.png

  5. Scroll down to Responses > Add Response
  6. Add the following text options:
  • Alright! Hopefully we will see you at one of our meetups!
  • No problem. See you at one of our meetups!
  7. Flip the switch: Set this intent as the end of conversation. This will make sure that once this intent gets matched, the Google Assistant action is closed.

ba532398680d457d.png

  8. Click Save.

Create the Meetup Intent

The Meetup Intent will contain this part of the conversation:

User: "When is the next meetup?"

Agent: "The next meetup will be <date> at <time> in <location>. The topic will be <topic>. And the speakers are: <speakers>. You can register via our newsletter."

  1. Click on the Intents menu item.
  2. Click Create Intent
  3. Enter the Intent Name: Meetup Intent (make sure you use a capital M and a capital I; if you spell the intent name differently, the back-end service won't work!)
  4. Click Add Training phrases
  • When is the next meetup?
  • Do you have any events?
  • Which events are in the planning?
  • Are there meetup events soon?
  • I would love to attend a meetup
  • Can I join a virtual meetup?
  • When will you get together?
  • Can I join?
  • What does your calendar look like?
  5. Click Fulfillment > Enable Fulfillment

7eb73ba04d76140e.png

  6. Flip the Enable Webhook call for this intent switch.

748a82d9b4d7d253.png

  7. Hit Save

Create the Tip Intent

The Tip Intent will contain this part of the conversation:

User: "I want a reading tip."

Agent: "Do you want to read more about Chatbots, Voice or Both?"

User: "Voice"

Agent: "Alright, here's the tip of the day! The <type> <title> of <author>. Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

  1. Click on the Intents menu item again.
  2. Click Create Intent
  3. Enter the Intent Name: Tip Intent (make sure you use a capital T and a capital I; if you spell the intent name differently, the back-end service won't work!)
  4. Click Add Training phrases and add the following:
  • Can I get a tip for an article?
  • I would like to receive a reading tip
  • Any book tips?
  • What's nice to read?
  • I want to learn more about Chatbots, what should I read?
  • What are nice blogs?
  • Do you have book suggestions?
  • I want to receive information about Both
  • Can I have Chatbots reading tip
  • I would like to read more about Voice
  • Voice please
  • Both are okay.
  • Reading tip
  • Tip
  • Blog
  • Article
  • Book
  • Book suggestions
  • Yes
  • Yeah
  • Another tip
  • Yes one more
  5. Scroll down to Action and parameters
  6. Mark tech as required

7cdf7fdf5d2c3fbe.png

  7. Click Define Prompt and enter:

  • Do you want to read more about Chatbots, Voice or Both?
  8. Click Fulfillment > Enable Fulfillment

This time we are not hardcoding a response. The response will come from the Cloud Function!

7eb73ba04d76140e.png

  9. Flip the Enable Webhook call for this intent switch.

748a82d9b4d7d253.png

  10. Hit Save

5. Knowledge Connectors

Knowledge connectors complement defined intents. They parse knowledge documents to find automated responses (for example, FAQs or articles from CSV files, online websites or even PDF files!). To configure them, you define one or more knowledge bases, which are collections of knowledge documents.

Read more about Knowledge Connectors.
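
For reference, an FAQ document with Knowledge Type FAQ and Mime Type CSV is expected to be a simple two-column file: a question column and an answer column. A couple of purely illustrative rows (not the exact contents of the workshop sheet):

"What is Women in Voice?","Women in Voice is a meetup community around voice technology."
"How do I register for a meetup?","You can register via our newsletter."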

Let's try this out.

  1. In the top menu, select the en tag to select the English language.
  2. Select Knowledge (beta) in the menu.

e0a3d2e03dd1076c.png

  3. Click the blue Create Knowledge Base button on the right
  4. Enter Women in Voice as the Knowledge Base name and hit Save.
  5. Click the Create the first one link

9b2ca6b25c4100ab.png

  6. This will open a window.

Use the following config:

Document Name: Women in Voice FAQ Sheet

Knowledge Type: FAQ

Mime Type: CSV

  7. We will need the data from this sheet. Make sure the data sheet is open, and select the FAQ tab.
  8. Select File > Download > CSV

e7f0066408fc1721.png

  9. Back in Dialogflow, click Upload File from Computer and select the CSV file you have downloaded. Click Create

A knowledge base has been created:

64513e2e484dda31.png

  10. Click Add Response

Create the following answer and hit Save:

$Knowledge.Answer[1]

  11. Click View Detail

This will display all the FAQs you have implemented in Dialogflow.

That's easy!

Note that you could also point to an online HTML website with FAQs to import them into your agent. It's even possible to upload a PDF with a block of text, and Dialogflow will come up with questions itself.

  12. Click on Knowledge (beta) in the Dialogflow menu to go back to all the knowledge base connectors.
  13. It's possible to adjust the strength of the knowledge base relative to your own intents. This is useful when you notice that your FAQ answers are winning from (or losing to) your own intents. Since we don't have many intents, let's make our knowledge base a bit stronger: change the scale to -0.2. After dragging the slider, the value is saved automatically.

FAQs should be seen as 'extras' to add to your agent, next to your intent flows. Knowledge base FAQs don't train the model, so asking a question in a completely different way might not get a match, because they make no use of Natural Language Understanding (machine learning models). This is why it's sometimes worth converting your FAQs to intents.

6. Webhook Fulfillment

Create a Google Cloud Function

  1. Navigate to http://console.cloud.google.com in another browser tab.
  2. In the left menu, select Cloud Functions
  3. Click Create Function

bf2441ba1271a95e.png

  4. Specify the following configuration:
  • Name: dialogflow
  • Memory allocated: 256MiB
  • Trigger: HTTP
  • Copy the URL to your clipboard.
  • Select Inline Editor
  • Runtime: NodeJS 8
  • Function to execute: dialogflow
  5. Make sure this authentication checkbox is checked:

317140d44ec3299c.png

61ecb8f57a6fd21b.png

  6. Here are the contents for package.json. Copy and paste them into the package.json tab of the editor.

This file loads the correct npm libraries into Google Cloud:

{
 "name": "dialogflow",
 "description": "Cloud Functions",
 "engines": {
   "node": "8"
 },
 "dependencies": {
   "request": "^2.85.0",
   "request-promise": "^4.2.5",
   "dialogflow-fulfillment": "^0.6.1",
   "actions-on-google": "^2.2.0",
   "googleapis": "^48.0.0",
   "moment": "^2.24.0"
 },
 "devDependencies": {
   "eslint": "^5.12.0",
   "eslint-plugin-promise": "^4.0.1",
   "ngrok": "^3.2.7"
 },
 "private": true
}
  7. Here are the contents for index.js. Copy and paste them into the index.js tab of the editor.

This code integrates with the googleapis library to fetch data from a Google Sheet. It makes use of the actions-on-google library to display cards on a Google Assistant device, the dialogflow-fulfillment library to handle Dialogflow intents, and the moment library to handle date and time objects.

/* jshint esversion: 8 */
'use strict';

process.env.DEBUG = 'dialogflow:debug';

const ACCOUNTS_SHEET_ID = '1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY';

const {
 BasicCard,
 Button,
} = require('actions-on-google');

const {google} = require('googleapis');
const moment = require('moment');
moment.locale('nl');
const { WebhookClient } = require('dialogflow-fulfillment');
var books;
var meetups;

const SHEETS_SCOPE = 'https://www.googleapis.com/auth/spreadsheets.readonly';

/**
* Authenticates the Sheets API client for read-only access.
*
* @return {Object} sheets client
*/
async function getSheetsClient() {
   // Uses Application Default Credentials: on Cloud Functions this resolves to
   // the function's runtime service account, which we gave access to the Sheet.
   const auth = await google.auth.getClient({
       scopes: [SHEETS_SCOPE],
   });
   return google.sheets({version: 'v4', auth});
}

/**
* Return a natural spoken date
* @param {string} date in 'YYYY-MM-DD' format
* @returns {string}
*/
var getSpokenDate = function(date){
   let datetime = moment(date, 'YYYY-MM-DD');
   return `${datetime.format('D MMMM')}`;
};


/* When the tipIntent Intent gets invoked.  */
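/* Sheet columns used below (inferred from how they are read):
   [0] title, [1] author, [2] tech category, [6] type (e.g. Book or Article), [8] link. */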
function tipIntent(agent) {
 var par = agent.parameters.tech;
   var selection = [];
   //console.log(par);
   //console.log(books);
    for(var i = 0; i<books.length; i++){
     if(books[i][2].toLowerCase() == par.toLowerCase()) {
         selection.push(books[i]);
       }
   }
    var random = Math.floor(Math.random() * selection.length);
   var booktip = selection[random];
   //console.log(selection[random]);

   let spokenText = `<p><s>Alright, here's the tip of the day!</s></p><p>The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.</p>`;
   let writtenText = `Alright, here's the tip of the day! The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`;
   //console.log(booktip[8]);
    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
       let conv = agent.conv();
       conv.ask(`<speak>${spokenText}</speak>`);
       conv.ask(new BasicCard({
           title: `Tip of the day!`,
           subtitle: `${par}`,
           text: `The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`,
           buttons: new Button({
               title: 'Read',
               url: `${booktip[8]}`,
           })
       }));
       conv.ask(`<speak><p><s>Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?</s></p></speak>`);
       // Add Actions on Google library responses to your agent's response
       agent.add(conv);
   } else {
       agent.add(writtenText + ' Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?');
   }
}


function meetupIntent(agent) {
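 // Sheet columns (inferred from usage): [0] date (YYYY-MM-DD), [1] time,
 // [2] topic, [3] location.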
 let conv = agent.conv();
 let record;
  console.log(meetups);
  for(var i = 0; i<meetups.length; i++){
   let d = moment(meetups[i][0], 'YYYY-MM-DD');
   let today = moment(new Date());

   if(moment(d).isSameOrAfter(today)) {
     // the i event is not in the past
     record = meetups[i];
     console.log(record);
     break; 
   }
 }
  let date = getSpokenDate(record[0]);
 let spokenText1 = `The next meetup will be ${date} at ${record[1]} in ${record[3]}.`;
 let spokenText2 = `The topic will be <emphasis level="moderate">${record[2]}.</emphasis>`;
 let spokenText3 = `You can register via our newsletter.`;
  let writtenText = `${spokenText1} The topic will be ${record[2]}. ${spokenText3}`;

 if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
   conv.ask(`<speak>${spokenText1} ${spokenText2} ${spokenText3}</speak>`);
   conv.ask(new BasicCard({
     title: `Meetup`,
     subtitle: `${record[2]}`,
     text: `${record[0]} ${record[1]} - ${record[3]}`,
     buttons: new Button({
       title: 'Register',
       url: `http://www.meetup.com`
     })
   }));
   conv.ask('<speak><p><s>Is there anything else I can help you with?</s></p></speak>');
   agent.add(conv);
 } else {
   agent.add(`${writtenText} Is there anything else I can help you with?`);
 }
}

exports.dialogflow = async (request, response) => {
 var agent = new WebhookClient({ request, response });

 console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
 console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
  const client = await getSheetsClient();   
 const allBooks = await client.spreadsheets.values.get({
   spreadsheetId: ACCOUNTS_SHEET_ID,
   range: 'Books&Blogs!A:I',
 });
 const allEvents = await client.spreadsheets.values.get({
   spreadsheetId: ACCOUNTS_SHEET_ID,
   range: 'Meetups!A:D',
 });
  books = allBooks.data.values;
 meetups = allEvents.data.values;
 books.shift();
 meetups.shift();

 var intentMap = new Map();
 intentMap.set('Tip Intent', tipIntent);
 intentMap.set('Meetup Intent', meetupIntent);
 agent.handleRequest(intentMap);
};

b130ee596061832c.png

  8. Click the Environment variables, networking, timeouts and more link

e81c29549f696937.png

  9. Select the Dialogflow Integrations service account.

(By default it uses the App Engine default service account, but it should be the same service account as the one you shared your Google Sheet with in the first steps of this tutorial.)

61ecb8f57a6fd21b.png

  10. Before we deploy the Cloud Function, we will change one line of code in the index.js tab, the ACCOUNTS_SHEET_ID constant near the top:

const ACCOUNTS_SHEET_ID = '1Yo_E8KONgSiUm00ZmTOqtjXCwULmc2JuI3sjxRyvrkE';

In one of the first steps, you wrote this Sheet ID down or copied it to Notepad. Copy and paste your own Sheet ID into your code.

  11. Now we are ready. Click the Create button. It will take a moment, because it's deploying your serverless function.

Enable fulfillments in Dialogflow

  1. Switch back to Dialogflow
  2. Click Fulfillment in the main menu
  3. Enable the Webhook switch.
  4. Enter the URL of the Cloud Function that you copied to your clipboard earlier.

For example: https://us-central1-leeboonstra-wiv-uhtefa.cloudfunctions.net/dialogflow

  5. Click Save.

89dfd437c6689538.png

  6. Let's test the webhook, to see if the code works, by testing the flows directly in the Dialogflow simulator.

297e8f7ed1b9e801.png

7. Actions on Google

Actions on Google is a development platform for the Google Assistant. It allows the third-party development of "actions"—applets for the Google Assistant that provide extended functionality.

You invoke a Google Action by asking Google to open or talk to an app.

This will open your action, change the voice, and leave the 'native' Google Assistant scope. That means everything you ask your agent from this point on needs to be handled by your own action. Within your own action you can't suddenly ask the Google Assistant for weather information; if that's what you want, you have to leave (close) the scope of your current action (your app) first.

Testing your action in the Google Assistant simulator

Let's test the following conversation:

User: "Hey Google, talk to my test app."

Google Assistant: "Getting the test version of my app."

Your-Name-WIV: "Hey there, I'm Anna, the virtual agent of Women in Voice. You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?"

User: "Can I have a tip for a blog?"

Your-Name-WIV: "Do you want to read more about Chatbots, Voice or Both?"

User: "Chatbots"

Your-Name-WIV: "Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling. Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

User: "When is the next meetup?"

Your-Name-WIV: "The next meetup will be 29 april at 6:00 PM in a virtual Hangout. The topic will be Online Workshop: Create Google Actions with Dialogflow! You can register via our newsletter. Is there anything else I can help you with?"

User: "What's the url for your website?"

Your-Name-WIV: "You can find us at www.womeninvoice.com. Is there something else I can do for you?"

User: "No that's it, thanks"

Your-Name-WIV: "Alright! Hopefully we will see you at one of our meetups!"

  1. Switch back to the Google Assistant simulator

Open: https://console.actions.google.com

  2. Click on the microphone icon and ask the following:

aa43d7f1568c4d80.png

  • Talk to my test app.
  3. Now let's ask:
  • Can I have a tip for a blog?

This should return:

"Do you want to read more about Chatbots, Voice or Both?"

  • Chatbots

"Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling.

Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

  4. Let's try a different version of the same question:
  • "Yes, I want to read more about Voice"

"Alright, here's the tip of the day! The Book Designing Voice User Interfaces: Principles of Conversational Experiences. ... of Cathy Pearl.

Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"

Notice that you have never used this training phrase in Dialogflow before. It just matched the right intent.

Also notice that you didn't get a followup question, because you provided enough information for Dialogflow to continue.

  5. Continue the conversation with the following phrases:
  • What's the URL for your website
  • Bye

a0bd5578d6833c98.png

Errors? Check the logs!

Every time you use console.log() in your Cloud Function code, data is written to your GCP logs (Stackdriver). You can access these logs by opening the Cloud Console > Logging.

In the first dropdown, you can select Cloud Function > dialogflow to filter for your logs.
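
If you prefer the command line and have the gcloud CLI installed and configured for your project, a command along these lines should show the most recent logs for the function:

gcloud functions logs read dialogflow --limit=50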

8. Congratulations

You have created your first Google Assistant action with Dialogflow, well done!

As you might have noticed, your action was running in test mode, which is tied to your Google account. If you log in on your Nest device or on the Google Assistant app on your iOS or Android phone with the same account, you can test your action there as well.

This is a workshop demo, but when you are building applications for the Google Assistant for real, you can submit your Action for approval. Read this guide for more information.

What we've covered

  • How to create a chatbot with Dialogflow v2
  • How to create custom entities with Dialogflow
  • How to create a linear conversation with Dialogflow
  • How to set up webhook fulfillments with Dialogflow and Google Cloud Functions
  • How to bring your application to the Google Assistant with Actions on Google

What's next?

Enjoyed this codelab? Have a look at these other great labs!