Hands-on: Create a TV guide action for the Google Assistant with Dialogflow and Actions on Google

1. Introduction

You are watching TV, but you can't find the remote. Or maybe you don't want to flip through every TV channel to figure out if there's something nice on television? Let's ask the Google Assistant what's on TV! In this lab you will build a simple action using Dialogflow and learn how to integrate it with the Google Assistant.

The exercises are ordered to reflect a common cloud developer experience:

  1. Create a Dialogflow v2 agent
  2. Create a custom entity
  3. Create intents
  4. Set up a webhook with Firebase Functions
  5. Test the chatbot
  6. Enable the Google Assistant integration

What you will build

We will build an interactive TV guide chatbot agent for the Google Assistant. You can ask the TV guide what's currently airing on a particular channel, for example: "What's on MTV?" The TV guide action will tell you what's currently playing and what will be on next.

What you'll learn

  • How to create a chatbot with Dialogflow v2
  • How to create custom entities with Dialogflow
  • How to create a linear conversation with Dialogflow
  • How to set up webhook fulfillments with Dialogflow and Firebase Functions
  • How to bring your application to the Google Assistant with Actions on Google

Prerequisites

  • You will need a Google identity / Gmail address to create a Dialogflow agent.
  • Basic knowledge of JavaScript is not required, but it can come in handy in case you want to change the webhook fulfillment code.

2. Getting set up

Enable Web Activity in your browser

  1. Click: http://myaccount.google.com/activitycontrols

  2. Make sure Web & App Activity is enabled.


Create a Dialogflow agent

  1. Open: https://console.dialogflow.com

  2. In the left bar, right under the logo, select "Create New Agent". If you have existing agents, click the dropdown first.


  3. Specify an agent name: your-name-tvguide (use your own name)


  4. As the default language, choose: English - en
  5. As the default time zone, choose the time zone closest to you.
  6. Click Create

Configure Dialogflow

  1. Click the gear icon in the left menu, next to your project name.


  2. Enter the following agent description: My TV Guide


  3. Scroll down to Log Settings and flip both switches, to log interactions in Dialogflow and to log all interactions in Google Cloud Stackdriver. We will need this later, in case we want to debug our action.


  4. Click Save

Configure Actions on Google

  1. In the right-hand panel, under See how it works in Google Assistant, click the Google Assistant link.


This will open: http://console.actions.google.com

If you are new to Actions on Google, you will need to fill out this form first.


  2. Open your action in the simulator by clicking on the project name.
  3. Select Test in the menu bar


  4. Make sure the simulator is set to English, and click Talk to my test app.

The action will greet you with the Dialogflow Default Welcome Intent. That means that setting up the integration with Actions on Google worked!

3. Custom Entities

Entities are objects your app or device takes action on. Think of them as parameters or variables. In our TV guide we will ask: "What's on MTV?" MTV is the entity and the variable. I could ask for other channels as well, such as National Geographic or Comedy Central. The gathered entity will be used as a parameter in the request to the TV Guide API web service.
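
For example, once you connect a webhook later in this lab, the matched entity arrives as a parameter in the Dialogflow v2 webhook request. A sketch of the relevant fragment (the values shown are illustrative):

{
  "queryResult": {
    "queryText": "What's on MTV?",
    "parameters": {
      "channel": "10",
      "time": ""
    },
    "intent": { "displayName": "Channel Intent" }
  }
}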

Here's more information on Dialogflow Entities.

Creating the Channel Entity

  1. In the Dialogflow console, click on the menu item: Entities
  2. Click Create Entity
  3. Entity name: channel (make sure it's all lowercase)
  4. Enter a channel name. (Some channels will need synonyms, in case the Google Assistant understands something else.) You can use the Tab and Enter keys while typing. Enter the channel number as the reference value and the channel names as synonyms, like:
  • 1 - 1, Net 1, Net Station 1


  5. Switch to Raw Edit mode by clicking on the menu button next to the blue Save button.


  6. Copy & paste the other entities in CSV format:
"2","2","Net 2, Net Station 2"
"3","3","Net 3, Net Station 3"
"4","4","RTL 4"
"5","5","Movie Channel"
"6","6","Sports Channel"
"7","7","Comedy Central"
"8","8","Cartoon Network"
"9","9","National Geographic"
"10","10","MTV"


  7. Hit Save

4. Intents

Dialogflow uses intents to categorize a user's intentions. Intents have Training Phrases, which are examples of what a user might say to your agent. For instance, a user who wants to know what's on TV might ask, "What is on TV today?", "What's currently playing?", or simply say "tvguide".

When a user writes or says something, referred to as a user expression, Dialogflow matches the user expression to the best intent in your agent. Matching an intent is also known as intent classification.

Here's more information on Dialogflow Intents.

Modifying the Default Welcome Intent

When you create a new Dialogflow agent, two default intents are created automatically. The Default Welcome Intent is the first flow you get to when you start a conversation with the agent. The Default Fallback Intent is the flow you get once the agent can't understand you or can't match an intent with what you just said.

  1. Click Default Welcome Intent

In the case of the Google Assistant, the conversation will auto-start with the Default Welcome Intent, because Dialogflow is listening for the Welcome event. However, you can also invoke the intent by saying one of the entered training phrases.


Here's the welcome message for the Default Welcome Intent:

User: "Ok Google, talk to your-name-tvguide."

Agent: "Welcome, I am the TV Guide agent. I can tell you what's currently playing on a TV channel. For example, you can ask me: What's on MTV?"

  2. Scroll down to Responses.
  3. Clear all Text Responses.
  4. Create one new text response, which contains the following greeting:

Welcome, I am the TV Guide agent. I can tell you what's currently playing on a TV channel. For example, you can ask me: What's on MTV?


  5. Click Save

Create a temporary test intent

For testing purposes, we will create a temporary test intent, so we can test the webhook later.

  1. Click on the Intents menu item again.
  2. Click Create Intent
  3. Enter the intent name: Test Intent (make sure you use a capital T and a capital I; if you spell the intent name differently, the back-end service won't work!)


  4. Click Add Training phrases and add the following:
  • Test my agent
  • Test intent


  5. Click Fulfillment > Enable Fulfillment


This time we are not hardcoding a response. The response will come from a cloud function!

  6. Flip the Enable Webhook call for this intent switch.


  7. Hit Save

Create the Channel Intent

The Channel Intent will contain this part of the conversation:

User: "What's on Comedy Central?"

Agent: "Currently on Comedy Central from 6 PM, The Simpsons is playing. Afterwards at 7 PM, Family Guy will start."

  1. Click on the Intents menu item again.
  2. Click Create Intent
  3. Enter the intent name: Channel Intent (make sure you use a capital C and a capital I; if you spell the intent name differently, the back-end service won't work!)
  4. Click Add Training phrases and add the following:
  • What's on MTV?
  • What's playing on Comedy Central?
  • What show will start at 8 PM on National Geographic?
  • What is currently on TV?
  • What is airing now?
  • Anything airing on Net Station 1 right now?
  • What can I watch at 7 PM?
  • What's on channel MTV?
  • What's on TV?
  • Please give me the tv guide.
  • Tell me what is on television.
  • What's on Comedy Central from 10 AM?
  • What will be on tv at noon?
  • Anything on National Geographic?
  • TV Guide


  5. Scroll down to Action and parameters


Notice the @channel & @sys.time entities that are known to Dialogflow. Later, the parameter names and parameter values will be sent to your webhook web service. For example:

channel=8

time=2020-01-29T19:00:00+01:00
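
In the webhook you will build in the next section, these parameters are read straight from the request body; these exact lines appear in the fulfillment code later on:

let channelInput = request.body.queryResult.parameters.channel;
let requestedTime = request.body.queryResult.parameters.time;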

  6. Mark channel as required

When you are having a conversation with the TV Guide agent, the channel parameter slot always needs to be filled. If the channel name wasn't mentioned at the start of the conversation, Dialogflow will keep prompting until it has filled all required parameter slots.

As prompts, enter:

  • For which TV channel do you want to hear the tv guide information?
  • In which TV channel are you interested?


  7. Do not set the time parameter as required.

Time will be optional. When no time is specified, the web service will default to the current time.
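
For illustration, the fulfillment code in the next section builds one of two request URLs against the TV guide web service, depending on whether a time was captured:

https://tvguide-e4s5ds5dsa-ew.a.run.app/channel/8           (no time: current listings)
https://tvguide-e4s5ds5dsa-ew.a.run.app/channel/8/19:00:00  (with time: listings from 7 PM)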

  8. Click Fulfillment

This time we are not hardcoding a response; the response will come from the cloud function. So flip the Enable Webhook call for this intent switch.

  9. Hit Save

5. Webhook Fulfillment

If your agent needs more than static intent responses, you need to use fulfillment to connect your web service to your agent. Connecting your web service allows you to take actions based on user expressions and to send dynamic responses back to the user. For example, if a user wants to receive the TV schedule for MTV, your web service can look it up in your database and respond to the user with the MTV schedule.
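
Behind the scenes, when a handler in your web service calls agent.add('...'), the dialogflow-fulfillment library replies with a Dialogflow v2 webhook response, which looks roughly like this (illustrative):

{
  "fulfillmentText": "This is a test message."
}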

  1. Click Fulfillment in the main menu
  2. Enable the Inline Editor switch


For simple webhook testing and implementation, you can use the inline editor. It makes use of serverless Cloud Functions for Firebase.

  3. Click on the index.js tab in the editor and copy & paste this piece of JavaScript (Node.js) code:
'use strict';

process.env.DEBUG = 'dialogflow:debug';

const {
  dialogflow,
  BasicCard,
  Button,
  Image,
  List
 } = require('actions-on-google');

const functions = require('firebase-functions');
const moment = require('moment');
const TVGUIDE_WEBSERVICE = 'https://tvguide-e4s5ds5dsa-ew.a.run.app/channel';
const { WebhookClient } = require('dialogflow-fulfillment');
var spokenText = '';
var results = null;


/* When the Test Intent gets invoked. */
function testHandler(agent) {
    let spokenText = 'This is a test message, when you see this, it means your webhook fulfillment worked!';

    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
        let conv = agent.conv();
        conv.ask(spokenText);
        conv.ask(new BasicCard({
            title: `Test Message`,
            subtitle: `Dialogflow Test`,
            image: new Image({
                url: 'https://dummyimage.com/600x400/000/fff',
                alt: 'Image alternate text',
            }),
            text: spokenText,
            buttons: new Button({
                title: 'This is a button',
                url: 'https://assistant.google.com/',
            }),
        }));
        // Add Actions on Google library responses to your agent's response
        agent.add(conv);
    } else {
        agent.add(spokenText);
    }
}

/* When the Channel Intent gets invoked. */
function channelHandler(agent) {
    var jsonResponse = `{"ID":10,"Listings":[{"Title":"Catfish Marathon","Date":"2018-07-13","Time":"11:00:00"},{"Title":"Videoclips","Date":"2018-07-13","Time":"12:00:00"},{"Title":"Pimp my ride","Date":"2018-07-13","Time":"12:30:00"},{"Title":"Jersey Shore","Date":"2018-07-13","Time":"13:00:00"},{"Title":"Jersey Shore","Date":"2018-07-13","Time":"13:30:00"},{"Title":"Daria","Date":"2018-07-13","Time":"13:45:00"},{"Title":"The Real World","Date":"2018-07-13","Time":"14:00:00"},{"Title":"The Osbournes","Date":"2018-07-13","Time":"15:00:00"},{"Title":"Teenwolf","Date":"2018-07-13","Time":"16:00:00"},{"Title":"MTV Unplugged","Date":"2018-07-13","Time":"16:30:00"},{"Title":"Rupauls Drag Race","Date":"2018-07-13","Time":"17:30:00"},{"Title":"Ridiculousness","Date":"2018-07-13","Time":"18:00:00"},{"Title":"Punk'd","Date":"2018-07-13","Time":"19:00:00"},{"Title":"Jersey Shore","Date":"2018-07-13","Time":"20:00:00"},{"Title":"MTV Awards","Date":"2018-07-13","Time":"20:30:00"},{"Title":"Beavis & Butthead","Date":"2018-07-13","Time":"22:00:00"}],"Name":"MTV"}`;
    var results = JSON.parse(jsonResponse);
    var listItems = {};
    spokenText = getSpeech(results);

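    // Build a list item for every listing, so the Assistant can render a visual list.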
    for (var i = 0; i < results['Listings'].length; i++) {
        listItems[`SELECT_${i}`] = {
            title: `${getSpokenTime(results['Listings'][i]['Time'])} - ${results['Listings'][i]['Title']}`,
            description: `Channel: ${results['Name']}`
        }
    }
    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
        let conv = agent.conv();
        conv.ask(spokenText);
        conv.ask(new List({
            title: 'TV Guide',
            items: listItems
        }));
        // Add Actions on Google library responses to your agent's response
        agent.add(conv);
    } else {
        agent.add(spokenText);
    }
}

/**
 * Return a text string to be spoken out by the Google Assistant
 * @param {object} JSON tv results
 */
var getSpeech = function(tvresults) {
    let s = "";
    if(tvresults['Listings'][0]) {
        let channelName = tvresults['Name'];
        let currentlyPlayingTime = getSpokenTime(tvresults['Listings'][0]['Time']);
        let laterPlayingTime = getSpokenTime(tvresults['Listings'][1]['Time']);
        s = `On ${channelName} from ${currentlyPlayingTime}, ${tvresults['Listings'][0]['Title']} is playing.
        Afterwards at ${laterPlayingTime}, ${tvresults['Listings'][1]['Title']} will start.`
    }

    return s;
}

/**
 * Return a natural spoken time
 * @param {string} time in 'HH:mm:ss' format
 * @returns {string} spoken time (like 8 30 pm i.s.o. 20:00:00)
 */
var getSpokenTime = function(time){
    let datetime = moment(time, 'HH:mm:ss');
    let min = moment(datetime).format('m');
    let hour = moment(datetime).format('h');
    let partOfTheDay = moment(datetime).format('a');

    if (min == '0') {
        min = '';
    }

    return `${hour} ${min} ${partOfTheDay}`;
};

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
    var agent = new WebhookClient({ request, response });

    console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
    console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
   
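    // Read the matched parameters and prepare the TV guide web service URL.
    // (In this first version the URL is not requested yet; the Channel Intent
    // still responds with hardcoded example data.)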
    let channelInput = request.body.queryResult.parameters.channel;
    let requestedTime = request.body.queryResult.parameters.time;
    let url = `${TVGUIDE_WEBSERVICE}/${channelInput}`;

    var intentMap = new Map();
    intentMap.set('Test Intent', testHandler);
    intentMap.set('Channel Intent', channelHandler);
    agent.handleRequest(intentMap);
});


  4. Click on the package.json tab in the editor and copy & paste this piece of JSON, which declares all the npm (Node Package Manager) dependencies:
{
  "name": "tvGuideFulfillment",
  "description": "Requesting TV Guide information from a web service.",
  "version": "1.0.0",
  "private": true,
  "license": "Apache Version 2.0",
  "author": "Google Inc.",
  "engines": {
    "node": "8"
  },
  "scripts": {
    "start": "firebase serve --only functions:dialogflowFirebaseFulfillment",
    "deploy": "firebase deploy --only functions:dialogflowFirebaseFulfillment"
  },
  "dependencies": {
    "actions-on-google": "^2.2.0",
    "firebase-admin": "^5.13.1",
    "firebase-functions": "^2.0.2",
    "request": "^2.85.0",
    "request-promise": "^4.2.5",
    "moment" : "^2.24.0",
    "dialogflow-fulfillment": "^0.6.1"
  }
}


  5. Click the Deploy button. It will take a moment, because it's deploying your serverless function. At the bottom of the screen, a popup will show the deployment status.
  6. Let's test the webhook to see if the code works. In the simulator on the right side, type:

Test my agent.

When everything is correct, you should see: "This is a test message".

  7. Let's test the Channel Intent. Now ask the question:

What's on MTV?

When everything is correct, you should see:

"On MTV from 4 30 pm, MTV Unplugged is playing. Afterwards at 5 30 pm, Rupauls Drag Race will start."

Optional Steps - Firebase

When you test this with a different channel, you will notice that the TV results are the same. This is because the cloud function is not fetching from a real web server yet.

In order to do this, we will need to make an outbound network connection. Cloud Functions on the free Spark plan can only reach Google services.

In case you want to test this application with the real web service, upgrade your Firebase plan to Blaze. Note: these steps are optional. You can also move on to the next steps of this lab and continue testing your application in Actions on Google.

  1. Navigate to the Firebase console: https://console.firebase.google.com

  2. At the bottom of the screen, press the Upgrade button


  3. Select the Blaze plan in the popup.

  4. Now that we know that the webhook works, we can continue and replace the code of index.js with the code below. This will make sure that you can request TV guide information from the web service:
'use strict';

process.env.DEBUG = 'dialogflow:debug';

const {
  dialogflow,
  BasicCard,
  Button,
  Image,
  List
 } = require('actions-on-google');

const functions = require('firebase-functions');
const moment = require('moment');
const { WebhookClient } = require('dialogflow-fulfillment');
const rp = require('request-promise');

const TVGUIDE_WEBSERVICE = 'https://tvguide-e4s5ds5dsa-ew.a.run.app/channel';
var spokenText = '';
var results = null;


/* When the Test Intent gets invoked. */
function testHandler(agent) {
    let spokenText = 'This is a test message, when you see this, it means your webhook fulfillment worked!';

    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
        let conv = agent.conv();
        conv.ask(spokenText);
        conv.ask(new BasicCard({
            title: `Test Message`,
            subtitle: `Dialogflow Test`,
            image: new Image({
                url: 'https://dummyimage.com/600x400/000/fff',
                alt: 'Image alternate text',
            }),
            text: spokenText,
            buttons: new Button({
                title: 'This is a button',
                url: 'https://assistant.google.com/',
            }),
        }));
        // Add Actions on Google library responses to your agent's response
        agent.add(conv);
    } else {
        agent.add(spokenText);
    }
}

/* When the Channel Intent gets invoked. */
function channelHandler(agent) {
    var listItems = {};
    spokenText = getSpeech(results);

    // Guard against a failed or empty web service call; results stays null in that case.
    if (results) {
        for (var i = 0; i < results['Listings'].length; i++) {
            listItems[`SELECT_${i}`] = {
                title: `${getSpokenTime(results['Listings'][i]['Time'])} - ${results['Listings'][i]['Title']}`,
                description: `Channel: ${results['Name']}`
            }
        }
    }
    if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
        let conv = agent.conv();
        conv.ask(spokenText);
        conv.ask(new List({
            title: 'TV Guide',
            items: listItems
        }));
        // Add Actions on Google library responses to your agent's response
        agent.add(conv);
    } else {
        agent.add(spokenText);
    }
}

/**
 * Return a text string to be spoken out by the Google Assistant
 * @param {object} JSON tv results
 */
var getSpeech = function(tvresults) {
    let s = "";
    if(tvresults && tvresults['Listings'][0]) {
        let channelName = tvresults['Name'];
        let currentlyPlayingTime = getSpokenTime(tvresults['Listings'][0]['Time']);
        let laterPlayingTime = getSpokenTime(tvresults['Listings'][1]['Time']);
        s = `On ${channelName} from ${currentlyPlayingTime}, ${tvresults['Listings'][0]['Title']} is playing.
        Afterwards at ${laterPlayingTime}, ${tvresults['Listings'][1]['Title']} will start.`
    }

    return s;
}

/**
 * Return a natural spoken time
 * @param {string} time in 'HH:mm:ss' format
 * @returns {string} spoken time (like 8 30 pm i.s.o. 20:00:00)
 */
var getSpokenTime = function(time){
    let datetime = moment(time, 'HH:mm:ss');
    let min = moment(datetime).format('m');
    let hour = moment(datetime).format('h');
    let partOfTheDay = moment(datetime).format('a');

    if (min == '0') {
        min = '';
    }

    return `${hour} ${min} ${partOfTheDay}`;
};

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
    var agent = new WebhookClient({ request, response });

    console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
    console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
   
    let channelInput = request.body.queryResult.parameters.channel;
    let requestedTime = request.body.queryResult.parameters.time;
    let url = `${TVGUIDE_WEBSERVICE}/${channelInput}`;

    if (requestedTime) {
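        // @sys.time arrives as an ISO 8601 string with a timezone offset,
        // e.g. 2020-01-29T19:00:00+01:00. Extract that offset, shift the
        // UTC moment back to the user's local wall-clock time, and format
        // it as HH:mm:ss for the web service path.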
        console.log(requestedTime);
        let offsetMin = moment().utcOffset(requestedTime)._offset;
        console.log(offsetMin);
        let time = moment(requestedTime).utc().add(offsetMin,'m').format('HH:mm:ss');
        url = `${TVGUIDE_WEBSERVICE}/${channelInput}/${time}`;
      }
    
      console.log(url);
  
      var options = {
          uri: encodeURI(url),
          json: true
      };
       
      // request-promise calls a URL and returns the parsed JSON response.
      rp(options)
        .then(function(tvresults) {
            console.log(tvresults);
            // the JSON response will need to be formatted into 'spoken' text strings.
            spokenText = getSpeech(tvresults);
            results = tvresults;
        })
        .catch(function (err) {
            console.error(err);
        })
        .finally(function(){
            // kick start the Dialogflow app
            // based on an intent match, execute
            var intentMap = new Map();
            intentMap.set('Test Intent', testHandler);
            intentMap.set('Channel Intent', channelHandler);
            agent.handleRequest(intentMap);
        });
});
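
Note the order of execution here: the request to the TV guide web service runs first, and agent.handleRequest(intentMap) is only called in the .finally() callback. That way the global results and spokenText variables are already populated by the time channelHandler runs.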

6. Actions on Google

Actions on Google is a development platform for the Google Assistant. It allows third-party development of "actions": applets for the Google Assistant that provide extended functionality.

You invoke a Google action by asking Google to open or talk to an app.

This will open your action, the voice will change, and you leave the 'native' Google Assistant scope. That means everything you ask your agent from this point on needs to be handled by the conversation you created. You can't suddenly ask the Google Assistant for Google weather information; if that's what you want, you should first leave (close) the scope of your action (your app).

Testing your action in the Google Assistant simulator

Let's test the following conversation:

User: "Hey Google, talk to your-name-tv-guide."

Google Assistant: "Sure. Let me get your-name-tv-guide."

Your-Name-TV-Guide Agent: "Welcome, I am the TV Guide agent...."

User: "Test my agent"

Agent: "This is a test message, when you see this, it means your webhook fulfillment worked!"

User: "What's on MTV?"

Agent: "On MTV from 4 30 pm, MTV Unplugged is playing. Afterwards at 5 30 pm, Rupauls Drag Race will start."

  1. Switch back to the Google Assistant simulator

Open: https://console.actions.google.com

  2. Click on the microphone icon and ask the following:


  • Talk to my test agent
  • Test my agent

The Google Assistant should respond with the test message.

  3. Now let's ask:
  • What's on Comedy Central?

This should return:

Currently on Comedy Central from 6 PM, The Simpsons is playing. Afterwards at 7 PM, Family Guy will start.

7. Congratulations

You have created your first Google Assistant action with Dialogflow, well done!

As you might have noticed, your action was running in test mode, which is tied to your Google account. If you log in on your Nest device, or on the Google Assistant app on your iOS or Android phone, with the same account, you can test your action there as well.

Now, this is a workshop demo. But when you are building applications for the Google Assistant for real, you can submit your action for approval. Read this guide for more information.

What we've covered

  • How to create a chatbot with Dialogflow v2
  • How to create custom entities with Dialogflow
  • How to create a linear conversation with Dialogflow
  • How to set up webhook fulfillments with Dialogflow and Firebase Functions
  • How to bring your application to the Google Assistant with Actions on Google

What's next?

Enjoyed this codelab? Have a look at these great labs!

Continue this codelab by integrating it with Google Chat:

Create a TV guide Google Chat with G Suite and Dialogflow