
This codelab will teach you how to build a Cast-enabled Receiver app to play content on a Google Cast device.

What is Google Cast?

Google Cast allows users to cast content from a mobile device to a TV. Users can then use their mobile device or desktop Chrome Browser as a remote control for media playback on the TV.

The Google Cast SDK allows your app to control Google Cast enabled devices (e.g. a TV or sound system). The Cast SDK provides you with the necessary UI components based on the Google Cast Design Checklist.

The Google Cast Design Checklist is provided to make the Cast user experience simple and predictable across all supported platforms.

What are we going to be building?

When you have completed this codelab, you will have an HTML5 app that acts as your very own custom receiver, capable of displaying video content on Cast-enabled devices.

What you'll learn

What you'll need

You can download all the sample code to your computer...

Download Source

and unpack the downloaded zip file.

To use your receiver with a Cast device, it needs to be hosted somewhere your Cast device can reach it. If you already have a server available that supports HTTPS, skip the following instructions, but remember the URL; you'll need it in the next section.

If you don't have a server available, don't fret. You can install Node.js along with the http-server and ngrok modules:

npm install -g http-server
npm install -g ngrok

Run the server

If you're using http-server, go to your console, and do the following:

cd app-done
http-server

You should then see something like the following:

Starting up http-server, serving ./
Available on:
  http://127.0.0.1:8080
  http://172.19.17.192:8080
Hit CTRL-C to stop the server

Note the local port used, then run the following in a new terminal to expose your local receiver over HTTPS using ngrok:

ngrok http 8080


This will set up an ngrok tunnel to your local HTTP server, assigning you a globally available, HTTPS-secured endpoint you can use in the next step (here, https://116ec943.eu.ngrok.io):

ngrok by @inconshreveable                                        (Ctrl+C to quit)

Session Status         online
Version                2.2.4
Web Interface          http://127.0.0.1:4040
Forwarding             http://116ec943.eu.ngrok.io -> localhost:8080
Forwarding             https://116ec943.eu.ngrok.io -> localhost:8080

You should keep both ngrok and http-server running for the duration of the codelab. Any changes you make locally will be instantly available.

You must register your application to be able to run a custom receiver, as built in this codelab, on Chromecast devices. After you've registered your application, you'll receive an application ID that your sender application must use to perform API calls, such as to launch a receiver application.

Click "Add new application"

Select "Custom Receiver", this is what we're building.

Enter the details of your new receiver, be sure to use the URL you ended up with

in the last section. Make a note of the Application ID assigned to your brand new receiver.

You must also register your Google Cast device so that it may access your receiver application before you publish it. Once you publish your receiver application, it will be available to all Google Cast devices. For the purpose of this codelab it's advised to work with an unpublished receiver application.

Click on "Add new Device"

Enter the serial number printed on the back of your Cast device and give it a descriptive name. You can also find your serial number by casting your screen in Chrome when accessing the Google Cast SDK Developer Console.

It will take 5-15 minutes for your receiver and device to be ready for testing. After waiting, you must reboot your Cast device.


While we wait for our new receiver application to be ready for testing, let's see what the completed receiver app looks like. The receiver we're going to build will be capable of playing back media using adaptive bitrate streaming (we'll be using sample content encoded for Dynamic Adaptive Streaming over HTTP (DASH)).

In your browser, visit https://cast-tse-demo.firebaseapp.com/sender/

  1. You should see our sample web sender
  2. Click the Cast button and select your Google Cast device.
  3. Select a video and click the play button.
  4. The video will start playing on your Google Cast device using the finished receiver you're going to build during the next few sections.

We need to add support for Google Cast to the start app you downloaded. Here is some Google Cast terminology we will be using in this codelab: the sender application runs on a mobile device or in the Chrome browser and controls playback; the receiver application runs on the Google Cast device and plays the content.

Now you're ready to build on top of the starter project using your favorite text editor:

  1. Select the app-start directory from your sample code download.
  2. Open up js/receiver.js and index.html.

Note: as you're working through this codelab, http-server should pick up the changes you make. If it doesn't, try killing and restarting http-server.

App Design

The receiver app initializes the Cast session and stands by until a LOAD request (i.e. the command to play back a piece of media) arrives from a sender.

The app consists of one main view, defined in index.html, and one JavaScript file, js/receiver.js, containing all the logic to make our receiver work.

index.html

This HTML file contains all of the UI for our receiver app. For now, it's basically empty.

receiver.js

This script will manage all of the logic for our receiver app. Right now it's just an empty file, but we're going to turn it into a fully functioning Cast receiver with just a few lines of code in the next section.


A basic Cast receiver has to initialize the Cast session on startup. This is necessary to tell all connected sender applications that bringing up the receiver was successful. More than that, the new SDK comes pre-configured to handle adaptive bitrate streaming media (using DASH, HLS and Smooth Streaming) and plain MP4 files out of the box. Let's try this out.

Initialization

Add the following code to index.html, just in front of the <script> tag loading our own js/receiver.js:

<script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>

Add the following code to the <body> of index.html to give the receiver SDK space to bring up the default receiver UI, which ships with the script you just added.

<cast-media-player></cast-media-player>

Now we need to initialize the SDK in js/receiver.js, matching the <cast-media-player> element we just added to index.html. It's just three lines:

// Get the receiver context and its player manager.
const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();

// Start receiving; this initializes the Cast session.
context.start();

For the purpose of this Codelab we have created a sample web sender loaded with some media you can use to try out your brand new receiver.

Point your web browser to https://cast-tse-demo.firebaseapp.com/sender/?appid=<yourAppId>

Be sure to substitute your own App ID, as registered earlier, into the URL so the sender uses your receiver when starting up on the Cast device. Also make sure "Cast MP4s" is checked; you can find the setting in the top right of the sender application.
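For context, the appid parameter is how the sample sender knows which receiver application to launch. In a web sender built with the Cast framework, that typically looks something like the following sketch; this is illustrative only and not the sample sender's actual source, and 'YOUR_APP_ID' is a placeholder for the Application ID you registered:

// Illustrative sender-side initialization (assumed, not this codelab's code).
// This callback fires once the Cast sender library has loaded.
window['__onGCastApiAvailable'] = function(isAvailable) {
  if (isAvailable) {
    cast.framework.CastContext.getInstance().setOptions({
      receiverApplicationId: 'YOUR_APP_ID',  // placeholder for your registered App ID
      autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED
    });
  }
};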

Casting media

At a high level, to play media on a Cast device, the following needs to happen (a sender-side sketch of steps 1 and 3 follows the list):

  1. The Sender creates a MediaInfo JSON object from the Cast SDK that models a media item.
  2. The user connects to the Cast device to launch your receiver application.
  3. The sender loads the MediaInfo object into your receiver using a LOAD request to play the content.
  4. Track the media status.
  5. Send playback commands to the receiver based on user interactions.
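
To make the flow concrete, here is a hedged sender-side sketch of steps 1 and 3; the asset URL, content type, and title are placeholders, and the sample sender's real code may differ:

// Illustrative sender-side sketch (assumed): build a MediaInfo object and
// send it to the receiver in a LOAD request.
const mediaInfo = new chrome.cast.media.MediaInfo(
    'https://example.com/sample/dash.mpd',  // placeholder asset URL
    'application/dash+xml');
mediaInfo.metadata = new chrome.cast.media.GenericMediaMetadata();
mediaInfo.metadata.title = 'Sample title';

const loadRequest = new chrome.cast.media.LoadRequest(mediaInfo);
const castSession =
    cast.framework.CastContext.getInstance().getCurrentSession();

// The receiver plays the content once the LOAD request arrives.
castSession.loadMedia(loadRequest)
    .then(() => console.log('LOAD request sent'))
    .catch(errorCode => console.log('LOAD failed: ' + errorCode));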

In this first basic attempt we're going to populate MediaInfo with a playable asset URL (stored in MediaInfo.contentId). Real-world applications usually just put an application-specific media identifier in there and leave it to the receiver to make appropriate backend API calls to resolve the actual asset URL and handle things such as DRM license acquisition or injecting information about ad breaks.

We're going to extend our receiver to do something just like that in the next section. For now, pick a sample media item from the sender and click the Cast icon; your receiver should start up on the Cast device and play the media item you just selected.

So out of the box, the Receiver SDK handles creating the player, rendering the default playback UI, and responding to core playback messages such as LOAD, PLAY, PAUSE, and SEEK.

Feel free to explore the sample web sender and its code before moving on to the next section, where we're going to extend our receiver to talk to a simple sample API to fulfill incoming LOAD requests from senders.

We have prepared a catalogue of sample media on Firebase (it's all the pieces of content you might have already seen in the sample web sender) that can be queried using its REST API.

In line with how most developers interact with their Cast receivers in real-world applications, we're going to modify our receiver to handle LOAD requests that reference the intended media content by its key in our API instead of sending over a playable asset URL.

Applications usually do this because it lets the receiver resolve the actual asset URL on its own and handle concerns such as DRM license acquisition or ad-break injection, as described in the previous section.

It also showcases the SDK's hooks for customizing common receiver tasks while still relying on a mostly out-of-the-box experience.

Sample API

Point your browser to https://tse-summit.firebaseio.com/content.json and have a look at our sample video catalog:

{
  "bbb": {
    "author": "The Blender Project",
    "description": "Grumpy Bunny is grumpy",
    "poster": "https://[...]/[...]/bbb/poster.png",
    "prog": "https://[...]/[...]/bbb/bbb-prog.mp4",
    "stream": "https://[...]/[...]/bbb/dash.mpd",
    "title": "Big Buck Bunny"
  },
  "chromecast_ad": {
    "author": "Google Inc.",
    "description": "Chromecast: Love it a lot!",
    "poster": "https://[...]/[...]/chromecast_ad/poster.png",
    "prog": "https://[...]/[...]/chromecast_ad/chromecast_ad-prog.mp4",
    "stream": "https://[...]/[...]/chromecast_ad/dash.mpd",
    "title": "Chromecast - Love it a lot"
  },
  "chromecast_ultra_ad": {
    "author": "Google Inc.",
    "description": "Introducing the latest addition to the [...]",
    "poster": "https://[...]/[...]/chromecast_ad/poster.png",
    "prog": "https://[...]/[...]/chromecast_ad/chromecast__ad-prog.mp4",
    "stream": "https://[...]/[...]/chromecast_ad/dash.mpd",
    "title": "Introducing Chromecast Ultra"
  },
  "io_highlights": {
    "author": "Google Inc.",
    "description": "I/O 2016 was a blast. [...]",
    "poster": "https://[...]/[...]/io_highlights/poster.png",
    "prog": "https://[...].com/[...]/io_highlights/io_highlights-prog.mp4",
    "stream": "[...]/[...]/io_highlights/dash.mpd",
    "title": "I/O 2016 Highlights"
  },
  "multiuser": {
    "author": "Google Inc.",
    "description": "Google Home launches another long awaited feature",
    "poster": "https://[...]/[...]/multiuser/poster.png",
    "prog": "https://[...]/[...]/multiuser/multiuser-prog.mp4",
    "stream": "https://[...]/[...]/multiuser/dash.mpd",
    "title": "Google Home Multiuser"
  }
}

As a next step we're going to map each entry's key (bbb, io_highlights, etc.) to its stream URL when the receiver gets called with a LOAD request.

Intercept the LOAD request

Add the following to your js/receiver.js file just before the call to context.start();

function makeRequest (method, url) {
  return new Promise(function (resolve, reject) {
    var xhr = new XMLHttpRequest();
    xhr.open(method, url);
    xhr.onload = function () {
      if (this.status >= 200 && this.status < 300) {
        resolve(JSON.parse(xhr.response));
      } else {
        reject({
          status: this.status,
          statusText: xhr.statusText
        });
      }
    };
    xhr.onerror = function () {
      reject({
        status: this.status,
        statusText: xhr.statusText
      });
    };
    xhr.send();
  });
}

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      return new Promise((resolve, reject) => {
        // Fetch content repository by requested contentId
        makeRequest('GET', 'https://tse-summit.firebaseio.com/content.json?orderBy=%22$key%22&equalTo=%22'+ request.media.contentId + '%22').then(function (data) {
          var item = data[request.media.contentId];
          if(!item) {
            // Content could not be found in repository
            reject();
          } else {
            // Adjusting request to make requested content playable
            request.media.contentId = item.stream;
            request.media.contentType = 'application/dash+xml';

            // Add metadata
            var metadata = new cast.framework.messages.GenericMediaMetadata();
            metadata.title = item.title;
            metadata.subtitle = item.author;

            request.media.metadata = metadata;

            // Resolve request
            resolve(request);
          }
        });
      });
    });

We have provided makeRequest as a convenience method, wrapping a native XHR request inside a Promise.

The interesting part is setMessageInterceptor(); it enables you to intercept incoming messages by type and modify them before they reach the SDK's internal message handler. As seen in the previous section, the SDK is able to handle playback of streamable URLs on its own.

Here we're using our interceptor to look up the requested contentId in our sample content API, swap in the playable stream URL, and attach title and author metadata before handing the request back to the SDK.

Save your modifications and get ready to try your modified receiver.

Testing it out

Again point your browser to https://cast-tse-demo.firebaseapp.com/sender/?appid=<yourAppId>

This time, make sure to disable the "Cast MP4s" option in the top right corner. This will make our sample application send a LOAD request containing only the reference to our media item.

Assuming everything worked fine with your modifications to the receiver, our interceptor should now take care of shaping the MediaInfo object into something the SDK can play on the screen.

The Cast SDK provides another option for debugging your receiver app: the CastDebugLogger API and a companion tool to capture logs.

Initialization

Add the caf_receiver_logger.js to index.html right after the cast_receiver_framework.js library.

<script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
<!-- Cast Debug Logger -->
<script src="//www.gstatic.com/cast/sdk/libs/devtools/debug_layer/caf_receiver_logger.js"></script>

In js/receiver.js, get the CastDebugLogger instance and enable the logger.

const castDebugLogger = cast.debug.CastDebugLogger.getInstance();

// Enable debug logger and show a warning on receiver
// NOTE: make sure it is disabled on production
castDebugLogger.setEnabled(true);

When the debug logger is enabled, you will see an overlay with the text 'DEBUG MODE' on the receiver.

Receiver Debug Overlay

The Cast SDK provides a debug overlay on the receiver to show your custom log messages. Use showDebugLogs to toggle the debug overlay and clearDebugLogs to clear the log messages.

// Show debug overlay
castDebugLogger.showDebugLogs(true);

// Clear log messages on debug overlay
castDebugLogger.clearDebugLogs();

Log Messages and Custom Tags

The CastDebugLogger class allows you to create log messages that appear on the receiver debug overlay in different colors. Use the following log methods, listed in order from highest to lowest priority: castDebugLogger.error, castDebugLogger.warn, castDebugLogger.info, and castDebugLogger.debug.

For each log method, the first parameter is a tag and the second is the message. The tag can be any string that you find helpful. For example, you might create an info log message in addEventListener to investigate core events:

playerManager.addEventListener(
    cast.framework.events.category.CORE, 
    event => {
        castDebugLogger.info('ANALYTICS', event);
    });

Here is another debug logger example in the LOAD interceptor:

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      castDebugLogger.info('MyAPP.LOG', 'Intercepting LOAD request');

      return new Promise((resolve, reject) => {
        // Fetch content repository by requested contentId
        makeRequest('GET', URL).then(function (data) {
          var item = data[request.media.contentId];
          if(!item) {
            // Content could not be found in repository
            castDebugLogger.error('MyAPP.LOG', 'Content not found');

            reject();
          } else {
            // Adjusting request to make requested content playable
            request.media.contentId = item.stream;
            castDebugLogger.warn('MyAPP.LOG', 'Playable URL: ', request.media.contentId);

            .......

            // Resolve request
            resolve(request);
          }
        });
      });
    });

You can control which messages appear on the receiver debug overlay by setting the log level in loggerLevelByTags for each custom tag. For example, setting a custom tag to cast.framework.LoggerLevel.DEBUG displays all error, warn, info, and debug messages for that tag, while setting it to WARNING displays only error and warn messages.

// Set verbosity level for custom tags
castDebugLogger.loggerLevelByTags = {
    'MyAPP.LOG': cast.framework.LoggerLevel.WARNING,
    'ANALYTICS': cast.framework.LoggerLevel.INFO,
};

Cast Logging Tool

The Cast Logging Tool helps to capture your logs and control the debug overlay.

  1. Open the logging tool at https://cast-tse-demo.firebaseapp.com/receiver-debug-tool, set your AppId, and click the Cast button to cast your receiver.
  2. Connect to our sample web sender at https://cast-tse-demo.firebaseapp.com/sender/?appid=<yourAppId>; the web sender should be Cast-connected automatically.
  3. Play a video and you will see the log messages printed in the tool.
  4. Click the "SHOW" button to see the debug overlay on the receiver.
  5. Click the "MEDIA INFO" or "MEDIA SESSION" tab to see the media status.

Smart displays are devices with touch functionality, which allows receiver applications to support touch-enabled controls.

This section explains how to optimize your receiver application when launched on smart displays and how to customize the player controls.

Accessing UI Controls

The UI Controls object for Smart Displays can be accessed with the following code:

// Get the touch controls instance.
const touchControls = cast.framework.ui.Controls.getInstance();

// Opt in to the touch-optimized UI when starting the receiver context.
context.start({ touchScreenOptimizedApp: true });

Default buttons are assigned to each slot based on MetadataType.

Video Controls

For MetadataType.MOVIE, MetadataType.TV_SHOW, and MetadataType.GENERIC, the UI Controls object for Smart Displays will be displayed as below:

  1. --playback-logo-image
  2. MediaMetadata.subtitle
  3. MediaMetadata.title
  4. MediaStatus.currentTime
  5. MediaInformation.duration
  6. ControlsSlot.SLOT_1: ControlsButton.QUEUE_PREV
  7. ControlsSlot.SLOT_2: ControlsButton.SEEK_BACKWARD_30
  8. PLAY/PAUSE
  9. ControlsSlot.SLOT_3: ControlsButton.SEEK_FORWARD_30
  10. ControlsSlot.SLOT_4: ControlsButton.QUEUE_NEXT

Audio Controls

For MetadataType.MUSIC_TRACK, the UI Controls object for Smart Displays will be displayed as below (a short metadata sketch follows the list):

  1. --playback-logo-image
  2. MusicTrackMediaMetadata.albumName
  3. MusicTrackMediaMetadata.title
  4. MusicTrackMediaMetadata.albumArtist
  5. MusicTrackMediaMetadata.images[0]
  6. MediaStatus.currentTime
  7. MediaInformation.duration
  8. ControlsSlot.SLOT_1: ControlsButton.NO_BUTTON
  9. ControlsSlot.SLOT_2: ControlsButton.QUEUE_PREV
  10. PLAY/PAUSE
  11. ControlsSlot.SLOT_3: ControlsButton.QUEUE_NEXT
  12. ControlsSlot.SLOT_4: ControlsButton.NO_BUTTON
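
Our sample catalog only contains video, but if your receiver served music you would populate the fields above in the LOAD interceptor, roughly along the lines of this sketch (all values are placeholders):

// Sketch (assumed): attach music metadata inside the LOAD interceptor so the
// audio controls listed above have data to display. Values are placeholders.
const musicMetadata = new cast.framework.messages.MusicTrackMediaMetadata();
musicMetadata.title = 'Track title';
musicMetadata.albumName = 'Album name';
musicMetadata.albumArtist = 'Album artist';
musicMetadata.images = [
  new cast.framework.messages.Image('https://example.com/album-art.png')
];
request.media.metadata = musicMetadata;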

Updating Supported Media Commands

The UI Controls object also decides if a ControlsButton is shown or not based on MediaStatus.supportedMediaCommands.

When the value of supportedMediaCommands equals ALL_BASIC_MEDIA, the default control layout will display as below:

When the value of supportedMediaCommands equals ALL_BASIC_MEDIA | QUEUE_PREV | QUEUE_NEXT, the default control layout will display as below:

When the value of supportedMediaCommands equals PAUSE | QUEUE_PREV | QUEUE_NEXT, the default control layout will display as below:

When text tracks are available, the closed caption button is always shown in SLOT_1.

To dynamically change the value of supportedMediaCommands after starting a receiver context, you can call PlayerManager.setSupportedMediaCommands to override the value. You can also add a new command using addSupportedMediaCommands or remove an existing command using removeSupportedMediaCommands.
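
For example, here is a short sketch using the playerManager from earlier; the command names come from the cast.framework.messages.Command enum:

// Advertise queue navigation on top of the basic command set so the
// QUEUE_PREV / QUEUE_NEXT buttons can be shown.
const Command = cast.framework.messages.Command;
playerManager.setSupportedMediaCommands(
    Command.ALL_BASIC_MEDIA | Command.QUEUE_PREV | Command.QUEUE_NEXT);

// Commands can also be added or removed individually later on.
playerManager.addSupportedMediaCommands(Command.QUEUE_NEXT);
playerManager.removeSupportedMediaCommands(Command.QUEUE_PREV);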

Customizing Control Buttons

You can also change the buttons assigned to each slot by using PlayerDataBinder.

const playerData = new cast.framework.ui.PlayerData();
const playerDataBinder = new cast.framework.ui.PlayerDataBinder(playerData);
const touchControls = cast.framework.ui.Controls.getInstance();

playerDataBinder.addEventListener(
  cast.framework.ui.PlayerDataEventType.MEDIA_CHANGED,
  (e) => {
    if (!e.value) return;

    // Clear default buttons and re-assign
    touchControls.clearDefaultSlotAssignments();
    touchControls.assignButton(
      cast.framework.ui.ControlsSlot.SLOT_1,
      cast.framework.ui.ControlsButton.SEEK_BACKWARD_30
    );
  });

BrowseContent

Use BrowseContent to customize the title of the Media Browse UI and update items:

  1. BrowseContent.title
  2. BrowseContent.items

BrowseItem

Use BrowseItem to display title, subtitle, duration, and image for each item:

  1. BrowseItem.image
  2. BrowseItem.duration
  3. BrowseItem.title
  4. BrowseItem.subtitle

Aspect Ratio

Use targetAspectRatio to select the best aspect ratio for your image assets. Three aspect ratios are supported by the CAF Receiver SDK: SQUARE_1_TO_1, PORTRAIT_2_TO_3, LANDSCAPE_16_TO_9.

Set Media Browse data

Provide a list of media contents for browsing by calling setBrowseContent:

const touchControls = cast.framework.ui.Controls.getInstance();
let browseItems = getBrowseItems();

function getBrowseItems() {
  // Note: this array is returned immediately and populated asynchronously
  // once the catalog request completes.
  let browseItems = [];
  makeRequest('GET', 'https://tse-summit.firebaseio.com/content.json')
  .then(function (data) {
    for (let key in data) {
      let item = new cast.framework.ui.BrowseItem();
      item.entity = key;
      item.title = data[key].title;
      item.subtitle = data[key].description;
      item.image = new cast.framework.messages.Image(data[key].poster);
      item.imageType = cast.framework.ui.BrowseImageType.MOVIE;
      browseItems.push(item);
    }
  });
  return browseItems;
}

let browseContent = new cast.framework.ui.BrowseContent();
browseContent.title = 'Up Next';
browseContent.items = browseItems;
browseContent.targetAspectRatio =
  cast.framework.ui.BrowseImageAspectRatio.LANDSCAPE_16_TO_9;

playerDataBinder.addEventListener(
  cast.framework.ui.PlayerDataEventType.MEDIA_CHANGED,
  (e) => {
    if (!e.value) return;

    .......

    // Media browse
    touchControls.setBrowseContent(browseContent);
  });

Clicking a media browse item will trigger the LOAD interceptor. Add the following code to map request.media.entity from the media browse item to request.media.contentId:

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      ......

      if (request.media && request.media.entity) {
        request.media.contentId = request.media.entity;
      }

      return new Promise((resolve, reject) => {......});
    });

You can also set the BrowseContent to null to remove the Media Browse UI.
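
A minimal sketch, assuming the touchControls instance from above:

// Remove the Media Browse UI, e.g. when there is nothing left to suggest.
touchControls.setBrowseContent(null);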

You now know how to create a custom receiver application using the latest Cast Receiver SDK.

Take a look at our sample apps on GitHub: github.com/googlecast.