
This codelab will teach you how to build a Cast-enabled Receiver app to play content on a Google Cast device.

What is Google Cast?

Google Cast allows users to cast content from a mobile device to a TV. Users can then use their mobile device or desktop Chrome Browser as a remote control for media playback on the TV.

The Google Cast SDK allows your app to control Google Cast enabled devices (e.g. a TV or sound system). The Cast SDK provides you with the necessary UI components based on the Google Cast Design Checklist.

The Google Cast Design Checklist is provided to make the Cast user experience simple and predictable across all supported platforms.

What are we going to be building?

When you have completed this codelab, you will have an HTML5 app that acts as your very own custom receiver capable of displaying video content on Cast-enabled devices.

What you'll learn

What you'll need


You can download all the sample code to your computer...

Download Source

and unpack the downloaded zip file.

To be able to use your receiver with a Cast device, it needs to be hosted somewhere your Cast device can reach it. If you already have a server available that supports HTTPS, skip the following instructions and make a note of the URL; you'll need it in the next section.

If you don't have a server available, don't fret. You can install Node.js and then the http-server and ngrok node modules:

npm install -g http-server
npm install -g ngrok

Run the server

If you're using http-server, go to your console, and do the following:

cd app-start
http-server

You should then see something like the following:

Starting up http-server, serving ./
Available on:
  http://127.0.0.1:8080
  http://172.19.17.192:8080
Hit CTRL-C to stop the server

Note the local port being used, then run the following in a new terminal to expose your local receiver over HTTPS using ngrok:

ngrok http 8080


This will set up an ngrok tunnel to your local HTTP server, assigning you a globally available, HTTPS-secured endpoint that you can use in the next step (https://116ec943.eu.ngrok.io):

ngrok by @inconshreveable                                                                                                                                                                                                                                     (Ctrl+C to quit)

Session Status         online
Version                2.2.4
Web Interface          http://127.0.0.1:4040
Forwarding             http://116ec943.eu.ngrok.io -> localhost:8080
Forwarding             https://116ec943.eu.ngrok.io -> localhost:8080

You should keep both ngrok and http-server running for the duration of the codelab. Any changes you make locally will be instantly available.

You must register your application to be able to run a custom receiver, as built in this codelab, on Chromecast devices. After you've registered your application, you'll receive an application ID that your sender application must use to perform API calls, such as to launch a receiver application.

Click "Add new application"

Select "Custom Receiver," this is what we're building.

Enter the details of your new receiver. Be sure to use the URL you ended up with in the last section. Make a note of the Application ID assigned to your brand new receiver.

You must also register your Google Cast device so that it may access your receiver application before you publish it. Once you publish your receiver application, it will be available to all Google Cast devices. For the purpose of this codelab it's advised to work with an unpublished receiver application.

Click on "Add new Device"

Enter the serial number printed on the back of your Cast device and give it a descriptive name. You can also find your serial number by casting your screen in Chrome while accessing the Google Cast SDK Developer Console.

It will take 5-15 minutes before your receiver and device are ready for testing. After the wait, you must reboot your Cast device.


While we wait for our new receiver application to be ready for testing, let's see what a sample completed receiver app looks like. The receiver we're going to build will be capable of playing back media using adaptive bitrate streaming (we'll be using sample content encoded for Dynamic Adaptive Streaming over HTTP (DASH)).

In your browser, open the Command and Control (CaC) Tool.

  1. You should see our CaC Tool.
  2. Use the default "CC1AD845" sample receiver ID and click the "Set App ID" button.
  3. Click the Cast button at the top left and select your Google Cast device.
  4. Navigate to the "Load Media" tab at the top.
  5. Click the "Load by Content" button to play a sample video.
  6. The video will start playing on your Google Cast device to show what basic receiver functionality looks like using the Default Receiver.

We need to add support for Google Cast to the start app you downloaded. Here is some Google Cast terminology that we will be using in this codelab:

Now you're ready to build on top of the starter project using your favorite text editor:

  1. Select the app-start directory from your sample code download.
  2. Open js/receiver.js and index.html.

Note: as you're working through this codelab, http-server should pick up the changes you make. If you notice it doesn't, try killing and restarting http-server.

App Design

The receiver app initializes the Cast session and will stand by until a LOAD request (i.e., the command to play back a piece of media) arrives from a sender.

The app consists of one main view, defined in index.html, and one JavaScript file, js/receiver.js, containing all the logic to make our receiver work.

index.html

This HTML file will contain the UI for our receiver app. For now it's empty, and we'll be adding to it throughout the codelab.

receiver.js

This script will manage all of the logic for our receiver app. Right now it's just an empty file, but we're going to turn it into a fully functioning Cast receiver with just a few lines of code in the next section.

A basic Cast receiver will initialize the Cast session on startup. This is necessary to tell all connected sender applications that bringing up the receiver was successful. In addition, the new SDK comes pre-configured to handle adaptive bitrate streaming media (using DASH, HLS and Smooth Streaming) and plain MP4 files out of the box. Let's try this out.

Initialization

Add the following code to index.html in the header:

<head>
  ...

  <script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
</head>

Add the following code to the index.html <body>, before the <footer> that loads receiver.js, to provide the receiver SDK with space to bring up the default receiver UI, which ships with the script you just added.

<cast-media-player></cast-media-player>

Now, we need to initialize the SDK in js/receiver.js, consisting of:

  1. Acquiring a reference to the CastReceiverContext, your primary entry point to the whole Receiver SDK.
  2. Storing a reference to the PlayerManager, the object handling playback and providing the hooks you need to plug in your own custom logic.
  3. Initializing the SDK by calling start() on the CastReceiverContext.

Add the following to js/receiver.js.

const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();

context.start();

For the purpose of this codelab, use the CaC Tool to try out your brand-new receiver.

Point your web browser to the Command and Control (CaC) Tool.

Be sure to enter your own App ID, as registered earlier, in the field and click "Set App ID". This instructs the tool to use your receiver when starting the Cast session.

Casting media

At a high level, to play media on a Cast device the following needs to happen:

  1. The sender creates a MediaInfo JSON object from the Cast SDK that models a media item.
  2. The sender connects to the Cast device to launch the receiver application.
  3. The receiver loads the MediaInfo object through a LOAD request to play the content.
  4. The receiver monitors and tracks the media status.
  5. The sender sends playback commands to the receiver to control playback based on user interactions with the sender app.

In this first basic attempt we're going to populate MediaInfo with a playable asset URL (stored in MediaInfo.contentUrl).

A real-world sender uses an application-specific media identifier in MediaInfo.contentId. The receiver uses the contentId as an identifier to make appropriate backend API calls to resolve the actual asset URL and set it to MediaInfo.contentUrl. The receiver will also handle tasks such as DRM license acquisition or injecting information about ad breaks.
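As an illustration of this flow, a Web Sender might issue such a LOAD request roughly as follows. This is only a sketch using the Web Sender framework API and is not part of the codelab's code; the key 'bbb' comes from the sample catalog used later in this codelab, and the content type is a placeholder since the receiver will override it:

// Sketch: a sender passes an application-specific identifier instead of a URL.
// Assumes a Cast session has already been established via the Cast framework.
const castSession = cast.framework.CastContext.getInstance().getCurrentSession();

// 'bbb' is a key from the sample content catalog; the receiver resolves it.
const mediaInfo = new chrome.cast.media.MediaInfo('bbb', 'video/mp4');
const loadRequest = new chrome.cast.media.LoadRequest(mediaInfo);

castSession.loadMedia(loadRequest).then(
    () => { console.log('Load succeeded'); },
    (errorCode) => { console.log('Load failed: ' + errorCode); });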

We're going to extend your receiver to do something just like that in the next section. For now, click on the Cast icon and select your device to open your receiver.

Navigate to the "Load Media" tab and click the "Load by Content'' button. Your receiver should start playing the sample content.

So out-of-the-box the Receiver SDK handles:

Feel free to explore the CaC Tool and its code before moving on to the next section, where we're going to extend our receiver to talk to a simple sample API to fulfill incoming LOAD requests from senders.

In line with how most developers interact with their Cast Receivers in real-world applications, we're going to modify our receiver to handle LOAD requests that reference the intended media content by its API key instead of sending over a playable asset URL.

Applications typically do this because:

This functionality is primarily implemented in the PlayerManager setMessageInterceptor() method. This enables you to intercept incoming messages by type and modify them before they reach the SDK's internal message handler. In this section we are dealing with LOAD requests, where we will do the following:

  1. Read the contentId from the incoming request.
  2. Make a GET request to the sample API to look up the content entry for that contentId.
  3. Modify the request with the content's metadata (and, later, its stream URL) before resolving it back to the SDK.

The provided sample API showcases the SDK's hooks for customizing common receiver tasks, while still relying on a mostly out-of-the-box experience.
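In skeleton form, the interception pattern looks roughly like this (a minimal sketch only; the full implementation is built out step by step below):

// Minimal shape of a LOAD message interceptor: inspect or modify the
// incoming request, then return it (or a Promise resolving to it).
playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    loadRequestData => {
      // ...resolve the contentId to a playable URL, attach metadata, etc...
      return loadRequestData;
    });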

Sample API

Point your browser to https://storage.googleapis.com/cpe-sample-media/content.json and have a look at our sample video catalog. The content includes URLs for poster images in png format as well as both DASH and HLS streams. The DASH and HLS streams point to demuxed video and audio sources stored in fragmented mp4 containers.

{
  "bbb": {
    "author": "The Blender Project",
    "description": "Grumpy Bunny is grumpy",
    "poster": "https://[...]/[...]/BigBuckBunny/images/screenshot1.png",
    "stream": {
      "dash": "https://[...]/[...]/BigBuckBunny/BigBuckBunny_master.mpd",
      "hls": "https://[...]/[...]/BigBuckBunny/BigBuckBunny_master.m3u8"
    },
    "title": "Big Buck Bunny"
  },
  "fbb_ad": {
    "author": "Google Inc.",
    "description": "Introducing Chromecast. The easiest way to enjoy [...]",
    "poster": "https://[...]/[...]/ForBiggerBlazes/images/screenshot8.png",
    "stream": {
      "dash": "https://[...]/[...]/ForBiggerBlazes/ForBiggerBlazes.mpd",
      "hls": "https://[...]/[...]/ForBiggerBlazes/ForBiggerBlazes.m3u8"
    },
    "title": "For Bigger Blazes"
  },

  [...]

}

In the next step, we'll map each entry's key (e.g., bbb, fbb_ad) to the stream's URL after the receiver is called with a LOAD request.

Intercept the LOAD request

In this step we will be creating a load interceptor with a function that makes an XHR request to the hosted JSON file. Once the JSON file is obtained we will parse the content and set the metadata. In the following sections we will customize the MediaInformation parameters to specify the content type.

Add the following code to your js/receiver.js file, just before the call to context.start().

function makeRequest (method, url) {
  return new Promise(function (resolve, reject) {
    let xhr = new XMLHttpRequest();
    xhr.open(method, url);
    xhr.onload = function () {
      if (this.status >= 200 && this.status < 300) {
        resolve(JSON.parse(xhr.response));
      } else {
        reject({
          status: this.status,
          statusText: xhr.statusText
        });
      }
    };
    xhr.onerror = function () {
      reject({
        status: this.status,
        statusText: xhr.statusText
      });
    };
    xhr.send();
  });
}

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      return new Promise((resolve, reject) => {
        // Fetch content repository by requested contentId
        makeRequest('GET', 'https://storage.googleapis.com/cpe-sample-media/content.json').then(function (data) {
          let item = data[request.media.contentId];
          if(!item) {
            // Content could not be found in repository
            reject();
          } else {
            // Add metadata
            let metadata = new cast.framework.messages.GenericMediaMetadata();
            metadata.title = item.title;
            metadata.subtitle = item.author;

            request.media.metadata = metadata;

            // Resolve request
            resolve(request);
          }
        });
      });
    });

The next section will outline how to configure the load request's media property for DASH content.

Using the Sample API DASH Content

Now that we have prepared the load interceptor, we will specify the content type to the receiver. This information will provide the receiver with the master playlist URL and the stream MIME type. Add the following code to the js/receiver.js file in the LOAD interceptor's Promise():

...
playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      return new Promise((resolve, reject) => {
          ...
          } else {
            // Adjusting request to make requested content playable
            request.media.contentUrl = item.stream.dash;
            request.media.contentType = 'application/dash+xml';
            ...
          }
        });
      });
    });

Once you complete this step, you can proceed to Testing It Out to try loading DASH content. If you would like to test loading HLS content instead, check out the next step.

Using the Sample API HLS Content

The sample API includes HLS content as well as DASH. In addition to setting the contentType as we did in the previous step, the load request needs some additional properties in order to use the sample API's HLS URLs. When the receiver is configured to play HLS streams, the default container type it expects is transport stream (TS). As a result, if only the contentUrl property is modified, the receiver will try to open the sample MP4 streams in TS format. The MediaInformation object in the load request should therefore be modified with additional properties so that the receiver knows the content is MP4, not TS. Add the following code to your js/receiver.js file in the load interceptor to modify the contentUrl and contentType properties and, additionally, to set the hlsSegmentFormat and hlsVideoSegmentFormat properties.

...
playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      return new Promise((resolve, reject) => {
          ...
          } else {
            // Adjusting request to make requested content playable
            request.media.contentUrl = item.stream.hls;
            request.media.contentType = 'application/x-mpegurl';
            request.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.FMP4;
            request.media.hlsVideoSegmentFormat = cast.framework.messages.HlsVideoSegmentFormat.FMP4;
            ...
          }
        });
      });
    });

Testing It Out

Again, open the Command and Control (CaC) Tool and set your App ID to your receiver's App ID. Select your device using the Cast button.

Navigate to the "Load Media" tab. This time delete the text in the "Content URL" field beside the "Load by Content" button, which will force our application to send a LOAD request containing only the contentId reference to our media.

Assuming everything worked fine with your modifications to the receiver, the interceptor should take care of shaping the MediaInfo object into something the SDK can play on the screen.

Click the "Load by Content" button to see if your media plays properly. Feel free to change the Content ID to another ID in the content.json file.

Smart displays are devices with touch functionality that allow receiver applications to support touch-enabled controls.

This section explains how to optimize your receiver application when launched on smart displays, and how to customize the player controls.

Accessing UI Controls

The UI Controls object for Smart Displays can be accessed using cast.framework.ui.Controls.getInstance(). Add the following code to your js/receiver.js file above context.start():

...

// Optimizing for smart displays 
const touchControls = cast.framework.ui.Controls.getInstance();

context.start();

If you don't use the <cast-media-player> element, you will need to set touchScreenOptimizedApp in CastReceiverOptions. In this codelab we are using the <cast-media-player> element.

context.start({ touchScreenOptimizedApp: true });

Default control buttons are assigned to each slot based on MetadataType and MediaStatus.supportedMediaCommands.

Video Controls

For MetadataType.MOVIE, MetadataType.TV_SHOW, and MetadataType.GENERIC, the UI Controls object for Smart Displays will be displayed as in the example below.

  1. --playback-logo-image
  2. MediaMetadata.subtitle
  3. MediaMetadata.title
  4. MediaStatus.currentTime
  5. MediaInformation.duration
  6. ControlsSlot.SLOT_SECONDARY_1: ControlsButton.QUEUE_PREV
  7. ControlsSlot.SLOT_PRIMARY_1: ControlsButton.SEEK_BACKWARD_30
  8. PLAY/PAUSE
  9. ControlsSlot.SLOT_PRIMARY_2: ControlsButton.SEEK_FORWARD_30
  10. ControlsSlot.SLOT_SECONDARY_2: ControlsButton.QUEUE_NEXT

Audio Controls

For MetadataType.MUSIC_TRACK, the UI Controls object for Smart Displays will be displayed as below:

  1. --playback-logo-image
  2. MusicTrackMediaMetadata.albumName
  3. MusicTrackMediaMetadata.title
  4. MusicTrackMediaMetadata.albumArtist
  5. MusicTrackMediaMetadata.images[0]
  6. MediaStatus.currentTime
  7. MediaInformation.duration
  8. ControlsSlot.SLOT_SECONDARY_1: ControlsButton.NO_BUTTON
  9. ControlsSlot.SLOT_PRIMARY_1: ControlsButton.QUEUE_PREV
  10. PLAY/PAUSE
  11. ControlsSlot.SLOT_PRIMARY_2: ControlsButton.QUEUE_NEXT
  12. ControlsSlot.SLOT_SECONDARY_2: ControlsButton.NO_BUTTON

Updating Supported Media Commands

The UI Controls object also determines if a ControlsButton is shown or not based on MediaStatus.supportedMediaCommands.

When the value of supportedMediaCommands is equal to ALL_BASIC_MEDIA, the default control layout will display as below:

When the value of supportedMediaCommands is equal to ALL_BASIC_MEDIA | QUEUE_PREV | QUEUE_NEXT, the default control layout will display as below:

When the value of supportedMediaCommands equals PAUSE | QUEUE_PREV | QUEUE_NEXT, the default control layout will display as below:

When text tracks are available, the closed caption button will always be shown at SLOT_1.

To dynamically change the value of supportedMediaCommands after starting a receiver context, you can call PlayerManager.setSupportedMediaCommands to override the value. Also, you can add a new command by using addSupportedMediaCommands or remove an existing command by using removeSupportedMediaCommands.
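For example, a receiver could adjust its advertised commands at runtime along these lines (a brief sketch; which commands you expose depends entirely on your app):

const Command = cast.framework.messages.Command;

// Advertise the basic media commands plus queue navigation.
playerManager.setSupportedMediaCommands(
    Command.ALL_BASIC_MEDIA | Command.QUEUE_NEXT | Command.QUEUE_PREV);

// Later, add or remove individual commands as the playback state changes.
playerManager.addSupportedMediaCommands(Command.SEEK);
playerManager.removeSupportedMediaCommands(Command.QUEUE_PREV);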

Customizing Control Buttons

You can customize the controls by using PlayerDataBinder. Add the following code to your js/receiver.js file below the touchControls to set the first slot of your controls:

...

// Optimizing for smart displays 
const touchControls = cast.framework.ui.Controls.getInstance();
const playerData = new cast.framework.ui.PlayerData();
const playerDataBinder = new cast.framework.ui.PlayerDataBinder(playerData);

playerDataBinder.addEventListener(
  cast.framework.ui.PlayerDataEventType.MEDIA_CHANGED,
  (e) => {
    if (!e.value) return;

    // Clear default buttons and re-assign
    touchControls.clearDefaultSlotAssignments();
    touchControls.assignButton(
      cast.framework.ui.ControlsSlot.SLOT_PRIMARY_1,
      cast.framework.ui.ControlsButton.SEEK_BACKWARD_30
    );
  });

context.start();

Media Browse is a CAF Receiver feature that allows users to explore additional content on touch devices. In order to implement this you will use PlayerDataBinder to set the BrowseContent UI. You can then populate it with BrowseItems based on the content you want to display.

BrowseContent

Below is an example of the BrowseContent UI and its properties:

  1. BrowseContent.title
  2. BrowseContent.items

Aspect Ratio

Use the targetAspectRatio property to select the best aspect ratio for your image assets. Three aspect ratios are supported by the CAF Receiver SDK: SQUARE_1_TO_1, PORTRAIT_2_TO_3, LANDSCAPE_16_TO_9.

BrowseItem

Use BrowseItem to display title, subtitle, duration, and image for each item:

  1. BrowseItem.image
  2. BrowseItem.duration
  3. BrowseItem.title
  4. BrowseItem.subtitle

Set Media Browse data

You can provide a list of media content for browsing by calling setBrowseContent. Add the following code to your js/receiver.js file below your playerDataBinder and in your MEDIA_CHANGED event listener to set the browse items with a title of "Up Next".

// Optimizing for smart displays
const touchControls = cast.framework.ui.Controls.getInstance();
const playerData = new cast.framework.ui.PlayerData();
const playerDataBinder = new cast.framework.ui.PlayerDataBinder(playerData);

...

let browseItems = getBrowseItems();

function getBrowseItems() {
  let browseItems = [];
  makeRequest('GET', 'https://storage.googleapis.com/cpe-sample-media/content.json')
  .then(function (data) {
    for (let key in data) {
      let item = new cast.framework.ui.BrowseItem();
      item.entity = key;
      item.title = data[key].title;
      item.subtitle = data[key].description;
      item.image = new cast.framework.messages.Image(data[key].poster);
      item.imageType = cast.framework.ui.BrowseImageType.MOVIE;
      browseItems.push(item);
    }
  });
  return browseItems;
}

let browseContent = new cast.framework.ui.BrowseContent();
browseContent.title = 'Up Next';
browseContent.items = browseItems;
browseContent.targetAspectRatio = cast.framework.ui.BrowseImageAspectRatio.LANDSCAPE_16_TO_9;

playerDataBinder.addEventListener(
  cast.framework.ui.PlayerDataEventType.MEDIA_CHANGED,
  (e) => {
    if (!e.value) return;

    ...

    // Media browse
    touchControls.setBrowseContent(browseContent);
  });

Clicking a media browse item will trigger the LOAD interceptor. Add the following code to your LOAD interceptor to map the media browse item's request.media.entity to request.media.contentId:

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.LOAD,
    request => {
      ...

      // Map contentId to entity
      if (request.media && request.media.entity) {
        request.media.contentId = request.media.entity;
      }

      return new Promise((resolve, reject) => {
            ... 
        });
    });

You can also set the BrowseContent object to null to remove the Media Browse UI.
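For instance, a minimal sketch:

// Hide the Media Browse UI again by clearing the browse content.
touchControls.setBrowseContent(null);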

The Cast Receiver SDK provides another option for developers to easily debug receiver apps by using the CastDebugLogger API and a companion Command and Control (CaC) Tool to capture logs.

Initialization

To incorporate the API, add the CastDebugLogger source script to your index.html file. The source should be declared in the <head> tag, after the Cast Receiver SDK declaration.

<head>
  ...
  <script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
  <!-- Cast Debug Logger -->
  <script src="//www.gstatic.com/cast/sdk/libs/devtools/debug_layer/caf_receiver_logger.js"></script>
</head>

In js/receiver.js, at the top of the file and below the playerManager declaration, add the following code to retrieve the CastDebugLogger instance and enable the logger:

const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();

// Debug Logger
const castDebugLogger = cast.debug.CastDebugLogger.getInstance();
const LOG_TAG = 'MyAPP.LOG';

// Enable debug logger and show a 'DEBUG MODE' overlay at top left corner.
castDebugLogger.setEnabled(true);

When the debug logger is enabled, an overlay displaying DEBUG MODE will show on the receiver.

Log Player Events

Using CastDebugLogger you can easily log player events that are fired by the CAF Receiver SDK and use different logger levels to log the event data. The loggerLevelByEvents config uses cast.framework.events.EventType and cast.framework.events.category to specify which events will be logged.

Add the following code below the castDebugLogger declaration to log when a player CORE event is triggered or a mediaStatus change is broadcast:

// Debug Logger
const castDebugLogger = cast.debug.CastDebugLogger.getInstance();

// Enable debug logger and show a 'DEBUG MODE' overlay at top left corner.
castDebugLogger.setEnabled(true);

// Set verbosity level for Core events.
castDebugLogger.loggerLevelByEvents = {
  'cast.framework.events.category.CORE': cast.framework.LoggerLevel.INFO,
  'cast.framework.events.EventType.MEDIA_STATUS': cast.framework.LoggerLevel.DEBUG
}

Log Messages and Custom Tags

The CastDebugLogger API allows you to create log messages that appear on the receiver debug overlay with different colors. The following log methods are available, listed in order from highest to lowest priority:

  1. castDebugLogger.error(custom_tag, message)
  2. castDebugLogger.warn(custom_tag, message)
  3. castDebugLogger.info(custom_tag, message)
  4. castDebugLogger.debug(custom_tag, message)

For each log method, the first parameter is a custom tag. This can be any identifying string that you find meaningful. The CastDebugLogger uses tags to filter the logs. Usage of tags is explained in detail further below. The second parameter is the log message.

To show logs in action, add logs to your LOAD interceptor.

playerManager.setMessageInterceptor(
  cast.framework.messages.MessageType.LOAD,
  request => {
    castDebugLogger.info(LOG_TAG, 'Intercepting LOAD request');

    // Map contentId to entity
    if (request.media && request.media.entity) {
      request.media.contentId = request.media.entity;
    }

    return new Promise((resolve, reject) => {
      // Fetch content repository by requested contentId
      makeRequest('GET', 'https://storage.googleapis.com/cpe-sample-media/content.json')
        .then(function (data) {
          let item = data[request.media.contentId];
          if(!item) {
            // Content could not be found in repository
            castDebugLogger.error(LOG_TAG, 'Content not found');
            reject();
          } else {
            // Adjusting request to make requested content playable
            request.media.contentUrl = item.stream.dash;
            request.media.contentType = 'application/dash+xml';
            castDebugLogger.warn(LOG_TAG, 'Playable URL:', request.media.contentUrl);

            // Add metadata
            let metadata = new cast.framework.messages.MovieMediaMetadata();
            metadata.metadataType = cast.framework.messages.MetadataType.MOVIE;
            metadata.title = item.title;
            metadata.subtitle = item.author;

            request.media.metadata = metadata;

            // Resolve request
            resolve(request);
          }
      });
    });
  });

You can control which messages appear on the debug overlay by setting the log level in loggerLevelByTags for each custom tag. For example, enabling a custom tag with log level cast.framework.LoggerLevel.DEBUG will display all messages added with error, warn, info, and debug log messages. Enabling a custom tag with WARNING level will only display error and warn log messages.

The loggerLevelByTags config is optional. If a custom tag is not configured for its logger level, all log messages will display on the debug overlay.

Add the following code below the CORE event logger:

// Set verbosity level for Core events.
castDebugLogger.loggerLevelByEvents = {
  'cast.framework.events.category.CORE': cast.framework.LoggerLevel.INFO,
  'cast.framework.events.EventType.MEDIA_STATUS': cast.framework.LoggerLevel.DEBUG
}

// Set verbosity level for custom tags.
castDebugLogger.loggerLevelByTags = {
    [LOG_TAG]: cast.framework.LoggerLevel.DEBUG,
};

Debug Overlay

The Cast Debug Logger provides a debug overlay on the receiver to display your custom log messages on the cast device. Use showDebugLogs to toggle the debug overlay and clearDebugLogs to clear log messages on the overlay.

Add the following code to preview the debug overlay on your receiver.

// Enable debug logger and show a 'DEBUG MODE' overlay at top left corner.
castDebugLogger.setEnabled(true);

// Show debug overlay
castDebugLogger.showDebugLogs(true);

// Clear log messages on debug overlay
// castDebugLogger.clearDebugLogs();

You now know how to create a custom receiver application using the latest Cast Receiver SDK. More sample apps can be found on GitHub at github.com/googlecast.