Live Streaming on Google Cloud with Media CDN and Live Streaming API

1. Introduction

Content Delivery Networks (CDNs) improve performance by caching frequently accessed content closer to end users, terminating connections closer to clients, re-using connections to the origin, and adopting modern networking protocols and optimizations. For users (and our customers), this means lower latency, better reliability, and reduced cost - leading to improved sales and an overall better user experience. Very few modern websites and video streaming platforms operate without a CDN today.

What you'll learn

This lab guides you through the steps to deploy a live streaming workflow with Media CDN (content delivery), the Live Streaming API (live video transcoding), Cloud Storage (storage for the transcoded video), and a video player (VLC, Google Shaka Player, or any other HLS/MPEG-DASH capable player).

We will set up the Live Streaming API components - Input, Channel - and start a live feed to the Input/Channel with FFmpeg (FFmpeg can generate a live test signal). The Live Streaming API will transcode the live feed. The transcoded video manifest and segments will be stored in a Cloud Storage bucket. Then we'll set up Media CDN with the live video Cloud Storage bucket as the origin. Finally, VLC Player will be used to play live content cached via Media CDN. We will also set up a Cloud Monitoring dashboard to visualize the activity of Media CDN.

What you'll build

In this lab we will set up the environment based on the following architecture:

de33cb3e75d52549.png

We will set up the following components and perform the following tasks as part of the lab:

  • Create a Google Cloud Storage (GCS) bucket for storing the live transcoded videos
  • Configure the Live Streaming API to transcode the live feed into an HLS stream with an HD (1280x720) rendition
  • Set up the Live Streaming components: Input/Channel
  • Start the Live Stream Channel
  • Set up Media CDN with the GCS bucket as origin
  • Set up FFmpeg to feed the live channel
  • Stream the transcoded live feed with a video player
  • Set up a Cloud Monitoring dashboard to monitor Media CDN activity (latency, cache hits, cache misses, etc.)

Note: For this lab, we assume that users have access to the Google Cloud Console and already have a project set up. We also assume that users start with a fresh environment and have nothing configured for this demo.

All configuration will be done from the command line in Cloud Shell. The components configured from the command line can always be checked in the console, and pointers to the relevant Google Cloud Console pages appear throughout the lab.

2. Before you begin

Media CDN access is restricted. To get access, contact your account team; they can create an access request on your behalf. If you are part of Google and want to test live streaming with Media CDN, contact the Media CDN PM to request access.

3. Setup and Requirements

Start Cloud Shell

While Google Cloud can be operated remotely from your laptop, in this codelab you will be using Google Cloud Shell, a command line environment running in the Cloud.

From the Google Cloud Console, click the Cloud Shell icon on the top right toolbar:

55efc1aaa7a4d3ad.png

It should only take a few moments to provision and connect to the environment. When it is finished, you should see something like this:

7ffe5cbb04455448.png

This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory, and runs on Google Cloud, greatly enhancing network performance and authentication. All of your work in this codelab can be done within a browser. You do not need to install anything.

4. Google Cloud SDK version

At the time of writing, 408.0.0 is the latest Google Cloud SDK version. All the commands in this lab were tested using that version. Before proceeding, make sure that Cloud Shell is using version 408.0.0 or later.

Checking the SDK version

We will use the gcloud version command to check the SDK version.

Command

gcloud version | grep "Google Cloud SDK"

Output Example

Google Cloud SDK 408.0.0

Next Steps

  1. If the SDK version is 408.0.0 or higher, then skip to the next section.
  2. If the SDK version is lower than 408.0.0, then run the command listed below to update the SDK.
sudo apt-get update && sudo apt-get install google-cloud-sdk

5. Prerequisites

Before we start configuring the GCP resources, we need to do the following -

  1. Setup environment variables
  2. Enable required Service APIs

1. Setup Environment Variables

Throughout this lab, we will run gcloud and curl commands with a few variables. We need to configure the following environment variables.

  • Project ID
  • Project Number
  • User Name
  • Region
  • Input ID
  • Channel ID

Project ID and User Name

These environment variables are generally pre-configured in Cloud Shell. We will use the env command to verify.

Command

env | grep -E 'DEVSHELL_PROJECT_ID=|LOGNAME'

Output Example

DEVSHELL_PROJECT_ID=<YOUR_PROJECT_ID>
LOGNAME=<YOUR_USERNAME>

Create the env_variables File

Use the cat command to create the env_variables.txt file. The command below creates env_variables.txt in the user's home directory.

Commands

cat > ~/env_variables.txt << EOF
export PROJECT_NUMBER=$(gcloud projects describe $GOOGLE_CLOUD_PROJECT --format="value(projectNumber)")
export LOCATION=us-west2
export INPUT_ID=lab-live-input
export CHANNEL_ID=lab-live-channel
EOF

Setup the Environment Variables

We will use the source command to set the environment variables.

Command

source ~/env_variables.txt

Verify that the variables are set

Let's verify that all the required environment variables are set. We should see a total of 6 environment variables in the output.

Command

env | grep -E 'DEVSHELL_PROJECT_ID=|LOGNAME|PROJECT_NUMBER|LOCATION|INPUT_ID|CHANNEL_ID'

Output Example

LOCATION=us-west2
DEVSHELL_PROJECT_ID=<YOUR_PROJECT_ID>
LOGNAME=<YOUR_USERNAME>
PROJECT_NUMBER=<YOUR_PROJECT_NUMBER>
INPUT_ID=lab-live-input
CHANNEL_ID=lab-live-channel

2. Enable Required Service APIs

We need to make sure that the following APIs are enabled in our project.

  • Network Services API
  • Certificate Manager API
  • Livestream API
  • Media CDN Edge Cache API

Enable the Network Services API

To enable the Network Services API, run the following command:

Command

gcloud services enable networkservices.googleapis.com

Enable the Certificate Manager API

To enable the Certificate Manager API, run the following command:

Command

gcloud services enable certificatemanager.googleapis.com

Enable the Live Stream API

To enable the Live Stream API, run the following command:

Command

gcloud services enable livestream.googleapis.com

Enable the Media CDN Edge Cache API

To enable the Media CDN Edge Cache API, run the following command:

Command

gcloud services enable edgecache.googleapis.com
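
If you prefer, all four APIs can be enabled in a single command; gcloud services enable accepts multiple service names:

gcloud services enable networkservices.googleapis.com \
  certificatemanager.googleapis.com \
  livestream.googleapis.com \
  edgecache.googleapis.com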

Verify that the APIs are Enabled

Run the gcloud services list command to list all the enabled APIs. We should see 4 APIs in the output.

Command

gcloud services list | grep -E 'networkservices|certificatemanager|livestream|edgecache'

Output Example

NAME: certificatemanager.googleapis.com
NAME: livestream.googleapis.com
NAME: networkservices.googleapis.com
NAME: edgecache.googleapis.com

6. Create the Cloud Storage Bucket

In this section we will do the following:

  1. Create a Cloud Storage bucket
  2. Make the bucket publicly accessible

Later in the lab, we will use this bucket to store the transcoded video files. This bucket will also act as the origin storage for the Media CDN service.

1. Create the bucket

We will use the gsutil mb command to create the bucket:

Command

gsutil mb gs://live-streaming-storage-$LOGNAME

2. Make the Bucket Publicly Accessible

We will use the gsutil iam command to grant allUsers read access to the objects in the bucket:

Command

gsutil iam ch allUsers:objectViewer gs://live-streaming-storage-$LOGNAME
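
To confirm that the policy was applied, you can read back the bucket's IAM policy; allUsers should be listed with the roles/storage.objectViewer role:

gsutil iam get gs://live-streaming-storage-$LOGNAME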

7. Setting up the Live Streaming API environment

The components of the Live Streaming API chain are architected as follows:

96b5d26aedeb89a6.png

We created the Cloud Storage bucket live-streaming-storage-$LOGNAME in the previous section. In the next two sections, we will create the following resources:

  • Live Streaming Input: An input endpoint is an endpoint to which your encoder sends your input stream. You can use the input endpoint to specify configurations for your stream, such as input resolution, input type, and video cropping.
  • Live Streaming Channel: A channel is a resource that ingests the input stream through an input endpoint, transcodes the input stream into multiple renditions, and publishes output live streams in certain formats in the specified location. You can include a primary and backup input stream in the same channel.

We will create the following resources later in the lab:

  • Encoder: An encoder is a program used to send input streams. In this lab, we will use FFmpeg.
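
If at any point you want to list the Live Streaming inputs or channels that exist in your project, you can query the standard list endpoints (a sketch, using the same variables and authentication pattern as the curl commands later in the lab):

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/inputs"

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels"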

8. Create and Configure the Input Endpoint

Create the input.json File

We will create an input.json file to specify the live stream input type. In this lab, we use an RTMP push input.

Command

cat > input.json << EOF
{
  "type": "RTMP_PUSH"
}
EOF

Create the Input Endpoint

As of the writing of this lab, there is no gcloud support for the Live Stream API, so we will use curl to make the API calls.

Command

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @input.json \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/inputs?inputId=$INPUT_ID"

Output Example

{
  "name": "projects/PROJECT_NUMBER/locations/us-west2/operations/operation-1661405972853-5e70a38d6f27f-79100d00-310671b4",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.OperationMetadata",
    "createTime": "2022-08-25T05:39:32.884030164Z",
    "target": "projects/PROJECT_NUMBER/locations/us-west2/inputs/lab-live-input",
    "verb": "create",
    "requestedCancellation": false,
    "apiVersion": "v1"
  },
  "done": false
}

The output has a lot of useful information, but at this time, we need to focus on two fields:

  • Operation ID: From the output, copy and note the operation ID. It is the last segment of the output line starting with "name"; in the example above it is "operation-1661405972853-5e70a38d6f27f-79100d00-310671b4".
  • Status: We need to wait for the status to change from "done": false to "done": true
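
If you prefer not to copy the ID by hand, the Live Stream API also exposes a standard operations list endpoint. A sketch (assuming jq, which is preinstalled in Cloud Shell) that prints the operation names so you can pick the most recent one:

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/operations" | jq -r '.operations[].name'

The operation ID is the last segment of each name.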

Check the Status

Before we proceed further, we need to check that the input endpoint is created successfully and is ready.

In the command below, replace <OPERATION> with the ID of the Operation we just got above. In this example it is "operation-1661405972853-5e70a38d6f27f-79100d00-310671b4".

Command

export OPERATION_ID_1=<OPERATION>

Command

curl -X GET \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/operations/$OPERATION_ID_1"

Output Example

{
  "name": "projects/PROJECT_NUMBER/locations/us-west2/operations/operation-1661408816982-5e70ae25cea49-617844f0-8fdcb4a1",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.OperationMetadata",
    "createTime": "2022-08-25T06:26:57.001530499Z",
    "endTime": "2022-08-25T06:26:57.043623522Z",
    "target": "projects/PROJECT_NUMBER/locations/us-west2/inputs/lab-live-input",
    "verb": "create",
    "requestedCancellation": false,
    "apiVersion": "v1"
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.Input",
    "name": "projects/PROJECT_ID/locations/us-west2/inputs/lab-live-input",
    "createTime": "2022-08-25T06:26:56.997623672Z",
    "updateTime": "2022-08-25T06:26:56.997623672Z",
    "type": "RTMP_PUSH",
    "uri": "rtmp://34.94.97.220/live/4b7846a1-4a67-44ed-bfd0-d98281b6464a",
    "tier": "HD"
  }
}

Re-run the command until you see "done": true, indicating the Input Endpoint was created and is ready.
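
Instead of re-running the command manually, you can poll the operation until it is done. A minimal sketch, assuming jq is available (it is preinstalled in Cloud Shell):

until curl -s -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
  "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/operations/$OPERATION_ID_1" \
  | jq -e '.done == true' > /dev/null; do
  echo "Input endpoint not ready yet, waiting 10 seconds..."
  sleep 10
done
echo "Input endpoint created."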

Save the URI

We will use the URI from the previous output later in the lab. At this time, let's set an environment variable for the URI.

Command

export URI=<uri>

Replace <uri> with the uri value from the previous output. Optionally, you can use the GET method to retrieve the URI:

Command

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/inputs/$INPUT_ID" | jq .uri
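
If that prints the URI correctly, you can also combine the two steps and set the variable in one go (a sketch using jq -r to strip the surrounding quotes):

export URI=$(curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/inputs/$INPUT_ID" | jq -r .uri)
echo $URI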

9. Create & Configure the Live Streaming Channel

Let's create the Live Streaming Channel, associated with the input endpoint we just created in the previous section. The following example creates a channel generating an HLS live stream that consists of a single, high-definition (1280x720) rendition. The Channel will be associated with the input endpoint, and the storage bucket that we created previously.

Create the channel.json file

In the Cloud Shell terminal, type the following command to create a "channel.json" file:

Command

cat > channel.json << EOF
{
  "inputAttachments": [
    {
      "key": "my-input",
      "input": "projects/$PROJECT_NUMBER/locations/$LOCATION/inputs/$INPUT_ID"
    }
  ],
  "output": {
    "uri": "gs://live-streaming-storage-$LOGNAME"
  },
  "elementaryStreams": [
    {
      "key": "es_video",
      "videoStream": {
        "h264": {
          "profile": "high",
          "widthPixels": 1280,
          "heightPixels": 720,
          "bitrateBps": 3000000,
          "frameRate": 30
        }
      }
    },
    {
      "key": "es_audio",
      "audioStream": {
        "codec": "aac",
        "channelCount": 2,
        "bitrateBps": 160000
      }
    }
  ],
  "muxStreams": [
    {
      "key": "mux_video_ts",
      "container": "ts",
      "elementaryStreams": ["es_video", "es_audio"],
      "segmentSettings": { "segmentDuration": "2s" }
    }
  ],
  "manifests": [
    {
      "fileName": "main.m3u8",
      "type": "HLS",
      "muxStreams": [
        "mux_video_ts"
      ],
      "maxSegmentCount": 5
    }
  ]
}
EOF

Create the Channel

Run the following curl command to create the Channel:

Command

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @channel.json \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels?channelId=$CHANNEL_ID"

Output Example

{
  "name": "projects/PROJECT_NUMBER/locations/us-west2/operations/operation-1661405972853-5e70a38d6f27f-79100d00-310671b4",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.OperationMetadata",
    "createTime": "2022-08-25T05:39:32.884030164Z",
    "target": "projects/PROJECT_NUMBER/locations/us-west2/channels/lab-live-channel",
    "verb": "create",
    "requestedCancellation": false,
    "apiVersion": "v1"
  },
  "done": false
}

Note and copy the operation ID; we will need it in one of the upcoming steps. It can be found on the output line starting with "name".

Check the Status

Before we proceed further, we need to check that the channel is created successfully and is ready.

In the command below, replace <OPERATION> with the ID of the Operation we just got above. In this example it is operation-1661405972853-5e70a38d6f27f-79100d00-310671b4

Command

export OPERATION_ID_2=<OPERATION>

Command

curl -s -X GET \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/operations/$OPERATION_ID_2"

Output Example

  "name": "projects/PROJECT_NUMBER/locations/us-west2/operations/operation-1668666801461-5eda4c3f31852-a4d2229f-0efeef9e",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.OperationMetadata",
    "createTime": "2022-11-17T06:33:21.500841488Z",
    "endTime": "2022-11-17T06:33:21.529311112Z",
    "target": "projects/PROJECT_NUMBER/locations/us-west2/channels/lab-live-channel",
    "verb": "create",
    "requestedCancellation": false,
    "apiVersion": "v1"
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.Channel",
    "name": "projects/PROJECT_NAME/locations/us-west2/channels/lab-live-channel",
    "createTime": "2022-11-17T06:33:21.497818033Z",
    "updateTime": "2022-11-17T06:33:21.497818033Z",
    "activeInput": "my-input",
    "output": {
      "uri": "gs://live-streaming-storage-LOGNAME"
    },
    "elementaryStreams": [
      {
        "videoStream": {
          "h264": {
            "widthPixels": 1280,
            "heightPixels": 720,
            "frameRate": 30,
            "bitrateBps": 3000000,
            "gopDuration": "2s",
            "vbvSizeBits": 3000000,
            "vbvFullnessBits": 2700000,
            "entropyCoder": "cabac",
            "profile": "high"
          }
        },
        "key": "es_video"
      },
      {
        "audioStream": {
          "codec": "aac",
          "bitrateBps": 160000,
          "channelCount": 2,
          "sampleRateHertz": 48000
        },
        "key": "es_audio"
      }
    ],
    "muxStreams": [
      {
        "key": "mux_video_ts",
        "container": "ts",
        "elementaryStreams": [
          "es_video",
          "es_audio"
        ],
        "segmentSettings": {
          "segmentDuration": "2s"
        }
      }
    ],
    "manifests": [
      {
        "fileName": "main.m3u8",
        "type": "HLS",
        "muxStreams": [
          "mux_video_ts"
        ],
        "maxSegmentCount": 5,
        "segmentKeepDuration": "60s"
      }
    ],
    "streamingState": "STOPPED",
    "inputAttachments": [
      {
        "key": "my-input",
        "input": "projects/PROJECT_NUMBER/locations/us-west2/inputs/lab-live-input"
      }
    ],
    "logConfig": {
      "logSeverity": "OFF"
    }
  }
}

Re-run the command until you see "done": true, indicating the Channel was created and is ready.

Note that the "streamingState" at the moment is "STOPPED" ; we will start the channel in the next section.

10. Start the Live Streaming Channel

Now that we have created our Live Stream channel, let's start the channel. In this section, we will do the following:

  1. Start the Live Streaming channel
  2. Check the status of the channel; we need to make sure that the streamingState is "AWAITING_INPUT"

1. Start the Channel

In Cloud Shell, run the following curl command to start the channel:

Command

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d "" \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels/$CHANNEL_ID:start"

Output Example

{
  "name": "projects/PROJECT_NUMBER/locations/LOCATION/operations/operation-1661405972853-5e70a38d6f27f-79100d00-310671b4",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.video.livestream.v1.OperationMetadata",
    "createTime": "2022-08-25T05:39:32.884030164Z",
    "target": "projects/PROJECT_NUMBER/locations/us-west2/channels/lab-live-channel",
    "verb": "start",
    "requestedCancellation": false,
    "apiVersion": "v1"
  },
  "done": false
}

2. Check the Status of the Channel

Run the following curl command to get the status of the Channel:

Command

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels/$CHANNEL_ID" | grep "streamingState"

Output Example

"streamingState": "AWAITING_INPUT",

Re-run the command until you see "AWAITING_INPUT" indicating the Channel is running and is ready to receive a signal.

11. Configure Media CDN

In this section we will deploy Media CDN - the CDN infrastructure. We will create the following resources:

  1. Edge Cache Origin
  2. Edge Cache Service

1. Create an Edge Cache Origin

An Edge Cache Origin represents a content location, such as a Cloud Storage bucket, a third-party storage location, or a load balancer. In CDN terms, the origin (or origin server) is where the source content we want to distribute lives - e.g. CSS, JavaScript, HTML, images, video segments, etc. - before it is distributed to the edge cache servers. For this lab, we will create an Edge Cache Origin named cme-origin that maps to the Cloud Storage bucket we created at the beginning of the lab.

We will use the gcloud edge-cache origins create command to create the origin. The command will take a few minutes to complete.

Command

gcloud edge-cache origins create cme-origin \
--origin-address="gs://live-streaming-storage-$LOGNAME"

Example Output

Create request issued for: cme-origin
Waiting for operation [projects/my-project/locations/global/operations/operation-1612121774168-5ba3759af1919-
3fdcd7b1-99f59223] to complete...done
Created origin cme-origin
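
You can check the origin configuration at any time with the describe command:

gcloud edge-cache origins describe cme-origin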

2. Create an Edge Cache Service

Now that we have an Edge Cache Origin set up, we can create the Edge Cache Service itself.

Create the cme-demo.yaml file

The Edge Cache Service configuration is done via a YAML file. In Cloud Shell, create a local file called cme-demo.yaml. Use vi, nano, or any other editor and paste the following lines into the file:

name: cme-demo
routing:
  hostRules:
    - hosts:
        - demo.cme.com
      pathMatcher: routes
  pathMatchers:
    - name: routes
      routeRules:
        - headerAction:
            responseHeadersToAdd:
              - headerName: x-cache-status
                headerValue: "{cdn_cache_status}"
          matchRules:
            - prefixMatch: /
          origin: cme-origin
          priority: 100
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 3600s
              signedRequestMode: DISABLED
        - headerAction:
            responseHeadersToAdd:
              - headerName: x-cache-status
                headerValue: "{cdn_cache_status}"
          matchRules:
            - pathTemplateMatch: /**.m3u8
          origin: cme-origin
          priority: 25
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 1s
              signedRequestMode: DISABLED
        - headerAction: {}
          matchRules:
            - pathTemplateMatch: /**.ts
          origin: cme-origin
          priority: 50
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 2s
              signedRequestMode: DISABLED

We will leave all other Edge Cache Service configuration settings at their defaults. In the file above there are 3 fields you might want to update:

  • name: the name of the Media CDN service - here cme-demo.
  • hosts: the list of host names this Media CDN service will accept requests for - here demo.cme.com. Later in the lab we will also add the IP address of the Media CDN service to this list.
  • origin: the Edge Cache Origin we created in the previous step - here cme-origin.

For more information on the different variables you can use in the YAML file, see the Edge Cache Service configuration guide.

Create the Edge Cache Service

We will create an Edge Cache Service named cme-demo, backed by the Edge Cache Origin cme-origin, with host demo.cme.com. To create the service run the following command in Cloud Shell:

Command

gcloud edge-cache services import cme-demo \
    --source=cme-demo.yaml

It may take a few minutes to create the Edge Cache Service.

Output Example

Request issued for: [cme-demo]
Waiting for operation [projects/PROJECT_ID/locations/global/operations/operation-1670476252264-5ef4a0f9f36ce-dd380af5-321be9a0] to complete...done.     
createTime: '2022-12-07T18:08:54.403446942Z'
ipv4Addresses:
- 34.104.35.152
ipv6Addresses:
- '2600:1900:4110:d18::'
name: projects/PROJECT_ID/locations/global/edgeCacheServices/cme-demo
routing:
  hostRules:
  - hosts:
    - demo.cme.com
    - 34.104.35.152
    pathMatcher: routes
  pathMatchers:
  - name: routes
    routeRules:
    - headerAction:
        responseHeadersToAdd:
        - headerName: x-cache-status
          headerValue: '{cdn_cache_status}'
      matchRules:
      - prefixMatch: /
      origin: projects/123456789/locations/global/edgeCacheOrigins/cme-origin
      priority: '100'
      routeAction:
        cdnPolicy:
          cacheKeyPolicy: {}
          cacheMode: FORCE_CACHE_ALL
          defaultTtl: 3600s
          signedRequestMode: DISABLED
    - headerAction:
        responseHeadersToAdd:
        - headerName: x-cache-status
          headerValue: '{cdn_cache_status}'
      matchRules:
      - pathTemplateMatch: /**.m3u8
      origin: projects/123456789/locations/global/edgeCacheOrigins/cme-origin
      priority: '25'
      routeAction:
        cdnPolicy:
          cacheKeyPolicy: {}
          cacheMode: FORCE_CACHE_ALL
          defaultTtl: 1s
          signedRequestMode: DISABLED
    - headerAction: {}
      matchRules:
      - pathTemplateMatch: /**.ts
      origin: projects/123456789/locations/global/edgeCacheOrigins/cme-origin
      priority: '50'
      routeAction:
        cdnPolicy:
          cacheKeyPolicy: {}
          cacheMode: FORCE_CACHE_ALL
          defaultTtl: 2s
          signedRequestMode: DISABLED
updateTime: '2022-12-08T05:11:31.598744308Z'

Note and copy the ipv4Addresses value of the Edge Cache Service instance - here 34.104.35.152. We'll use it to update the cme-demo.yaml file and later to stream the transcoded video.
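
You can also retrieve the IP programmatically and keep it in an environment variable for the next steps (a sketch; the field name matches the ipv4Addresses field in the output above):

export EDGE_CACHE_IP=$(gcloud edge-cache services describe cme-demo --format="value(ipv4Addresses[0])")
echo $EDGE_CACHE_IP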

Update the Edge Cache Service

At this point it is a good idea to update the Edge Cache Service configuration so that we can use the IP address of the service to stream the video later on. The hosts list in the Edge Cache Service YAML file enumerates all host names/IPs for which the service will accept requests. So far we have only specified demo.cme.com as a host. To provide name resolution for this domain, you could configure a DNS zone; an easier solution is to add the IP address to the hosts list in the YAML file. Edit the YAML file so that it looks like the one below, replacing IPADDRESS with the IP of your Edge Cache Service:

name: cme-demo
routing:
  hostRules:
    - hosts:
        - demo.cme.com
        - IPADDRESS
      pathMatcher: routes
  pathMatchers:
    - name: routes
      routeRules:
        - headerAction:
            responseHeadersToAdd:
              - headerName: x-cache-status
                headerValue: "{cdn_cache_status}"
          matchRules:
            - prefixMatch: /
          origin: cme-origin
          priority: 100
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 3600s
              signedRequestMode: DISABLED
        - headerAction:
            responseHeadersToAdd:
              - headerName: x-cache-status
                headerValue: "{cdn_cache_status}"
          matchRules:
            - pathTemplateMatch: /**.m3u8
          origin: cme-origin
          priority: 25
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 1s
              signedRequestMode: DISABLED
        - headerAction: {}
          matchRules:
            - pathTemplateMatch: /**.ts
          origin: cme-origin
          priority: 50
          routeAction:
            cdnPolicy:
              cacheKeyPolicy: {}
              cacheMode: FORCE_CACHE_ALL
              defaultTtl: 2s
              signedRequestMode: DISABLED
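
If you exported the IP into EDGE_CACHE_IP earlier, you can substitute the IPADDRESS placeholder with sed instead of editing the file by hand (a sketch):

sed -i "s/IPADDRESS/$EDGE_CACHE_IP/" cme-demo.yaml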

To reflect the changes we just need to re-import the YAML file. In the Cloud Shell terminal run the following command:

Command

gcloud edge-cache services import cme-demo \
    --source=cme-demo.yaml

Check the output of the command and verify that the IP appears in the list of hosts.

At this point the Edge Cache Service instance will accept requests with either "demo.cme.com" or the IP address as the host.

12. Generate the Input Signal

Now that we have configured all the required services, let's generate the live stream input signal. In this section, we will do the following:

  1. Install FFmpeg, a free open-source software
  2. Send a test live signal to the Input/Channel

1. Install FFmpeg

FFmpeg is a free and open-source software project consisting of a suite of libraries and programs for handling video, audio, and other multimedia files and streams. In the Cloud Shell terminal, use the following command to install FFmpeg:

Command

sudo apt install ffmpeg -y

When the installation is done, verify that FFmpeg was properly installed by checking its version:

Command

ffmpeg -version

Output Example

ffmpeg version 4.3.4-0+deb11u1 Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 10 (Debian 10.2.1-6)
...

FFmpeg was properly installed.

2. Start the Live Stream Signal to the Input/Channel

Now that FFmpeg is installed, we will send a test input stream to the input endpoint to generate the live stream.

In the Cloud Shell terminal run the following command, using the URI environment variable that we created in the "Create and Configure the Input Endpoint" section.

Command

ffmpeg -re -f lavfi -i "testsrc=size=1280x720 [out0]; sine=frequency=500 [out1]" \
  -acodec aac -vcodec h264 -f flv $URI

You should see FFmpeg sending the test live signal. The command will not return the prompt. The signal will be generated until you stop it. You will need to open a new Cloud Shell window for the remainder of the lab.

13. Open New Cloud Shell

At this point you will need to open a new Cloud Shell window to continue the lab, since FFmpeg will keep running (and keep generating the live signal) until you stop it with <CTRL+C>.

Click on the "+" sign next to the name of the current Cloud Shell terminal. It will open an additional Cloud Shell window.

b3c7b0be6276c194.png

Run the rest of the lab in the newly opened Cloud Shell window.

Setup the Environment Variables

Since this is a new Cloud Shell session, we need to set the environment variables again with the source command.

Command

source ~/env_variables.txt

Verify that the variables are set

Let's verify that all the required environment variables are set. We should see a total of 6 environment variables in the output.

Command

env | grep -E 'DEVSHELL_PROJECT_ID=|LOGNAME|PROJECT_NUMBER|LOCATION|INPUT_ID|CHANNEL_ID'

Output Example

LOCATION=us-west2
DEVSHELL_PROJECT_ID=<YOUR_PROJECT_ID>
LOGNAME=<YOUR_USERNAME>
PROJECT_NUMBER=<YOUR_PROJECT_NUMBER>
INPUT_ID=lab-live-input
CHANNEL_ID=lab-live-channel

14. Verify the live signal is getting transcoded

We will run a curl command to describe the channel. In the output we should see that the streamingState has changed from "AWAITING_INPUT" to "STREAMING".

Command

curl -s -X GET -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels/$CHANNEL_ID" | grep "streamingState"

In the JSON response you should see "streamingState": "STREAMING", indicating that the Channel is streaming and the live signal is being transcoded.

Let's also verify the content of the bucket, where we should see a manifest file and several TS video segments. Run the following command in Cloud Shell to list the contents of the bucket we created at the beginning of the lab, which the Live Streaming API uses to write the transcoded manifest and TS video segments:

Command

gcloud storage ls --recursive gs://live-streaming-storage-$LOGNAME/**

Output Example

gs://live-streaming-storage-$LOGNAME/
gs://live-streaming-storage-$LOGNAME/main.m3u8
gs://live-streaming-storage-$LOGNAME/mux_video_ts/index-1.m3u8
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000016.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000017.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000018.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000019.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000020.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000021.ts
gs://live-streaming-storage-$LOGNAME/mux_video_ts/segment-0000000022.ts
...

You should see:

  • the HLS manifest file: main.m3u8
  • The corresponding TS video segments: a series of numbered files segment-000000000X.ts

At this point we're done with the following:

  • Live Streaming API: the live signal is generated and transcoded into a bucket via the Live Streaming API
  • Media CDN: Media CDN is configured with the live streaming storage bucket as its origin.

In the next sections, we will validate the Edge Cache Service and then we will stream the transcoded video using the Media CDN anycast IP address.

15. Verify that the Edge Cache Service instance works

In this section we will verify that the Edge Cache Service instance works as expected. To do so, we will access a file from the Edge Cache Service instance using its IP address. The first time an object is accessed it is not cached yet, so we should observe a cache MISS; for that first request the object is read from the origin and cached at the edge. All subsequent requests for the same file return a cache HIT, since the object is now cached at the edge. Let's verify this behavior:

Run the following curl command in Cloud Shell to access the transcoded video manifest file that is stored in the Edge Cache Origin:

Command

curl -svo /dev/null --resolve demo.cme.com:80:<Replace_With_Edge_Cache_IP> \
"http://demo.cme.com/main.m3u8"

Notice the --resolve option, which maps the host name demo.cme.com to the IP address of the Edge Cache Service instance. Make sure that you use demo.cme.com:80:<IP>, where <IP> is the IP address of the Edge Cache Service instance we just created.

Look for the x-cache-status header in the output.

Output Example

Added demo.cme.com:80:34.104.35.152 to DNS cache
* Hostname demo.cme.com was found in DNS cache
*   Trying 34.104.35.152:80...
* Connected to demo.cme.com (34.104.35.152) port 80 (#0)
> GET /main.m3u8 HTTP/1.1
> Host: demo.cme.com
> User-Agent: curl/7.74.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< x-guploader-uploadid: ADPycdtKtflWt4Kha5YxXNNRwO-Eu6fGSPs-T-XY4HJmNMo46VJyWlD4EAk-8a6SegxjWq3o1gTPqZbpkU_sjW__HPAdDw
< date: Wed, 07 Dec 2022 18:23:46 GMT
< last-modified: Wed, 07 Dec 2022 18:23:45 GMT
< etag: "6bff620ccca4a9849ba4e17fa7c521fb"
< x-goog-generation: 1670437425805400
< x-goog-metageneration: 1
< x-goog-stored-content-encoding: identity
< x-goog-stored-content-length: 193
< content-type: application/x-mpegURL
< x-goog-hash: crc32c=sPO3zw==
< x-goog-hash: md5=a/9iDMykqYSbpOF/p8Uh+w==
< x-goog-storage-class: STANDARD
< accept-ranges: bytes
< content-length: 193
< server: Google-Edge-Cache
< x-request-id: fd25285b-fc1a-4fd4-981a-c50ead2c85ed
< x-xss-protection: 0
< x-frame-options: SAMEORIGIN
< x-cache-status: den;miss
< cache-control: public,max-age=3600
<
{ [193 bytes data]
* Connection #0 to host demo.cme.com left intact

Notice the cache miss as the object hasn't been cached yet and is read from the origin.

Now we will make multiple requests for the m3u8 file, and if everything is configured correctly, Media CDN should start serving the content from its cache. The command below will make 10 curl requests and only print the x-cache-status header.

Command

for i in {1..10};do curl -Is --resolve demo.cme.com:80:<Replace_With_Edge_Cache_IP> "http://demo.cme.com/main.m3u8" | grep x-cache-status;done

The output should be a mix of cache hits and misses. If you see cache hits in the output, then Media CDN is working as expected.

Output Example

x-cache-status: den;miss
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit
x-cache-status: den;hit

Notice the cache hits, as the object is now cached at the edge. The Edge Cache Service is working as expected.

16. Stream transcoded live signal video with VLC

This is the part where we connect the dots and link all the steps we've been working on so far:

  • We created a bucket called live-streaming-storage-$LOGNAME that receives the result of the live signal transcoded to HLS content by the Live Streaming API.
  • We set up the Live Streaming API.
  • We started an RTMP live signal with FFmpeg that feeds the Live Streaming API Input/Channel.
  • We verified that the live signal was fed to the Channel and verified the Channel was in streaming mode.
  • We verified that the resulting transcoded files (manifest + TS segments) were generated and stored in the bucket live-streaming-storage-$LOGNAME.
  • An Edge Cache Origin called cme-origin was set up with GCS bucket live-streaming-storage-$LOGNAME as the origin.
  • An Edge Cache Service instance called cme-demo was set up with cme-origin as its origin.
  • We verified the behavior (cache miss, cache hit) of the Edge Cache Service instance.

We're now at a point where we can use a video player to stream the transcoded live signal via the Media CDN cache. For that we'll use VLC Player, a free and open-source cross-platform multimedia player and framework that plays most multimedia files, including adaptive media formats such as DASH and HLS. With adaptive streaming, the player adapts the quality of the video it plays to the quality of your network connection and the available bandwidth. In the transcoding configuration we used for this lab, we generated a single HD (1280x720) HLS rendition, so the player will simply play that rendition.

We will stream the transcoded live signal over HLS (HTTP Live Streaming, Apple's widely supported streaming format). The entry point is main.m3u8, the HLS manifest, which points to the TS video segments.

To use the VLC Player, go to https://www.videolan.org/vlc/ and download a version of the player for your laptop operating system - VLC is available for Windows, macOS, Linux, Android and iOS.

2a2d19abe728d222.png

Install the player on your laptop and launch it. We will use the macOS version of the player for the next few steps.

In order to play a video, go to "File" / "Open Network":

f85565301f7c68dc.png

Set it up with:

  • URL: http://<Replace_With_Edge_Cache_IP>/main.m3u8. This is the URL of the video we want to stream. Notice:
  • The IP of the Media CDN instance: replace <Replace_With_Edge_Cache_IP> with the IP of the Edge Cache Service you deployed (34.104.35.152 in our example output).
  • The path to the manifest file: "/". The transcoded live signal files are stored at the root of the live-streaming-storage-$LOGNAME bucket, so the path is simply "/".
  • The name of the manifest file: main.m3u8, the HLS manifest.

And click "Open". You should see the transcoded live video starting to play. The video will look like the screenshot below. The counter on-screen will run in increments of 1 and you should be able to hear a continuous beep.

It is a basic RTMP test live signal generated by FFmpeg, transcoded to HLS by the Live Streaming API and served via the Media CDN cache:

28fc359b49d44ec2.png

You can use any other HLS-capable player if you wish to do so. Here is one you might want to consider:

  • QuickTime Player - installed by default on Macs. The steps are the same: open a network connection to http://<Replace_With_Edge_Cache_IP>/main.m3u8, replacing the placeholder with the IP address of your Edge Cache Service instance.
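
If you prefer a quick command-line check instead of a GUI player, ffprobe (installed alongside FFmpeg earlier in the lab) can read the HLS manifest served by Media CDN and print the streams it finds (a sketch; replace the placeholder with the IP of your Edge Cache Service instance):

ffprobe -hide_banner -i "http://<Replace_With_Edge_Cache_IP>/main.m3u8"

You should see the h264 video stream and the aac audio stream listed.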

17. Monitoring Media CDN

A Media CDN dashboard template was created by the SME team - https://gist.github.com/elithrar/1c511d00f5cd3736fb2a3897867209c1.

To install it, run the following commands in the Cloud Shell window:

Download the YAML file:

curl https://gist.githubusercontent.com/elithrar/1c511d00f5cd3736fb2a3897867209c1/raw/3cb70855304f29e5c06b8d63063196354db0dec3/media-edge-20210208-dashboard --output media-edge-20210208-dashboard.yaml

Create the dashboard for Cloud Monitoring:

gcloud monitoring dashboards create --config-from-file media-edge-20210208-dashboard.yaml
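
To confirm the dashboard was created, you can list dashboards and filter on the display name (this mirrors the filter used later in the cleanup section):

gcloud monitoring dashboards list --filter="displayName:Media Edge Metrics" --format="value(name)"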

It might take a few minutes to set up. Go to the Google Cloud Console and open the navigation menu (three bars) > Operations > Monitoring > Dashboards. You should see a dashboard called "Media Edge Metrics". Click on it to see the metrics:

d0821d84a88a928d.png

18. Clean up the Lab Environment

Congratulations on completing the lab. In this section we will delete all the resources that we created throughout the lab.

Stop the FFmpeg signal:

Hit <CTRL+C> on the Cloud Shell terminal in which FFmpeg is running.

Stop the Live Streaming Channel:

Command

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d "" \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels/$CHANNEL_ID:stop"

Delete the Live Streaming Channel:

Command

curl -X DELETE -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) "https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/channels/$CHANNEL_ID"

Delete the Live Streaming Input Endpoint:

Command

curl -X DELETE \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
"https://livestream.googleapis.com/v1/projects/$PROJECT_NUMBER/locations/$LOCATION/inputs/$INPUT_ID"

Delete the GCS bucket:

Command

gsutil rm -r gs://live-streaming-storage-$LOGNAME

Delete the Edge Cache Service Instance:

Command

gcloud edge-cache services delete cme-demo

Confirm the deletion by entering "Y" when prompted.

Delete the Edge Cache Origin:

Command

gcloud edge-cache origins delete cme-origin

Confirm the deletion by entering "Y" when prompted.

Delete the Custom Dashboard

Command

gcloud monitoring dashboards delete $(gcloud monitoring dashboards list --filter="displayName:Media Edge Metrics" --format="value(name)")