1. Introduction
Cloud Run allows you to run stateless containers in a fully managed environment. It is built from open-source Knative, letting you choose to run your containers either fully managed with Cloud Run, or in your Google Kubernetes Engine cluster with Cloud Run for Anthos.
Eventarc makes it easy to connect Cloud Run services with events from a variety of sources. It allows you to build event-driven architectures in which microservices are loosely coupled and distributed. It also takes care of event ingestion, delivery, security, authorization, and error-handling for you which improves developer agility and application resilience.
Check out the Trigger Cloud Run with events from Eventarc codelab for an introduction to Eventarc and its long-term vision.
In this codelab, you will use Eventarc for Cloud Run for Anthos to read events from Cloud Pub/Sub and Cloud Audit Logs and pass them to Cloud Run for Anthos.
What you'll learn
- Create a Cloud Run for Anthos sink
- Create a Pub/Sub trigger for Cloud Run for Anthos
- Create an Audit Logs trigger for Cloud Run for Anthos
- Create a Kubernetes service sink
- Create a Pub/Sub trigger for a Kubernetes service
2. Setup and requirements
Self-paced environment setup
- Sign-in to the Google Cloud Console and create a new project or reuse an existing one. If you don't already have a Gmail or Google Workspace account, you must create one.
- The Project name is the display name for this project's participants. It is a character string not used by Google APIs, and you can update it at any time.
- The Project ID must be unique across all Google Cloud projects and is immutable (cannot be changed after it has been set). The Cloud Console auto-generates a unique string; usually you don't care what it is. In most codelabs, you'll need to reference the Project ID (typically identified as PROJECT_ID), so if you don't like it, generate another random one, or try your own and see if it's available. It's "frozen" after the project is created.
- There is a third value, a Project Number, which some APIs use. Learn more about all three of these values in the documentation.
- Next, you'll need to enable billing in the Cloud Console in order to use Cloud resources/APIs. Running through this codelab shouldn't cost much, if anything at all. To shut down resources so you don't incur billing beyond this tutorial, follow any "clean-up" instructions found at the end of the codelab. New users of Google Cloud are eligible for the $300 USD Free Trial program.
Start Cloud Shell
While Google Cloud can be operated remotely from your laptop, in this codelab you will be using Google Cloud Shell, a command line environment running in the Cloud.
From the GCP Console click the Cloud Shell icon on the top right toolbar:
It should only take a few moments to provision and connect to the environment. When it is finished, you should see something like this:
This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory, and runs on Google Cloud, greatly enhancing network performance and authentication. All of your work in this lab can be done with simply a browser.
3. Before you begin
Before creating an event sink and event triggers, go through some setup steps to enable APIs, set regions, create service accounts and so on.
Enable APIs
Inside Cloud Shell, make sure that your project ID is set up:
gcloud config set project [YOUR-PROJECT-ID]
PROJECT_ID=$(gcloud config get-value project)
Enable all necessary services:
gcloud services enable run.googleapis.com
gcloud services enable eventarc.googleapis.com
gcloud services enable logging.googleapis.com
gcloud services enable cloudbuild.googleapis.com
You also need to enable Data Access Audit Logs (Admin read, Data read, Data write) for all Google Cloud services from which you intend to receive events. In a later step, we will show you how to enable audit logs for Google Cloud Storage.
Set region and platform
Eventarc is available in multiple Google Cloud regions and globally. You can see the list of regions with this command:
gcloud eventarc locations list
In Cloud Shell, set the Cloud Run region to one of the supported regions, the Cloud Run platform to gke, and the location of the Eventarc trigger:
CLUSTER_NAME=events-cluster
CLUSTER_LOCATION=us-central1
gcloud config set run/cluster $CLUSTER_NAME
gcloud config set run/cluster_location $CLUSTER_LOCATION
gcloud config set run/platform gke
gcloud config set eventarc/location $CLUSTER_LOCATION
You can check that the configuration is set:
gcloud config list

...
[eventarc]
location = us-central1
[run]
cluster = events-cluster
cluster_location = us-central1
platform = gke
Create a GKE cluster
Create a GKE cluster with the Cloud Run for Anthos add-on and also with Workload Identity (WI) enabled. WI is the recommended way to access Google Cloud services from applications running within GKE due to its improved security properties and manageability. It is needed to properly set up the Event Forwarder of Eventarc. See Using Workload Identity for more details.
gcloud beta container clusters create $CLUSTER_NAME \
  --addons=HttpLoadBalancing,HorizontalPodAutoscaling,CloudRun \
  --machine-type=n1-standard-4 \
  --enable-autoscaling --min-nodes=3 --max-nodes=10 \
  --no-issue-client-certificate --num-nodes=3 \
  --logging=SYSTEM,WORKLOAD \
  --monitoring=SYSTEM \
  --scopes=cloud-platform,logging-write,monitoring-write,pubsub \
  --zone us-central1 \
  --release-channel=rapid \
  --workload-pool=$PROJECT_ID.svc.id.goog
Optional: Authenticate to Google Cloud
This is optional for the codelab, but if you intend to deploy apps that use Google Cloud APIs, you need to make sure they can authenticate to Google Cloud using Workload Identity. To do this, configure a Kubernetes service account to act as a Google service account.
Allow the default Kubernetes service account to impersonate the default Google compute service account by creating an IAM policy binding the two:
PROJECT_NUMBER="$(gcloud projects describe $(gcloud config get-value project) --format='value(projectNumber)')"

gcloud iam service-accounts add-iam-policy-binding \
  --role roles/iam.workloadIdentityUser \
  --member "serviceAccount:$PROJECT_ID.svc.id.goog[default/default]" \
  $PROJECT_NUMBER-compute@developer.gserviceaccount.com
Add the iam.gke.io/gcp-service-account annotation to the Kubernetes service account, using the email address of the default Google compute service account:
kubectl annotate serviceaccount \
  --namespace default \
  default \
  iam.gke.io/gcp-service-account=$PROJECT_NUMBER-compute@developer.gserviceaccount.com
Enable GKE destinations in Eventarc
Enable GKE destinations in Eventarc by creating a service account and binding the required roles with this command:
gcloud eventarc gke-destinations init
Configure a service account
Create another service account with the roles/pubsub.subscriber and roles/monitoring.metricWriter roles. This is the minimum needed for Pub/Sub triggers. If you intend to use Audit Logs triggers as well, you also need the roles/eventarc.eventReceiver role:
TRIGGER_GSA=eventarc-crfa-triggers
gcloud iam service-accounts create $TRIGGER_GSA

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member "serviceAccount:$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/pubsub.subscriber"

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member "serviceAccount:$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/monitoring.metricWriter"

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member "serviceAccount:$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/eventarc.eventReceiver"
You will use this service account in the Pub/Sub and Audit Log triggers later.
4. Event discovery
You can discover what the registered sources are, the types of events they can emit, and how to configure triggers in order to consume them.
To see the list of different types of events:
gcloud beta eventarc attributes types list

NAME                                           DESCRIPTION
google.cloud.audit.log.v1.written              Cloud Audit Log written
google.cloud.pubsub.topic.v1.messagePublished  Cloud Pub/Sub message published
To get more information about each event type:
gcloud beta eventarc attributes types describe google.cloud.pubsub.topic.v1.messagePublished

attributes: type
description: Cloud Pub/Sub message published
name: google.cloud.pubsub.topic.v1.messagePublished
5. Create a Cloud Run for Anthos sink
As an event sink, you can deploy Cloud Run's Hello container (which logs the contents of incoming CloudEvents) to Cloud Run for Anthos.
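The Hello container's source isn't shown in this codelab, but the idea of a CloudEvents-logging sink can be sketched as a small HTTP service (a hypothetical stand-in, not the real gcr.io/cloudrun/hello code): Eventarc delivers each event as an HTTP POST whose ce-* headers carry the CloudEvent attributes.

```python
# Minimal sketch of an event sink, assuming CloudEvents "binary" HTTP
# delivery (attributes in ce-* headers, payload in the request body).
from http.server import BaseHTTPRequestHandler, HTTPServer
import os

def summarize_event(headers, body):
    """Build a log line from CloudEvent HTTP headers and the request body."""
    h = {k.lower(): v for k, v in headers.items()}  # header names are case-insensitive
    event_type = h.get("ce-type", "unknown")
    source = h.get("ce-source", "unknown")
    return f"Received event type={event_type} source={source} body={body.decode('utf-8', 'replace')}"

class EventSink(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(summarize_event(self.headers, body))
        self.send_response(200)  # a 2xx response acknowledges the event
        self.end_headers()

def serve():
    # Cloud Run injects the port to listen on via $PORT.
    HTTPServer(("", int(os.environ.get("PORT", 8080))), EventSink).serve_forever()
```

A sink like this only needs to answer with a 2xx status; anything else signals delivery failure and can cause redelivery.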
Deploy to Cloud Run for Anthos
Deploy your containerized application to Cloud Run:
SERVICE_NAME=hello
gcloud run deploy $SERVICE_NAME \
  --image gcr.io/cloudrun/hello
6. Create a Pub/Sub trigger for Cloud Run for Anthos
One way of receiving events is through Cloud Pub/Sub. Any application can publish messages to Pub/Sub and these messages can be delivered to Cloud Run sinks via Eventarc.
Create a trigger
Before creating the trigger, get more details on the parameters you'll need to construct a trigger for events from Cloud Pub/Sub:
gcloud beta eventarc attributes types describe google.cloud.pubsub.topic.v1.messagePublished
Create a trigger to filter events published to the Pub/Sub topic to our deployed Cloud Run service:
TRIGGER_NAME=crfa-trigger-pubsub
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-gke-cluster=$CLUSTER_NAME \
  --destination-gke-location=$CLUSTER_LOCATION \
  --destination-gke-namespace=default \
  --destination-gke-service=$SERVICE_NAME \
  --destination-gke-path=/ \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --service-account=$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com
Find the topic
The Pub/Sub trigger creates a Pub/Sub topic under the covers. Find it and assign it to a variable:
TOPIC_ID=$(gcloud eventarc triggers describe $TRIGGER_NAME --format='value(transport.pubsub.topic)')
Test the trigger
You can check that the trigger is created by listing all triggers:
gcloud eventarc triggers list
To simulate a custom application sending a message, you can use gcloud to fire an event:
gcloud pubsub topics publish $TOPIC_ID --message="Hello World"
The Cloud Run sink we created logs the body of the incoming message. You can view this in the Logs section of your Cloud Run instance:
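For Pub/Sub-backed triggers, the delivered event body wraps the Pub/Sub message, and the published text arrives base64-encoded under message.data. A minimal sketch of how a sink could recover the original text (the sample envelope below is illustrative, with field names following the Pub/Sub push format; the exact log line your service prints may differ):

```python
import base64
import json

def extract_pubsub_text(body: bytes) -> str:
    """Pull the original message text out of a Pub/Sub-backed event body."""
    envelope = json.loads(body)
    # The publish payload arrives base64-encoded under message.data.
    return base64.b64decode(envelope["message"]["data"]).decode("utf-8")

# Simulate the body produced by: gcloud pubsub topics publish $TOPIC_ID --message="Hello World"
sample = json.dumps({
    "message": {
        "data": base64.b64encode(b"Hello World").decode("ascii"),
        "messageId": "1234567890",
    },
    "subscription": "projects/PROJECT_ID/subscriptions/eventarc-sub",
}).encode("utf-8")

print(extract_pubsub_text(sample))  # Hello World
```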
Delete the trigger
Optionally, you can delete the trigger once done testing.
gcloud eventarc triggers delete $TRIGGER_NAME
Bring your own Pub/Sub topic
By default, when you create a Pub/Sub trigger, Eventarc creates a Pub/Sub topic under the covers for you to use as a transport topic between your application and a Cloud Run service. This is useful to easily and quickly create a Pub/Sub backed trigger but it is also limiting. For example, you cannot set up a fanout from a single Pub/Sub topic to multiple Cloud Run services.
There is a way to create triggers from an existing Pub/Sub topic: Eventarc allows you to specify an existing Pub/Sub topic in the same project with the --transport-topic gcloud flag.
To see how this works, create a Pub/Sub topic to use as transport topic:
TOPIC_ID=my-topic
gcloud pubsub topics create $TOPIC_ID
Create a trigger:
TRIGGER_NAME=crfa-trigger-pubsub-existing
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-gke-cluster=$CLUSTER_NAME \
  --destination-gke-location=$CLUSTER_LOCATION \
  --destination-gke-namespace=default \
  --destination-gke-service=$SERVICE_NAME \
  --destination-gke-path=/ \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --service-account=$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com \
  --transport-topic=projects/$PROJECT_ID/topics/$TOPIC_ID
You can test the trigger by sending a message to the topic:
gcloud pubsub topics publish $TOPIC_ID --message="Hello again"
7. Create an Audit Logs trigger for Cloud Run for Anthos
You can also create a trigger to listen for events from Cloud Storage via Cloud Audit Logs.
Create a bucket
First, create a Cloud Storage bucket in the same region as the deployed Cloud Run service:
BUCKET_NAME=eventarc-auditlog-$PROJECT_ID
gsutil mb -l $CLUSTER_LOCATION gs://$BUCKET_NAME
Enable Cloud Audit Logs
In order to receive events from a service, you need to enable Cloud Audit Logs. From the Cloud Console, select IAM & Admin and then Audit Logs from the upper left-hand menu. In the list of services, check Google Cloud Storage:
On the right-hand side, make sure Admin Read, Data Read, and Data Write are selected. Click Save:
Test Cloud Audit Logs
To identify the parameters you'll need to set up the trigger, perform an actual operation.
For example, create a random text file and upload it to the bucket:
echo "Hello World" > random.txt
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
Now, let's see what kind of audit log this update generated. From the Cloud Console, select Logging and then Logs Viewer from the upper left-hand menu. Under Query Builder, choose GCS Bucket, then choose your bucket and its location. Click Add.
Once you run the query, you'll see logs for the storage bucket, and one of those should be storage.objects.create:
Note the serviceName, methodName and resourceName. We'll use these in creating the trigger.
Create a trigger
You are now ready to create an event trigger for Cloud Audit Logs.
You can get more details on the parameters you'll need to construct the trigger:
gcloud beta eventarc attributes types describe google.cloud.audit.log.v1.written
Create the trigger with the right filters and the service account:
TRIGGER_NAME=crfa-trigger-auditlog
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-gke-cluster=$CLUSTER_NAME \
  --destination-gke-location=$CLUSTER_LOCATION \
  --destination-gke-namespace=default \
  --destination-gke-service=$SERVICE_NAME \
  --destination-gke-path=/ \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --service-account=$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com
Test the trigger
List all triggers to confirm that the trigger was successfully created:
gcloud eventarc triggers list
Wait up to 10 minutes for the trigger creation to propagate and for it to begin filtering events. Once ready, it will filter create events and send them to the service.
You're now ready to fire an event.
Upload the same file to the Cloud Storage bucket as you did earlier:
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
If you check the logs of the Cloud Run service in Cloud Console, you should see the received event:
Delete the trigger
Optionally, you can delete the trigger once done testing:
gcloud eventarc triggers delete $TRIGGER_NAME
8. Create a Kubernetes service sink
You can also use a Kubernetes service as an event sink. Let's deploy Cloud Run's Hello container as a Kubernetes service and create an Eventarc Pub/Sub trigger for it.
Deploy to Kubernetes
Create a Kubernetes deployment:
SERVICE_NAME=hello-gke
kubectl create deployment $SERVICE_NAME \
  --image=gcr.io/cloudrun/hello
Expose it as a Kubernetes service:
kubectl expose deployment $SERVICE_NAME \
  --type LoadBalancer --port 80 --target-port 8080
9. Create a Pub/Sub trigger for a Kubernetes service
Create a trigger
Create a trigger to filter events published to the Pub/Sub topic to our deployed Kubernetes service:
TRIGGER_NAME=gke-trigger-pubsub
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-gke-cluster=$CLUSTER_NAME \
  --destination-gke-location=$CLUSTER_LOCATION \
  --destination-gke-namespace=default \
  --destination-gke-service=$SERVICE_NAME \
  --destination-gke-path=/ \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --service-account=$TRIGGER_GSA@$PROJECT_ID.iam.gserviceaccount.com
Find the topic
The Pub/Sub trigger creates a Pub/Sub topic under the covers. Find it and assign it to a variable:
TOPIC_ID=$(gcloud eventarc triggers describe $TRIGGER_NAME --format='value(transport.pubsub.topic)')
Test the trigger
You can check that the trigger is created by listing all triggers:
gcloud eventarc triggers list
To simulate a custom application sending a message, you can use gcloud to fire an event:
gcloud pubsub topics publish $TOPIC_ID --message="Hello World"
The Kubernetes pod logs the body of the incoming message. You can view this by first finding the pod id:
kubectl get pods

NAME                         READY   STATUS    RESTARTS   AGE
hello-gke-645964f578-gfs5m   1/1     Running   0          12m
Then, checking the logs of the pod:
kubectl logs hello-gke-645964f578-gfs5m

2021/10/11 13:32:15 Hello from Cloud Run! The container started successfully and is listening for HTTP requests on $PORT
{"severity":"INFO","eventType":"google.cloud.pubsub.topic.v1.messagePublished","message":"Received event of type google.cloud.pubsub.topic.v1.messagePublished. Event data: Hello World",...
Delete the trigger
Optionally, you can delete the trigger once done testing.
gcloud eventarc triggers delete $TRIGGER_NAME
10. Congratulations!
Congratulations on completing the codelab!
What we've covered
- Create a Cloud Run for Anthos sink
- Create a Pub/Sub trigger for Cloud Run for Anthos
- Create an Audit Logs trigger for Cloud Run for Anthos
- Create a Kubernetes service sink
- Create a Pub/Sub trigger for a Kubernetes service