About this codelab
1. Overview
In this lab, you will create an Eventarc trigger that connects a Pub/Sub topic to the Workflows service. Eventarc allows you to decouple service-to-service communication, making your solution more extensible and event-driven. You will create a workflow with multiple steps that executes a business process to calculate customer reward points for ordering at Cymbal Eats. The workflow will send multiple requests to an application running on GKE Autopilot and publish a message to a Pub/Sub topic to notify the Order Service application about the calculated reward points.
What is GKE Autopilot?
GKE Autopilot is a mode of operation in GKE in which Google manages your cluster configuration, including your nodes, scaling, security, and other preconfigured settings. Autopilot clusters are optimized to run most production workloads, and provision compute resources based on your Kubernetes manifests. The streamlined configuration follows GKE best practices and recommendations for cluster and workload setup, scalability, and security. For a list of built-in settings, refer to the Autopilot and Standard comparison table.
With GKE Standard, users are responsible for managing worker nodes and node pool configuration, while the rest is taken care of by GKE.
Customer's vs Google's responsibilities when running in GKE Standard mode
With GKE Autopilot, node pool configuration and management is Google's responsibility. This allows you to focus on applications and services that run on top of the cluster.
What is Eventarc?
Eventarc allows you to build event-driven architectures without having to implement, customize, or maintain the underlying infrastructure. Eventarc offers a standardized solution to manage the flow of state changes, called events, between decoupled microservices. When triggered, Eventarc routes these events through Pub/Sub subscriptions to various destinations (for example, Workflows or Cloud Run) while managing delivery, security, authorization, observability, and error handling for you.
Google Event providers
- More than 90 Google Cloud providers. These providers send events either directly from the source (Cloud Storage, for example) or through Cloud Audit Logs entries.
- Pub/Sub providers. These providers send events to Eventarc using Pub/Sub messages.
Third-party providers
Third-party providers are non-Google entities that offer an Eventarc source.
Eventarc triggers
- Cloud Pub/Sub events. Eventarc can be triggered by messages published to Pub/Sub topics.
- Cloud Audit Logs (CAL) events. Cloud Audit Logs provide Admin Activity and Data Access audit logs for each Cloud project, folder, and organization.
- Direct events. Eventarc can be triggered by various direct events, such as an update to a Cloud Storage bucket or an update to a Firebase Remote Config template (see the hedged CLI sketch after this list).
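For illustration, a direct-event trigger targeting Cloud Run might look like the sketch below. This is a hypothetical example, not part of this lab: my-service, my-bucket, my-project, and trigger-sa are assumed names, and the service account would also need the appropriate event-receiving roles.

# Hypothetical sketch - none of these resources are created in this lab
gcloud eventarc triggers create storage-trigger \
  --location=us-central1 \
  --destination-run-service=my-service \
  --event-filters="type=google.cloud.storage.object.v1.finalized" \
  --event-filters="bucket=my-bucket" \
  --service-account="trigger-sa@my-project.iam.gserviceaccount.com"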
Event destinations
- Workflows
- Cloud Run
- GKE
- Cloud Functions (2nd gen)
What is Workflows?
Workflows is a fully managed service that lets you integrate microservices, tasks, and APIs. Workflows is a serverless service and will scale to meet your demand.
Workflows use cases:
- Event-driven workflows execute on defined triggers. For example, when a new order is submitted, you may want to calculate customer loyalty points; or when an order is canceled, the event can be published and all interested services will process it.
- Batch job workflows run jobs on a regular basis using Cloud Scheduler. For example, a nightly job to check for menu items in failed status and delete them.
Workflows is ideal for workflows that orchestrate services. You can automate processes that include waiting and retries for up to one year.
Workflows benefits:
- Configuration over code: Reduce technical debt by moving the logic to configuration rather than writing code.
- Simplify your architecture. Stateful Workflows allow you to visualize and monitor complex service integrations without additional dependencies.
- Incorporate reliability and fault tolerance. Control failures with default or custom retry logic and error handling, even when other systems fail. Every step is checkpointed to Cloud Spanner to help you keep track of progress.
- Zero maintenance. Scale as needed: There's nothing to patch or maintain. Pay only when your workflows run, with no cost while waiting or inactive.
In this lab, you will configure an event-driven workflow.
What you will learn
In this lab, you will learn how to do the following:
- Configure Pub/Sub topic and Eventarc to trigger Workflows
- Configure Workflow to make API calls to application running on GKE Autopilot
- Configure Workflow to publish messages to Pub/Sub
- Query Workflows structured logs in Cloud Logging and with the gcloud CLI
Prerequisites
- This lab assumes familiarity with the Cloud Console and Cloud Shell environments.
- Prior GKE and Cloud Pub/Sub experience is helpful but not required.
2. Setup and Requirements
Cloud Project setup
- Sign in to the Google Cloud Console and create a new project or reuse an existing one. If you don't already have a Gmail or Google Workspace account, you must create one.
- The Project name is the display name for this project's participants. It is a character string not used by Google APIs. You can update it at any time.
- The Project ID is unique across all Google Cloud projects and is immutable (cannot be changed after it has been set). The Cloud Console auto-generates a unique string; usually you don't care what it is. In most codelabs, you'll need to reference the Project ID (it is typically identified as PROJECT_ID). If you don't like the generated ID, you may generate another random one. Alternatively, you can try your own and see if it's available. It cannot be changed after this step and will remain for the duration of the project.
- For your information, there is a third value, a Project Number, which some APIs use. You can inspect all three values with the sketch after this list; learn more about them in the documentation.
- Next, you'll need to enable billing in the Cloud Console to use Cloud resources/APIs. Running through this codelab shouldn't cost much, if anything at all. To shut down resources so you don't incur billing beyond this tutorial, you can delete the resources you created or delete the whole project. New users of Google Cloud are eligible for the $300 USD Free Trial program.
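If you want to check these values from Cloud Shell at any point, a quick sketch (standard gcloud commands; output formatting may vary by gcloud version):

# Prints the active project ID
gcloud config get-value project
# Prints the project name, ID, and number in one line
gcloud projects describe $(gcloud config get-value project) \
  --format='value(name,projectId,projectNumber)'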
Environment Setup
Activate Cloud Shell by clicking on the icon to the right of the search bar.
Clone the repository and navigate to the directory. Copy and paste the command below into the terminal and hit Enter:
git clone https://github.com/GoogleCloudPlatform/cymbal-eats.git && cd cymbal-eats/customer-service
Deploy required dependencies by running the gke-lab-setup.sh script.
The following resources will be created:
- AlloyDB cluster and instance
- GKE Autopilot cluster
./gke-lab-setup.sh
If prompted to authorize, click "Authorize" to continue.
The setup will take about 10 minutes.
Wait until the script is done and you see the output below before running other steps.
NAME: client-instance
ZONE: us-central1-c
MACHINE_TYPE: e2-medium
PREEMPTIBLE:
INTERNAL_IP: 10.128.0.9
EXTERNAL_IP: 35.232.109.233
STATUS: RUNNING
3. GKE Autopilot Cluster
Review GKE Autopilot cluster
Set Project environment variables:
export PROJECT_ID=$(gcloud config get-value project)
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')
export PROJECT_NAME=$(gcloud projects describe $PROJECT_ID --format='value(name)')
As part of the initial setup, the cluster was created using the command below (you don't need to run this command):
gcloud container clusters create-auto $CLUSTER_NAME --region $REGION
Run the command to view the created GKE Autopilot cluster:
gcloud container clusters list
Sample output:
Run the command to store credentials for the cluster:
CLUSTER_NAME=rewards-cluster
REGION=us-central1
gcloud container clusters get-credentials $CLUSTER_NAME --region=$REGION
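To confirm kubectl now points at the Autopilot cluster, you can optionally run the checks below (node output may be short, since Autopilot provisions nodes on demand):

# Shows the kubectl context written by get-credentials
kubectl config current-context
# Lists nodes; Autopilot provisions them on demand, so the list may be small
kubectl get nodes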
Deploy an application
Next you will deploy the Customer Service application. This is a Java-based microservice that uses the Quarkus framework.
Navigate to the cymbal-eats/customer-service folder and run the commands below to build and upload the container image:
./mvnw clean package -DskipTests
export CUSTOMER_SERVICE_IMAGE=gcr.io/$PROJECT_ID/customer-service:1.0.0
gcloud builds submit --tag $CUSTOMER_SERVICE_IMAGE .
Set AlloyDB Private IP address:
export DB_HOST=$(gcloud beta alloydb instances describe customer-instance \
--cluster=customer-cluster \
--region=$REGION \
--format=json | jq \
--raw-output ".ipAddress")
echo $DB_HOST
Run the commands below to create a Kubernetes Secret object to store the database credentials that the Customer Service application will use to connect to the database:
DB_NAME=customers
DB_USER=postgres
DB_PASSWORD=password123
kubectl create secret generic gke-alloydb-secrets \
--from-literal=database=$DB_NAME \
--from-literal=username=$DB_USER \
--from-literal=password=$DB_PASSWORD \
--from-literal=db_host=$DB_HOST
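Optionally, verify that the Secret exists and holds the four expected keys (describe shows key names and byte counts, never the values):

kubectl get secret gke-alloydb-secrets
kubectl describe secret gke-alloydb-secrets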
Run the command to replace CUSTOMER_SERVICE_IMAGE in the deployment template and generate customer-service-deployment.yaml:
sed "s@CUSTOMER_SERVICE_IMAGE@$CUSTOMER_SERVICE_IMAGE@g" deployment.yaml.tmpl > customer-service-deployment.yaml
Run the command to deploy the application:
kubectl apply -f customer-service-deployment.yaml
It will take a few moments for the application to transition to the RUNNING state.
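One way to watch the rollout is the sketch below. The label selector is an assumption based on typical deployment templates; check customer-service-deployment.yaml for the actual labels.

# Blocks until the deployment finishes rolling out
kubectl rollout status deployment/customer-service
# Label selector is assumed; adjust it to match customer-service-deployment.yaml
kubectl get pods -l app=customer-service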
Review the deployment specification file deployment.yaml.tmpl.
Here's the part of the configuration that specifies the resources required to run this application.
spec:
  containers:
  - name: customer-service
    image: CUSTOMER_SERVICE_IMAGE
    resources:
      requests:
        cpu: 250m
        memory: 512Mi
        ephemeral-storage: 512Mi
      limits:
        cpu: 500m
        memory: 1024Mi
        ephemeral-storage: 1Gi
Run the command to expose the deployment with an external IP that will be used in the workflow:
SERVICE_NAME=customer-service
kubectl expose deployment $SERVICE_NAME \
--type LoadBalancer --port 80 --target-port 8080
Run the command to verify created resources:
kubectl get all
Sample output:
4. Review Workflow
Workflows Core Concepts
A workflow consists of a series of steps described using the Workflows syntax (YAML or JSON).
After a workflow is created, it is deployed, which makes the workflow ready for execution.
An execution is a single run of the logic contained in a workflow's definition. A workflow that hasn't been executed generates no charges. All workflow executions are independent, and the product's rapid scaling allows for a high number of concurrent executions.
Execution controls
- Steps - To create a workflow, you define the desired steps and order of execution using the Workflows syntax. Every workflow must have at least one step.
- Conditions - You can use a switch block as a selection mechanism that allows the value of an expression to control the flow of a workflow's execution.
- Iterations - You can use a for loop to iterate over a sequence of numbers or through a collection of data, such as a list or map.
- Subworkflows - A subworkflow works similarly to a routine or function in a programming language, allowing you to encapsulate a step or set of steps that your workflow will repeat multiple times (see the sketch after this list).
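To make subworkflows concrete, here is a minimal, self-contained sketch. It is illustrative only and not part of this lab's workflow; main, greet, and the step names are assumed:

# Illustrative sketch - not part of gkeRewardsWorkflow.yaml
main:
  steps:
    - call_greeting:
        call: greet            # invokes the subworkflow defined below
        args:
          name: "Cymbal Eats"
        result: message
    - done:
        return: ${message}
greet:
  params: [name]
  steps:
    - build:
        return: ${"Hello, " + name}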
Triggering executions
- Manual - You can manage workflows from either the Google Cloud console or from the command line using the Google Cloud CLI (a CLI sketch follows this list).
- Programmatic - The Cloud Client Libraries for the Workflows API, or the REST API, can be used to manage workflows.
- Scheduled - You can use Cloud Scheduler to run a workflow on a particular schedule.
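For example, a manual execution from Cloud Shell might look like the sketch below. The JSON payload is illustrative; this lab's workflow expects an Eventarc event envelope, so a manual run would need a matching structure:

# Runs the workflow and waits for the result; the payload shown is illustrative
gcloud workflows run rewardsWorkflow --data='{"orderNumber": 123}'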
Runtime Arguments
Data passed at runtime can be accessed by adding a params field to your main workflow (placed in a main block). The main block accepts a single argument that is any valid JSON data type. The params field names the variable that the workflow uses to store the data you pass in.
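For an Eventarc Pub/Sub trigger like the one in this lab, the event envelope arrives as that single runtime argument, with the Pub/Sub payload base64-encoded inside it. Below is a minimal decoding sketch; the step and variable names are illustrative, and the lab's actual definition lives in gkeRewardsWorkflow.yaml.tmpl:

# Illustrative sketch of receiving an Eventarc Pub/Sub event
main:
  params: [event]
  steps:
    - decode_pubsub_message:
        assign:
          - base64: ${base64.decode(event.data.message.data)}
          - order: ${json.decode(base64)}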
Workflow Logic
If a customer does not exist, the workflow will make an API call to create the customer first and then update reward points. Based on the order total amount, the workflow will select a multiplier to calculate reward points for the customer. See the sample below for details.
- calculate_multiplier:
    switch:
      - condition: ${totalAmount < 10}
        steps:
          - set_multiplier1:
              assign:
                - multiplier: 2
      - condition: ${totalAmount >= 10 and totalAmount < 25}
        steps:
          - set_multiplier2:
              assign:
                - multiplier: 3
      - condition: ${totalAmount >= 25}
        steps:
          - set_multiplier3:
              assign:
                - multiplier: 5
- calculate_rewards:
    assign:
      - rewardPoints: ${customerRecord.rewardPoints + multiplier}
5. Configure and deploy Workflow
Run the command to view the external IP address for the service:
kubectl get svc
Sample output:
Set the environment variable below; it reads the external IP value from the service automatically:
CUSTOMER_SERVICE_URL=http://$(kubectl get svc customer-service -o=jsonpath='{.status.loadBalancer.ingress[0].ip}')
Replace the Customer Service application URL in the workflow template:
sed "s@CUSTOMER_SERVICE_URL@$CUSTOMER_SERVICE_URL@g" gkeRewardsWorkflow.yaml.tmpl > gkeRewardsWorkflow.yaml
Set location for Workflows service and project environment variables:
gcloud config set workflows/location ${REGION}
export PROJECT_ID=$(gcloud config get-value project)
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')
export PROJECT_NAME=$(gcloud projects describe $PROJECT_ID --format='value(name)')
Create a custom service account for the workflow with the following permissions:
- Call Logging APIs
- Publish messages to a Pub/Sub topic
export WORKFLOW_SERVICE_ACCOUNT=workflows-sa
gcloud iam service-accounts create ${WORKFLOW_SERVICE_ACCOUNT}
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/logging.logWriter"
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/pubsub.publisher"
Deploy the workflow. The workflow is configured to use the service account created in the previous step:
export WORKFLOW_NAME=rewardsWorkflow
gcloud workflows deploy ${WORKFLOW_NAME} \
--source=gkeRewardsWorkflow.yaml \
--service-account=${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com
Review the workflow source and other details (Triggers tab). Right now there are no triggers configured to execute this workflow. You will set one up in the next step.
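If you prefer the CLI to the console, you can inspect the deployed workflow from Cloud Shell with a standard gcloud command (output fields may vary by gcloud version):

# Prints metadata for the deployed workflow, including its state and service account
gcloud workflows describe ${WORKFLOW_NAME}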
6. Configure Pub/Sub topics and Eventarc trigger
Next you will create two Pub/Sub topics and configure one Eventarc trigger.
The Order Service application will publish messages to order-topic with information about new orders.
The workflow will publish messages to order-points-topic with information about order reward points and total amount. The Order Service (not deployed as part of this lab) exposes an endpoint that is used by a push subscription for order-points-topic to update reward points and total amount per order.
Create new Pub/Sub topics:
export TOPIC_ID=order-topic
export ORDER_POINTS_TOPIC_ID=order-points-topic
gcloud pubsub topics create $TOPIC_ID --project=$PROJECT_ID
gcloud pubsub topics create $ORDER_POINTS_TOPIC_ID --project=$PROJECT_ID
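Optionally, confirm that both topics exist:

# You should see order-topic and order-points-topic in the output
gcloud pubsub topics list --format='value(name)'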
Set location for Eventarc service:
gcloud config set eventarc/location ${REGION}
Create a custom service account that will be used by the Eventarc trigger to execute workflows:
export TRIGGER_SERVICE_ACCOUNT=eventarc-workflow-sa
gcloud iam service-accounts create ${TRIGGER_SERVICE_ACCOUNT}
Grant the service account access to execute workflows:
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
--member="serviceAccount:${TRIGGER_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--role="roles/workflows.invoker"
Create an Eventarc trigger to listen for Pub/Sub messages and deliver them to Workflows:
gcloud eventarc triggers create new-orders-trigger \
--destination-workflow=${WORKFLOW_NAME} \
--destination-workflow-location=${REGION} \
--event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
--service-account="${TRIGGER_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--transport-topic=$TOPIC_ID
Sample output:
Creating trigger [new-orders-trigger] in project [qwiklabs-gcp-01-1a990bfcadb3], location [us-east1]...done.
Publish to Pub/Sub topic [projects/qwiklabs-gcp-01-1a990bfcadb3/topics/order-topic] to receive events in Workflow [rewardsWorkflow].
WARNING: It may take up to 2 minutes for the new trigger to become active.
Review the created Eventarc trigger.
Review the created subscription for the trigger.
Review changes on the workflow side. A new trigger was added.
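You can also inspect the trigger from Cloud Shell with a standard gcloud command (the eventarc/location property was set earlier, so no --location flag is needed):

# Shows the trigger's event filters, destination workflow, and transport topic
gcloud eventarc triggers describe new-orders-trigger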
7. Test workflow
To simulate the Order Service, you will send messages to the Pub/Sub topic from Cloud Shell and verify the Customer Service logs in the Cloud console.
export TOPIC_ID=order-topic
gcloud pubsub topics publish $TOPIC_ID --message '{"userId":"id1","orderNumber":123456,"name":"Angela Jensen","email":"ajensen9090+eats@gmail.com","address":"1845 Denise St","city":"Mountain View","state":"CA","zip":"94043","orderItems":[{"id":7,"createDateTime":"2022-03-17T21:51:44.968584","itemImageURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","itemName":"Curry Plate","itemPrice":12.5,"itemThumbnailURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","spiceLevel":0,"status":"Ready","tagLine":"Spicy touch for your taste buds","updateDateTime":"2022-03-18T01:30:29.340584","inventory":8,"quantity":1}]}'
Sample output:
messageIds:
- '5063709859203105'
Review the workflow execution details and logs.
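Besides the console, you can review executions from the CLI. EXECUTION_ID below is a placeholder; copy a real ID from the list output:

# Lists recent executions of the workflow
gcloud workflows executions list ${WORKFLOW_NAME} --limit=5
# Replace EXECUTION_ID with an ID from the list output above
gcloud workflows executions describe EXECUTION_ID --workflow=${WORKFLOW_NAME}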
8. Workflow Structured Logging
The workflow is configured to write structured logs in JSON format. The logs are written using the Cloud Logging API against the workflows.googleapis.com/Workflow resource, under the log name projects/${PROJECT_ID}/logs/Workflows.
Review the logging configuration below.
- log_totalAmount:
    call: sys.log
    args:
      json:
        orderNumber: ${order.orderNumber}
        totalAmount: ${totalAmount}
        multiplier: ${multiplier}
        totalRewardPoints: ${rewardPoints}
        orderRewardPoints: ${orderRewardPoints}
      severity: INFO
Open Logs Explorer in the Cloud Console and run a query to find processed orders with a total amount of more than $2.
To show the search query field, click "Show query".
resource.type="workflows.googleapis.com/Workflow" AND
jsonPayload.totalAmount > 2 AND
timestamp >= "2023-01-01T00:00:00Z" AND
timestamp <= "2024-12-31T23:59:59Z"
Sample output:
Open Cloud Shell and use the gcloud CLI to read logs with the commands below.
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND jsonPayload.totalAmount > 2 AND timestamp >= "2023-01-01T00:00:00Z" AND timestamp <= "2023-12-31T23:59:59Z"' --limit 10 --format="table(jsonPayload.orderNumber,jsonPayload.totalAmount,jsonPayload.orderRewardPoints,jsonPayload.totalRewardPoints,jsonPayload.multiplier)"
Sample output using table format:
Run the command below to return logs in JSON format:
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND jsonPayload.totalAmount > 2 AND timestamp >= "2023-01-01T00:00:00Z" AND timestamp <= "2023-12-31T23:59:59Z"' --limit 10 --format=json | jq
Sample output using json format:
9. Review Customer Records
(Optional steps)
Run the commands below to set the Customer Service URL environment variable and list the customer records:
CUSTOMER_SERVICE_URL=http://$(kubectl get svc customer-service -o=jsonpath='{.status.loadBalancer.ingress[0].ip}')
curl $CUSTOMER_SERVICE_URL/customer | jq
Sample output:
[
  {
    "address": "1845 Denise St",
    "city": "Mountain View",
    "createDateTime": "2023-01-31T17:22:08.853644",
    "email": "ajensen9090+eats@gmail.com",
    "id": "id1",
    "name": "Angela Jensen",
    "rewardPoints": 4,
    "state": "CA",
    "updateDateTime": "2023-01-31T17:22:09.652117",
    "zip": "94043"
  }
]
Publish a new order message multiple times and verify the customer reward points with the curl command.
Publish new order message:
export TOPIC_ID=order-topic
gcloud pubsub topics publish $TOPIC_ID --message '{"userId":"id1","orderNumber":123456,"name":"Angela Jensen","email":"ajensen9090+eats@gmail.com","address":"1845 Denise St","city":"Mountain View","state":"CA","zip":"94043","orderItems":[{"id":7,"createDateTime":"2022-03-17T21:51:44.968584","itemImageURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","itemName":"Curry Plate","itemPrice":12.5,"itemThumbnailURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","spiceLevel":0,"status":"Ready","tagLine":"Spicy touch for your taste buds","updateDateTime":"2022-03-18T01:30:29.340584","inventory":8,"quantity":1}]}'
Verify customer reward points:
curl $CUSTOMER_SERVICE_URL/customer | jq
Run the command below to check for the latest logs:
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND jsonPayload.totalAmount > 2 AND timestamp >= "2023-01-01T00:00:00Z" AND timestamp <= "2023-12-31T23:59:59Z"' --limit 10 --format="table(jsonPayload.orderNumber,jsonPayload.totalAmount,jsonPayload.orderRewardPoints,jsonPayload.totalRewardPoints,jsonPayload.multiplier)"
10. Congratulations!
Congratulations, you finished the codelab!
What we've covered:
- How to configure Pub/Sub topic and Eventarc to trigger Workflows
- How to configure Workflow to make API calls to application running on GKE Autopilot
- How to configure Workflow to publish messages to Pub/Sub
- How to query Workflows structured logs in Cloud Logging and with the gcloud CLI
What's next:
Explore other Cymbal Eats codelabs:
- Triggering Cloud Workflows with Eventarc
- Triggering Event Processing from Cloud Storage
- Connecting to Private CloudSQL from Cloud Run
- Connecting to Fully Managed Databases from Cloud Run
- Secure Serverless Application with Identity Aware Proxy (IAP)
- Triggering Cloud Run Jobs with Cloud Scheduler
- Securely Deploying to Cloud Run
- Securing Cloud Run Ingress Traffic
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.
Deleting the project
The easiest way to eliminate billing is to delete the project that you created for the tutorial.