1. Overview
In this lab, you will create an Eventarc trigger that connects a Pub/Sub topic to the Workflows service. Eventarc allows you to decouple service-to-service communication, making your solution more extensible and event-driven. You will create a workflow with multiple steps that executes a business process to calculate customer reward points for ordering at Cymbal Eats. The workflow will call a private Cloud Run service API to execute business logic. The Cloud Run service is configured to only allow internal traffic, and it requires authentication. The workflow will publish a message to a Pub/Sub topic to notify the Order Service about the calculated reward points.
What is Eventarc?
Eventarc allows you to build event-driven architectures without having to implement, customize, or maintain the underlying infrastructure. Eventarc offers a standardized solution to manage the flow of state changes, called events, between decoupled microservices. When triggered, Eventarc routes these events through Pub/Sub subscriptions to various destinations (for example, Workflows or Cloud Run) while managing delivery, security, authorization, observability, and error handling for you.
Google Event providers
- More than 90 Google Cloud providers. These providers send events either directly from the source (Cloud Storage, for example) or through Cloud Audit Logs entries.
- Pub/Sub providers. These providers send events to Eventarc using Pub/Sub messages.
Third-party providers
Third-party providers are non-Google entities that offer an Eventarc source.
Eventarc triggers
- Cloud Pub/Sub events. Eventarc can be triggered by messages published to Pub/Sub topics.
- Cloud Audit Logs (CAL) events. Cloud Audit Logs provide Admin Activity and Data Access audit logs for each Cloud project, folder, and organization.
- Direct events. Eventarc can be triggered by various direct events such as an update to a Cloud Storage bucket or an update to a Firebase Remote Config template.
Event destinations
- Workflows
- Cloud Run
- GKE
- Cloud Functions (2nd gen)
What is Workflows?
Workflows is a fully managed service that lets you integrate microservices, tasks, and APIs. Workflows is a serverless service and will scale to meet your demand.
Workflows use cases:
- Event-driven workflows execute on defined triggers. For example, when a new order is submitted, you may want to calculate customer loyalty points. Or when an order is canceled, the event can be published and all interested services will process it.
- Batch job workflows run jobs on a regular basis using Cloud Scheduler. For example, a nightly job that checks for menu items in failed status and deletes them.
Workflows is ideal for workflows that orchestrate services. You can automate processes that include waiting and retries for up to one year.
Workflows benefits:
- Configuration over code: Reduce technical debt by moving the logic to configuration rather than writing code.
- Simplify your architecture: Stateful Workflows allow you to visualize and monitor complex service integrations without additional dependencies.
- Incorporate reliability and fault tolerance: Control failures with default or custom retry logic and error handling, even when other systems fail. Every step is checkpointed to Cloud Spanner to help you keep track of progress.
- Zero maintenance and scale as needed: There's nothing to patch or maintain. Pay only when your workflows run, with no cost while waiting or inactive.
In this lab, you will configure an event-driven workflow.
What you will learn
In this lab, you will learn how to do the following:
- Configure a Pub/Sub topic and an Eventarc trigger to invoke Workflows
- Configure a workflow to call a Cloud Run service and publish messages to Pub/Sub
- Query Workflows structured logs in Cloud Logging and with the gcloud CLI
Prerequisites
- This lab assumes familiarity with the Cloud Console and Cloud Shell environments.
- Prior Cloud Run and Cloud Pub/Sub experience is helpful but not required.
2. Setup and Requirements
Cloud Project setup
- Sign in to the Google Cloud Console and create a new project or reuse an existing one. If you don't already have a Gmail or Google Workspace account, you must create one.
- The Project name is the display name for this project's participants. It is a character string not used by Google APIs. You can update it at any time.
- The Project ID is unique across all Google Cloud projects and is immutable (cannot be changed after it has been set). The Cloud Console auto-generates a unique string; usually you don't care what it is. In most codelabs, you'll need to reference the Project ID (it is typically identified as PROJECT_ID). If you don't like the generated ID, you may generate another random one. Alternatively, you can try your own and see if it's available. It cannot be changed after this step and will remain for the duration of the project.
- For your information, there is a third value, a Project Number, which some APIs use. Learn more about all three of these values in the documentation.
- Next, you'll need to enable billing in the Cloud Console to use Cloud resources/APIs. Running through this codelab shouldn't cost much, if anything at all. To shut down resources so you don't incur billing beyond this tutorial, you can delete the resources you created or delete the whole project. New users of Google Cloud are eligible for the $300 USD Free Trial program.
Environment Setup
Activate Cloud Shell by clicking on the icon to the right of the search bar.
Clone the repository and navigate to the directory: copy and paste the command below into the terminal and press Enter:
git clone https://github.com/GoogleCloudPlatform/cymbal-eats.git && cd cymbal-eats/customer-service
Deploy the required dependencies by running the lab-setup.sh script.
The following resources will be created:
- AlloyDB cluster and instance
- Artifact Registry to store container images for Cloud Run Job and Customer Service
- VPC Access connector for Cloud Run Service and Job to communicate with AlloyDB database
- Cloud Run Job to create AlloyDB database
- Cloud Run Customer Service, a Java-based microservice that uses the Quarkus framework
./lab-setup.sh
If prompted to authorize, click "Authorize" to continue.
The setup will take about 10 minutes.
Wait until the script is done and you see the output below before running other steps.
Deploying container to Cloud Run service [customer-service] in project [cymbal-eats-19227-5681] region [us-east1]
OK Deploying new service... Done.
  OK Creating Revision...
  OK Routing traffic...
Done.
Service [customer-service] revision [customer-service-00001-mid] has been deployed and is serving 100 percent of traffic.
Service URL: https://customer-service-e4p5zon5rq-ue.a.run.app
While the script runs, you can explore the next section and then come back to resume from the next step.
3. Review Workflow
Workflows Core Concepts
A workflow consists of a series of steps described using the Workflows syntax (YAML or JSON).
After a workflow is created, it is deployed, which makes the workflow ready for execution.
An execution is a single run of the logic contained in a workflow's definition. A workflow that hasn't been executed generates no charges. All workflow executions are independent, and the product's rapid scaling allows for a high number of concurrent executions.
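To make the syntax concrete, here is a minimal sketch of a workflow definition in YAML. The step names and the greeting value are illustrative assumptions, not part of the lab's rewards workflow:
main:
  steps:
    - init:
        # Assign a variable that later steps can reference
        assign:
          - greeting: "Hello from Workflows"
    - log_greeting:
        # Write the value to Cloud Logging using the sys.log standard library call
        call: sys.log
        args:
          text: ${greeting}
          severity: INFO
    - finish:
        # Return the value as the execution result
        return: ${greeting}
Deploying this definition and executing it once would run the three steps in order and return the greeting as the execution result.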
Execution controls
- Steps - To create a workflow, you define the desired steps and order of execution using the Workflows syntax. Every workflow must have at least one step.
- Conditions - You can use a switch block as a selection mechanism that allows the value of an expression to control the flow of a workflow's execution.
- Iterations - You can use a for loop to iterate over a sequence of numbers or through a collection of data, such as a list or map.
- Subworkflows - A subworkflow works similarly to a routine or function in a programming language, allowing you to encapsulate a step or set of steps that your workflow will repeat multiple times (see the sketch after this list).
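The sketch below pulls these controls together in the style of this lab's rewards calculation. The field names, values, and the add_bonus subworkflow are illustrative assumptions, not the actual rewardsWorkflow.yaml:
main:
  params: [input]
  steps:
    - pick_multiplier:
        # Condition: a switch block selects a multiplier based on the order total
        switch:
          - condition: ${input.total >= 25}
            steps:
              - set_high:
                  assign:
                    - multiplier: 5
          - condition: ${input.total < 25}
            steps:
              - set_low:
                  assign:
                    - multiplier: 2
    - init_points:
        assign:
          - totalPoints: 0
    - add_item_points:
        # Iteration: a for loop accumulates points over a list of per-item values
        for:
          value: itemPoints
          in: ${input.pointsPerItem}
          steps:
            - accumulate:
                assign:
                  - totalPoints: ${totalPoints + itemPoints * multiplier}
    - apply_bonus:
        # Subworkflow: called like a function, with named arguments and a result variable
        call: add_bonus
        args:
          points: ${totalPoints}
        result: finalPoints
    - finish:
        return: ${finalPoints}
add_bonus:
  params: [points]
  steps:
    - compute:
        return: ${points + 10}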
Triggering executions
- Manual - You can manage workflows from either the Google Cloud console or from the command line using the Google Cloud CLI.
- Programmatic - The Cloud Client Libraries for the Workflows API, or the REST API, can be used to manage workflows.
- Scheduled - You can use Cloud Scheduler to run a workflow on a particular schedule.
Runtime Arguments
Data passed at runtime can be accessed by adding a params field to your main workflow (placed in a main block). The main block accepts a single argument that is any valid JSON data type. The params field names the variable that the workflow uses to store the data you pass in, as shown in the sketch below.
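As a sketch of how a workflow might receive its input: the Eventarc Pub/Sub trigger passes the event as the runtime argument, and the Pub/Sub message data arrives base64-encoded under event.data.message.data. The step names below are illustrative and not taken from rewardsWorkflow.yaml:
main:
  params: [event]
  steps:
    - decode_order:
        assign:
          # Decode the base64-encoded Pub/Sub message body into a JSON map
          - order: ${json.decode(base64.decode(event.data.message.data))}
    - log_order:
        # Log the decoded order as structured JSON
        call: sys.log
        args:
          json: ${order}
          severity: INFO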
Workflow to Cloud Run Service Authentication
The Customer Service is a sample application included in this repo. It runs on Cloud Run and is configured to allow only authenticated requests coming from internal networks. You will configure Workflows to add a Google-signed OpenID Connect (OIDC) token to the request to authenticate with the Cloud Run service.
Review the documentation to learn more about service-to-service authentication.
The workflow authentication is configured under the args block using the auth section.
rewardsWorkflow.yaml.tmpl
- create_customer:
    call: http.post
    args:
      url: CUSTOMER_SERVICE_URL/customer
      auth:
        type: OIDC
Workflow Logic
If a customer does not exist, this workflow will make an API call to create the customer first and then update the reward points. Based on the order total amount, the workflow selects a multiplier to calculate reward points for the customer. See the sample below for details.
- calculate_multiplier:
    switch:
      - condition: ${totalAmount < 10}
        steps:
          - set_multiplier1:
              assign:
                - multiplier: 2
      - condition: ${totalAmount >= 10 and totalAmount < 25}
        steps:
          - set_multiplier2:
              assign:
                - multiplier: 3
      - condition: ${totalAmount >= 25}
        steps:
          - set_multiplier3:
              assign:
                - multiplier: 5
- calculate_rewards:
    assign:
      - rewardPoints: ${customerRecord.rewardPoints * multiplier}
4. Configure and deploy Workflow
Set up environment variables:
export REGION=us-east1
export CUSTOMER_SERVICE_URL=$(gcloud run services describe customer-service \
--platform managed \
--region $REGION \
--format=json | jq \
--raw-output ".status.url")
echo $CUSTOMER_SERVICE_URL
Replace the service URL in the workflow template:
sed "s@CUSTOMER_SERVICE_URL@$CUSTOMER_SERVICE_URL@g" rewardsWorkflow.yaml.tmpl > rewardsWorkflow.yaml
Set the location for the Workflows service and the project environment variables:
gcloud config set workflows/location ${REGION}
export PROJECT_ID=$(gcloud config get-value project)
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')
export PROJECT_NAME=$(gcloud projects describe $PROJECT_ID --format='value(name)')
Create a custom service account for the workflow with the following permissions:
- Invoke the Cloud Run service
- Call the Logging APIs
- Publish messages to a Pub/Sub topic
export WORKFLOW_SERVICE_ACCOUNT=workflows-cloudrun-sa
gcloud iam service-accounts create ${WORKFLOW_SERVICE_ACCOUNT}
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/run.invoker"
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/logging.logWriter"
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member "serviceAccount:${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com" \
--role "roles/pubsub.publisher"
Deploy the workflow. It is configured to use the service account created in the previous step:
export WORKFLOW_NAME=rewardsWorkflow
gcloud workflows deploy ${WORKFLOW_NAME} \
--source=rewardsWorkflow.yaml \
--service-account=${WORKFLOW_SERVICE_ACCOUNT}@$PROJECT_ID.iam.gserviceaccount.com
Review the workflow source and other details (Triggers tab). Right now there are no triggers configured to execute this workflow. You will set one up in the next step.
5. Configure Pub/Sub topics and Eventarc trigger
Next, you will create two Pub/Sub topics and configure one Eventarc trigger.
The Order Service will publish messages to order-topic with information about new orders.
The workflow will publish messages to order-points-topic with information about order reward points and the total amount. The Order Service (not deployed as part of this lab) exposes an endpoint that is used by a push subscription for order-points-topic to update the reward points and total amount per order.
Create new Pub/Sub topics:
export TOPIC_ID=order-topic
export ORDER_POINTS_TOPIC_ID=order-points-topic
gcloud pubsub topics create $TOPIC_ID --project=$PROJECT_ID
gcloud pubsub topics create $ORDER_POINTS_TOPIC_ID --project=$PROJECT_ID
Set the location for the Eventarc service:
gcloud config set eventarc/location ${REGION}
Create a custom service account that will be used by the Eventarc trigger to execute workflows.
export TRIGGER_SERVICE_ACCOUNT=eventarc-workflow-sa
gcloud iam service-accounts create ${TRIGGER_SERVICE_ACCOUNT}
Grant the service account permission to execute workflows.
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
--member="serviceAccount:${TRIGGER_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--role="roles/workflows.invoker"
Create an Eventarc trigger to listen for Pub/Sub messages and deliver them to Workflows.
gcloud eventarc triggers create new-orders-trigger \
--destination-workflow=${WORKFLOW_NAME} \
--destination-workflow-location=${REGION} \
--event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
--service-account="${TRIGGER_SERVICE_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--transport-topic=$TOPIC_ID
Sample output:
Creating trigger [new-orders-trigger] in project [qwiklabs-gcp-01-1a990bfcadb3], location [us-east1]...done.
Publish to Pub/Sub topic [projects/qwiklabs-gcp-01-1a990bfcadb3/topics/order-topic] to receive events in Workflow [rewardsWorkflow].
WARNING: It may take up to 2 minutes for the new trigger to become active.
Review the created Eventarc trigger.
Review the subscription created for the trigger.
Review the changes on the workflow side: a new trigger was added.
6. Test workflow
To simulate the Order Service, you will send messages to the Pub/Sub topic from Cloud Shell and verify the Cloud Run Customer Service logs in the Cloud console.
export TOPIC_ID=order-topic
gcloud pubsub topics publish $TOPIC_ID --message '{"userId":"id1","orderNumber":123456,"name":"Angela Jensen","email":"ajensen9090+eats@gmail.com","address":"1845 Denise St","city":"Mountain View","state":"CA","zip":"94043","orderItems":[{"id":7,"createDateTime":"2022-03-17T21:51:44.968584","itemImageURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","itemName":"Curry Plate","itemPrice":12.5,"itemThumbnailURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","spiceLevel":0,"status":"Ready","tagLine":"Spicy touch for your taste buds","updateDateTime":"2022-03-18T01:30:29.340584","inventory":8,"quantity":1}]}'
Sample output:
messageIds:
- '5063709859203105'
Review the workflow execution details and logs.
7. Workflow Structured Logging
The workflow is configured to write structured logs in JSON format. The logs are written using the Cloud Logging API, under the workflows.googleapis.com/Workflow resource and the log name projects/${PROJECT_ID}/logs/Workflows.
Review the logging configuration below.
- log_totalAmount:
    call: sys.log
    args:
      json:
        orderNumber: ${order.orderNumber}
        totalAmount: ${totalAmount}
        multiplier: ${multiplier}
        totalRewardPoints: ${rewardPoints}
        orderRewardPoints: ${orderRewardPoints}
      severity: INFO
Open Logs Explorer in the Console and run a query for processed orders with a total amount of more than $2.
Use the query below, replacing the project ID (qwiklabs-gcp-01-1a990bfcadb3) with your current project ID:
resource.type="workflows.googleapis.com/Workflow" AND
logName=projects/qwiklabs-gcp-01-1a990bfcadb3/logs/Workflows AND
jsonPayload.totalAmount > 2 AND
timestamp >= "2022-11-01T23:59:59Z" AND
timestamp <= "2023-11-05T00:00:00Z"
Sample output:
Open Cloud Shell and use the gcloud CLI to read logs with the commands below.
Replace the project ID (qwiklabs-gcp-01-1a990bfcadb3) with your current project ID.
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND logName=projects/qwiklabs-gcp-01-1a990bfcadb3/logs/Workflows AND jsonPayload.totalAmount > 2 AND timestamp >= "2022-11-01T23:59:59Z" AND timestamp <= "2023-11-05T00:00:00Z"' --limit 10 --format="table(jsonPayload.orderNumber,jsonPayload.totalAmount,jsonPayload.orderRewardPoints,jsonPayload.totalRewardPoints,jsonPayload.multiplier)"
Sample output using table format:
Replace the project ID (qwiklabs-gcp-01-1a990bfcadb3) with your current project ID.
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND logName=projects/qwiklabs-gcp-01-1a990bfcadb3/logs/Workflows AND jsonPayload.totalAmount > 2 AND timestamp >= "2022-11-01T23:59:59Z" AND timestamp <= "2023-11-05T00:00:00Z"' --limit 10 --format=json | jq
Sample output using json format:
8. Review Customer Records
(Optional steps) Right now, customer-service is configured to accept traffic from internal networks only.
Run the commands below to save the service URL and call customer-service.
export REGION=us-east1
CUSTOMER_SERVICE_URL=$(gcloud run services describe customer-service \
--region=$REGION \
--format=json | jq \
--raw-output ".status.url")
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" $CUSTOMER_SERVICE_URL/customer
You will get an error message that access is forbidden.
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>403 Forbidden</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Forbidden</h1>
<h2>Access is forbidden.</h2>
<h2></h2>
</body></html>
To view existing customer records, change the Cloud Run customer-service ingress setting to the "Allow all traffic" option and click "Save".
This will make the endpoint public so that you can call the Customer Service API from Cloud Shell using curl.
Run the commands below to save the service URL and list existing customers.
CUSTOMER_SERVICE_URL=$(gcloud run services describe customer-service \
--region=$REGION \
--format=json | jq \
--raw-output ".status.url")
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" $CUSTOMER_SERVICE_URL/customer | jq
Sample output:
[ { "id": "id1", "rewardPoints": 3, "address": "1845 Denise St", "city": "Mountain View", "createDateTime": "2022-11-11T15:56:45.487566", "email": "ajensen9090+eats@gmail.com", "name": "Angela Jensen", "state": "CA", "updateDateTime": "2022-11-11T15:56:45.866125", "zip": "94043" } ]
Run the command below multiple times to publish new orders, and verify the customer reward points with the curl command.
Publish a new order message:
export TOPIC_ID=order-topic
gcloud pubsub topics publish $TOPIC_ID --message '{"userId":"id1","orderNumber":123456,"name":"Angela Jensen","email":"ajensen9090+eats@gmail.com","address":"1845 Denise St","city":"Mountain View","state":"CA","zip":"94043","orderItems":[{"id":7,"createDateTime":"2022-03-17T21:51:44.968584","itemImageURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","itemName":"Curry Plate","itemPrice":12.5,"itemThumbnailURL":"https://images.unsplash.com/photo-1618449840665-9ed506d73a34?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=687&q=80","spiceLevel":0,"status":"Ready","tagLine":"Spicy touch for your taste buds","updateDateTime":"2022-03-18T01:30:29.340584","inventory":8,"quantity":1}]}'
Verify customer reward points:
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" $CUSTOMER_SERVICE_URL/customer | jq
Verify the logs. Replace the project ID (qwiklabs-gcp-01-1a990bfcadb3) with your current project ID.
gcloud logging read 'resource.type="workflows.googleapis.com/Workflow" AND logName=projects/qwiklabs-gcp-01-1a990bfcadb3/logs/Workflows AND jsonPayload.totalAmount > 2 AND timestamp >= "2022-11-01T23:59:59Z" AND timestamp <= "2023-11-05T00:00:00Z"' --limit 10 --format="table(jsonPayload.orderNumber,jsonPayload.totalAmount,jsonPayload.orderRewardPoints,jsonPayload.totalRewardPoints,jsonPayload.multiplier)"
9. Congratulations!
Congratulations, you finished the codelab!
What we've covered:
- How to configure Workflows
- How to configure Eventarc trigger for Workflows
- How to call Cloud Run service from Workflows
- How to query structured logs in Cloud Logging and with the gcloud CLI
What's next:
Explore other Cymbal Eats codelabs:
- Triggering Event Processing from Cloud Storage
- Connecting to Private CloudSQL from Cloud Run
- Connecting to Fully Managed Databases from Cloud Run
- Secure Serverless Application with Identity Aware Proxy (IAP)
- Triggering Cloud Run Jobs with Cloud Scheduler
- Securely Deploying to Cloud Run
- Securing Cloud Run Ingress Traffic
- Connecting to private AlloyDB from GKE Autopilot
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.
Deleting the project
The easiest way to eliminate billing is to delete the project that you created for the tutorial.