About this codelab
1. Objectives
The purpose of this workshop is to give users and practitioners hands-on experience with Duet AI.
In this codelab, you learn the following:
- Activate Duet AI in your GCP project and configure it for use in an IDE and Cloud Console.
- Use Duet AI for code generation, completion and explanation.
- Use Duet AI to explain and troubleshoot an application issue.
- Explore Duet AI features such as IDE chat and multi-turn chat, chat versus inline code generation, and smart actions like code explanation and recitation acknowledgement.
Narrative
To show how Duet AI for Developers is used authentically in day-to-day development, this workshop's activities take place in a narrative context.
A new developer joins an ecommerce company. They are tasked with adding a new service to the existing ecommerce application (which is composed of multiple services). The new service provides additional information (dimensions, weight, etc.) about products in the product catalog. This service enables better/cheaper shipping rates based on product dimensions and weights.
As the developer is new to the company, they will use Duet AI for code generation, explanation, and documentation.
After the service is coded, a platform administrator will use Duet AI (chat) to help create the artifact (docker container) and the resources needed to deploy the artifact to GCP (for example, Artifact Registry, IAM permissions, a code repository, and compute infrastructure such as GKE or Cloud Run).
Once the application is deployed to GCP, an application operator/SRE will use Duet AI (and Cloud Ops) to help troubleshoot an error in the new service.
Persona
The workshop covers the following persona:
- Application Developer - Some knowledge of programming and software development is required.
This variation of the Duet AI workshop is for developers only. No knowledge of GCP cloud resources is required. The scripts for building the GCP resources required to run this application can be found here. You can follow the instructions in this guide to deploy the required GCP resources.
2. Preparing the environment
Activating Duet AI
You can activate Duet AI in a GCP project either via API (gcloud or IaC tools like Terraform) or via the Cloud Console UI.
To activate Duet AI in a Google Cloud project, you enable the Cloud AI Companion API and grant the Cloud AI Companion User and the Service Usage Viewer Identity and Access Management (IAM) roles to users.
Via gcloud
Activate Cloud Shell:
Configure your PROJECT_ID and USER and enable the Cloud AI Companion API.

export PROJECT_ID=<YOUR PROJECT ID>
export USER=<YOUR USERNAME> # Use your full LDAP, e.g. name@example.com
gcloud config set project ${PROJECT_ID}
gcloud services enable cloudaicompanion.googleapis.com --project ${PROJECT_ID}
The output is similar to the following:
Updated property [core/project].
Operation "operations/acat.p2-60565640195-f37dc7fe-b093-4451-9b12-934649e2a435" finished successfully.
Grant the Cloud AI Companion User and the Service Usage Viewer Identity and Access Management (IAM) roles to the USER account. The Cloud AI Companion API sits behind the features in both the IDE and console that we will be using. The Service Usage Viewer permission is used as a quick check before enabling the UI in the console (so that the Duet AI UI only appears in projects in which the API is enabled).
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member=user:${USER} --role=roles/cloudaicompanion.user

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member=user:${USER} --role=roles/serviceusage.serviceUsageViewer
The output is similar to the following:
...
- members:
  - user:<YOUR USER ACCOUNT>
  role: roles/cloudaicompanion.user
...
- members:
  - user:<YOUR USER ACCOUNT>
  role: roles/serviceusage.serviceUsageViewer
Via Cloud Console
To enable the API, go to the Cloud AI Companion API page in the Google Cloud console.
In the project selector, select a project.
Click Enable.
The page updates and shows a status of Enabled. Duet AI is now available in the selected Google Cloud project to all users who have the required IAM roles.
To grant the IAM roles that are required to use Duet AI, go to the IAM page.
In the Principal column, find the USER account for which you want to enable access to Duet AI, and then click the pencil icon ✏️ Edit principal in that row.
In the Edit access pane, click Add another role.
In Select a role, select Cloud AI Companion User.
Click Add another role and select Service Usage Viewer.
Click Save.
Setting up the IDE
Developers can choose from a variety of IDEs that best suit their needs. Duet AI code assistance is available in multiple IDEs, such as Visual Studio Code, JetBrains IDEs (IntelliJ, PyCharm, GoLand, WebStorm, and more), Cloud Workstations, and Cloud Shell Editor.
In this lab, you can use either Cloud Workstations or Cloud Shell Editor. This workshop uses Cloud Shell Editor: Cloud Workstations can take 20-30 minutes to set up, whereas Cloud Shell Editor is available immediately.
Open Cloud Shell Editor by clicking on the pencil icon ✏️ in the top menu bar of your Cloud Shell.
Cloud Shell Editor has a very similar UI and UX to VSCode.
Press CTRL + , (comma) on Windows or CMD + , (comma) on Mac to open the Settings pane.
In the Search bar, type "duet ai".
Ensure that Cloudcode › Duet AI: Enable and Cloudcode › Duet AI › Inline Suggestions: Enable Auto are enabled.
In the bottom Status Bar, click on Cloud Code - Sign In and follow the sign in workflow.
If you're already signed in, the status bar shows Cloud Code - No project.
Click on Cloud Code - No project and an action dropdown pane will appear at the top. Click on Select a Google Cloud project.
Start typing your PROJECT ID and your project should appear in the list.
Select your PROJECT_ID from the list of projects.
The bottom status bar updates to show your project ID. If it does not, you may need to refresh your Cloud Shell Editor tab.
Click on the Duet AI icon in the left-hand menu bar and the Duet AI chat window will appear. If you get a message saying Select GCP Project, click it and re-select the project.
You now see the Duet AI chat window.
3. Setting up the infrastructure
To run the new shipping service in GCP, you need the following GCP resources:
- A Cloud SQL Instance, with a database.
- A GKE cluster to run the containerized service.
- An Artifact Registry to store the Docker image.
- A Cloud Source Repository for the code.
In the Cloud Shell terminal, clone the following repo and run the following commands to set up the infrastructure in your GCP project.
# Set your project
export PROJECT_ID=<INSERT_YOUR_PROJECT_ID>
gcloud config set core/project ${PROJECT_ID}

# Enable Cloudbuild and grant Cloudbuild SA owner role
export PROJECT_NUMBER=$(gcloud projects describe ${PROJECT_ID} --format 'value(projectNumber)')
gcloud services enable cloudbuild.googleapis.com
gcloud projects add-iam-policy-binding ${PROJECT_ID} --member serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com --role roles/owner

# Clone the repo
git clone https://github.com/duetailabs/dev.git ~/duetaidev
cd ~/duetaidev

# Run Cloudbuild to create the necessary resources
gcloud builds submit --substitutions=_PROJECT_ID=${PROJECT_ID}

# To destroy all GCP resources, run the following:
# gcloud builds submit --substitutions=_PROJECT_ID=${PROJECT_ID} --config=cloudbuild_destroy.yaml
4. Developing a Python Flask service
The service you will be creating ultimately consists of the following files. You do not need to create these files now; you will create them one at a time by following the instructions below:

- package-service.yaml - An OpenAPI spec for the package service, with data such as height, width, weight, and special handling instructions.
- data_model.py - Data model for the package-service API spec. Also creates the packages table in the product_details DB.
- connect_connector.py - CloudSQL connection (defines engine, Session, and Base ORM).
- db_init.py - Generates sample data into the packages table.
- main.py - A Python Flask service with a GET endpoint to retrieve package details from the packages data based on product_id.
- test.py - Unit test.
- requirements.txt - Python requirements.
- Dockerfile - To containerize this application.
If you run into any sticky problems during the exercises, the final files are all located in the APPENDIX of this codelab for reference.
In the previous step, you created a Cloud Source Repository. Clone the repository. You will build the application files in the cloned repository folder.
In the Cloud Shell terminal, run the following command to clone the repository.
cd ~
gcloud source repos clone shipping shipping
cd ~/shipping
Open the Duet AI chat sidebar by clicking the Duet AI icon in the Cloud Shell Editor left-hand menu. You can now use Duet AI for code assistance.
package-service.yaml
Without any files open, ask Duet to generate an OpenAPI spec for the shipping service.
Prompt 1: Generate an OpenAPI yaml specification for a service that provides shipping and package information given a numerical product id. The service should include information about the packages height, width, depth, weight and any special handling instructions.
There are three options listed in the top right of the generated code window:

- COPY the code and PASTE it into a file.
- ADD the code to the currently opened file in the Editor.
- OPEN the code in a new file.

Click OPEN the code in a new file.
Press CTRL/CMD + S to save the file in the application folder with the file name package-service.yaml, and then click OK.
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
You can also try various prompts to see Duet AI's responses.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
data_model.py
Next, you create the Python data model file for the service based on the OpenAPI spec.
With the package-service.yaml file open, enter the following prompt.
Prompt 1: Using the python sqlalchemy ORM, generate a data model for this API service. Also include a separate function and a main entrypoint that creates the database tables.
Let's look at each part that was generated. Duet AI is still an assistant; while it can help you author code quickly, you should still review the generated content and understand it as you go.
First, there is a class called Package that inherits from Base and defines the data model for the packages table, like the following:
class Package(Base):
    __tablename__ = 'packages'

    id = Column(Integer, primary_key=True)
    product_id = Column(String(255))
    height = Column(Float)
    width = Column(Float)
    depth = Column(Float)
    weight = Column(Float)
    special_handling_instructions = Column(String(255))
Next, you need a function that creates the table in the database like the following:
def create_tables(engine):
    Base.metadata.create_all(engine)
Finally, you need a main function that runs the create_tables function to actually build the table in the CloudSQL database, like the following:
if __name__ == '__main__':
    from sqlalchemy import create_engine

    engine = create_engine('sqlite:///shipping.db')
    create_tables(engine)
    print('Tables created successfully.')
Note that the main function creates an engine using a local sqlite database. To use CloudSQL, you will need to change it. You do that a bit later.
Use the OPEN the code in a new file workflow as before, and save the code in a file called data_model.py (note the underscore in the name, not a dash).
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
connect_connector.py
Create the CloudSQL connector.
With the data_model.py file open, enter the following prompts.
Prompt 1: Using the cloud-sql-python-connector library, generate a function that initializes a connection pool for a Cloud SQL instance of Postgres.
Note that the response does not use the cloud-sql-python-connector library. You can refine prompts, to give Duet a bit of a nudge, by adding specifics to the same chat thread.
Let's use another prompt.
Prompt 2: Must use the cloud-sql-python-connector library.
Make sure that it uses the cloud-sql-python-connector library.
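The core of the connector, mirroring the final file in the APPENDIX, should look something like the following sketch (a sketch, not the exact generated output):

import os

from google.cloud.sql.connector import Connector, IPTypes
import sqlalchemy


def connect_with_connector() -> sqlalchemy.engine.base.Engine:
    """Initializes a connection pool for a Cloud SQL instance of Postgres."""
    connector = Connector()

    def getconn():
        # Open a pg8000 connection through the Cloud SQL Python Connector
        return connector.connect(
            os.environ["INSTANCE_CONNECTION_NAME"],  # e.g. 'project:region:instance'
            "pg8000",
            user=os.environ["DB_USER"],
            password=os.environ["DB_PASS"],
            db=os.environ["DB_NAME"],
            ip_type=IPTypes.PUBLIC,
        )

    # SQLAlchemy engine that calls the connector for each new connection
    return sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)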
Use the OPEN the code in a new file workflow as before, and save the code in a file called connect_connector.py. You may need to manually import the pg8000 library; see the final file in the APPENDIX.
Clear the Duet AI chat history, and with the connect_connector.py file open, generate the DB engine, SessionMaker, and Base ORM to be used in the application.
Prompt 1: Create an engine, sessionmaker class and Base ORM using the connect_with_connector method
The response may append the engine, SessionMaker, and Base to the connect_connector.py file.
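Mirroring the final file in the APPENDIX, the appended code should look something like this:

# Create a connection pool
engine = connect_with_connector()

# Create a sessionmaker class to create new sessions
SessionMaker = sqlalchemy.orm.sessionmaker(bind=engine)

# Create a Base class for the ORM
Base = declarative_base()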
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
You can also try various prompts to see the potential variation in Duet AI's responses.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
Updating data_model.py
You need to use the engine you created in the previous step (in the connect_connector.py
file) in order to create a table in the CloudSQL database.
Clear the Duet AI chat history. Open the data_model.py file. Try the following prompt.
Prompt 1: In the main function, import and use the engine from connect_connector.py
You should see the response importing engine from connect_connector (for CloudSQL). The create_tables function then uses that engine (instead of the default local sqlite DB).
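Mirroring the final file in the APPENDIX, the updated portions of data_model.py should look something like this:

# Import the CloudSQL engine instead of creating a local sqlite one
from connect_connector import engine


def create_tables():
    Base.metadata.create_all(engine)


if __name__ == '__main__':
    create_tables()
    print('Tables created successfully.')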
Update the data_model.py file accordingly.
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
You can also try various prompts to see the variation in Duet AI's responses.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
requirements.txt
Create a requirements.txt file for the application.
Open both the connect_connector.py and data_model.py files and enter the following prompts.
Prompt 1: Generate a pip requirements file for this data model and service
Prompt 2: Generate a pip requirements file for this data model and service using latest versions
Verify the names and versions are correct. For example, in the response above, the google-cloud-sql-connecter name and version are both incorrect. Manually fix the versions and create a requirements.txt file that looks like this:
cloud-sql-python-connector==1.2.4
sqlalchemy==1.4.36
pg8000==1.22.0
In the command terminal run the following:
pip3 install -r requirements.txt
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
Creating packages table in CloudSQL
Set the environment variables for the CloudSQL database connector.
export INSTANCE_NAME=$(gcloud sql instances list --format='value(name)')
export INSTANCE_CONNECTION_NAME=$(gcloud sql instances describe ${INSTANCE_NAME} --format="value(connectionName)")
export DB_USER=evolution
export DB_PASS=evolution
export DB_NAME=product_details
Now run data_model.py.
python data_model.py
The output is similar to the following (check the code to see what is actually expected):
Tables created successfully.
Connect to the CloudSQL instance and check that the table has been created.
gcloud sql connect ${INSTANCE_NAME} --user=evolution --database=product_details
After entering the password (also evolution), get the tables.
product_details=> \dt
The output is similar to the following:
        List of relations
 Schema |   Name   | Type  |   Owner
--------+----------+-------+-----------
 public | packages | table | evolution
(1 row)
You can also check the data model and table details.
product_details=> \d+ packages
The output is similar to the following:
                                                         Table "public.packages"
            Column             |       Type        | Collation | Nullable |               Default                | Storage  | Compression | Stats target | Description
-------------------------------+-------------------+-----------+----------+--------------------------------------+----------+-------------+--------------+-------------
 id                            | integer           |           | not null | nextval('packages_id_seq'::regclass) | plain    |             |              |
 product_id                    | integer           |           | not null |                                      | plain    |             |              |
 height                        | double precision  |           | not null |                                      | plain    |             |              |
 width                         | double precision  |           | not null |                                      | plain    |             |              |
 depth                         | double precision  |           | not null |                                      | plain    |             |              |
 weight                        | double precision  |           | not null |                                      | plain    |             |              |
 special_handling_instructions | character varying |           |          |                                      | extended |             |              |
Indexes:
    "packages_pkey" PRIMARY KEY, btree (id)
Access method: heap
Type \q to exit CloudSQL.
db_init.py
Next, let's add some sample data to the packages table.
Clear the Duet AI chat history. With the data_model.py file open, try the following prompts.
Prompt 1: Generate a function that creates 10 sample packages rows and commits them to the packages table
Prompt 2: Using the session from connect_connector, generate a function that creates 10 sample packages rows and commits them to the packages table
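Mirroring the final file in the APPENDIX, the generated function should look something like this sketch:

from sqlalchemy.orm import sessionmaker

from connect_connector import engine
from data_model import Package


def create_packages():
    # Create a session bound to the CloudSQL engine
    session = sessionmaker(bind=engine)()

    # Create 10 sample packages and add them to the session
    for i in range(10):
        package = Package(
            product_id=i,
            height=10.0,
            width=10.0,
            depth=10.0,
            weight=10.0,
            special_handling_instructions="No special handling instructions."
        )
        session.add(package)

    # Commit the changes
    session.commit()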
Use the OPEN the code in a new file workflow as before, and save the code in a file called db_init.py.
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
You can also try various prompts to see the variation in Duet AI's responses.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
Creating sample packages data
Run db_init.py from the command line.
python db_init.py
The output is similar to the following:
Packages created successfully.
Connect to the CloudSQL instance again and verify that the sample data was added to the packages table.
gcloud sql connect ${INSTANCE_NAME} --user=evolution --database=product_details
After entering the password (also evolution), get all data from the packages table.
product_details=> SELECT * FROM packages;
The output is similar to the following:
 id | product_id | height | width | depth | weight |   special_handling_instructions
----+------------+--------+-------+-------+--------+-----------------------------------
  1 |          0 |     10 |    10 |    10 |     10 | No special handling instructions.
  2 |          1 |     10 |    10 |    10 |     10 | No special handling instructions.
  3 |          2 |     10 |    10 |    10 |     10 | No special handling instructions.
  4 |          3 |     10 |    10 |    10 |     10 | No special handling instructions.
  5 |          4 |     10 |    10 |    10 |     10 | No special handling instructions.
  6 |          5 |     10 |    10 |    10 |     10 | No special handling instructions.
  7 |          6 |     10 |    10 |    10 |     10 | No special handling instructions.
  8 |          7 |     10 |    10 |    10 |     10 | No special handling instructions.
  9 |          8 |     10 |    10 |    10 |     10 | No special handling instructions.
 10 |          9 |     10 |    10 |    10 |     10 | No special handling instructions.
(10 rows)
Type \q to exit CloudSQL.
main.py
With the data_model.py, package-service.yaml, and connect_connector.py files open, create a main.py for the application.
Prompt 1: Using the python flask library - create an implementation that uses http rest endpoints for this service.
Prompt 2: Using the python flask library - create an implementation that uses http rest endpoints for this service. Import and use the SessionMaker from connect_connector.py for packages data.
Prompt 3: Using the python flask library - create an implementation that uses http rest endpoints for this service. Import and use Package from data_model.py and the SessionMaker from connect_connector.py for packages data.
Prompt 4: Using the python flask library - create an implementation that uses http rest endpoints for this service. Import and use Package from data_model.py and the SessionMaker from connect_connector.py for packages data. Use host IP 0.0.0.0 for app.run.
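Mirroring the final file in the APPENDIX, the core of the generated service should look something like this sketch:

from flask import Flask, jsonify

from data_model import Package
from connect_connector import SessionMaker

app = Flask(__name__)
session_maker = SessionMaker()


@app.route("/packages/<int:product_id>", methods=["GET"])
def get_package(product_id):
    # Look up the package row matching the requested product_id
    package = session_maker.query(Package).filter(Package.product_id == product_id).first()
    if package is None:
        return jsonify({"message": "Package not found."}), 404
    return jsonify({
        "height": package.height,
        "width": package.width,
        "depth": package.depth,
        "weight": package.weight,
        "special_handling_instructions": package.special_handling_instructions,
    }), 200


if __name__ == "__main__":
    app.run(host="0.0.0.0")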
Update the requirements for main.py.
Prompt: Create requirements file for main.py
Append the response to the requirements.txt file. Make sure to use Flask version 3.0.0.
Use the OPEN the code in a new file workflow as before, and save the code in a file called main.py.
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
5. Testing and running the application
Install the requirements.
pip3 install -r requirements.txt
Run main.py.
python main.py
The output is similar to the following:
 * Serving Flask app 'main'
 * Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5000
 * Running on http://10.88.0.3:5000
Press CTRL+C to quit
From a second terminal, test the /packages/<product_id> endpoint.
curl localhost:5000/packages/1
The output is similar to the following:
{"depth":10.0,"height":10.0,"special_handling_instructions":"No special handling instructions.","weight":10.0,"width":10.0}
You can also test any other product ID in your sample data.
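If you prefer to script this check, here is a minimal sketch using only the Python standard library (it assumes the service is still running locally on port 5000 and that the sample data from db_init.py is loaded):

import urllib.request

# Query each sample product created by db_init.py (product_id 0 through 9)
for product_id in range(10):
    url = f"http://localhost:5000/packages/{product_id}"
    with urllib.request.urlopen(url) as response:
        print(product_id, response.read().decode())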
Press CTRL+C in the terminal to stop the running application.
Generating unit tests
With the main.py file open, generate unit tests.
Prompt 1: Generate unit tests.
Use the OPEN the code in a new file workflow as before, and save the code in a file called test.py.
In the test_get_package function, a product_id must be defined. You can manually add it.
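Mirroring the final file in the APPENDIX, the manually added product_id should look something like this:

package = Package(
    product_id=11,  # Ensure that the product_id is different from the sample data
    height=10,
    width=10,
    depth=10,
    weight=10,
    special_handling_instructions="Fragile",
)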
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
Reset Duet AI chat history by clicking the trash icon on the top of the Duet AI sidebar.
Running unit tests
Run the unit test.
python test.py
The output is similar to the following:
.
----------------------------------------------------------------------
Ran 1 test in 1.061s

OK
Close all files in Cloud Shell Editor and clear the chat history by clicking the trash icon at the top of the Duet AI sidebar.
Dockerfile
Create a Dockerfile for this application.
Open main.py and try the following prompts.
Prompt 1: Generate a Dockerfile for this application.
Prompt 2: Generate a Dockerfile for this application. Copy all files to the container.
You also need to set the environment variables for INSTANCE_CONNECTION_NAME, DB_USER, DB_PASS, and DB_NAME. You can do that manually. Your Dockerfile should look like the following:
FROM python:3.10-slim
WORKDIR /app
COPY . ./
RUN pip install -r requirements.txt
# Add these manually for your project
ENV INSTANCE_CONNECTION_NAME=YOUR_INSTANCE_CONNECTION_NAME
ENV DB_USER=evolution
ENV DB_PASS=evolution
ENV DB_NAME=product_details
CMD ["python", "main.py"]
Use the OPEN the code in a new file workflow as before, and save the code in a file called Dockerfile.
The final file is in the APPENDIX section of this codelab. If your file does not match it, manually make the appropriate changes.
Locally running the application
With the Dockerfile open, try the following prompt.
Prompt 1: How do I locally run a container using this Dockerfile
Follow the instructions.
# Build
docker build -t shipping .

# And run
docker run -p 5000:5000 -it shipping
The output is similar to the following:
 * Serving Flask app 'main'
 * Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5000
 * Running on http://172.17.0.2:5000
Press CTRL+C to quit
From a second terminal window, access the container.
curl localhost:5000/packages/1
The output is similar to the following:
{"depth":10.0,"height":10.0,"special_handling_instructions":"No special handling instructions.","weight":10.0,"width":10.0}
The containerized application is working.
Press CTRL+C to exit the running docker container in the terminal.
Building the container image in Artifact Registry
Build the container image and push it to Artifact Registry.
cd ~/shipping
gcloud auth configure-docker us-central1-docker.pkg.dev
docker build -t us-central1-docker.pkg.dev/${PROJECT_ID}/shipping/shipping .
docker push us-central1-docker.pkg.dev/${PROJECT_ID}/shipping/shipping
The application container is now located at us-central1-docker.pkg.dev/${PROJECT_ID}/shipping/shipping, which can be deployed to GKE.
6. Deploying application to the GKE cluster
A GKE Autopilot cluster was created when you built the GCP resources for this workshop. Connect to the GKE cluster.
gcloud container clusters get-credentials gke1 \
    --region=us-central1
Annotate the Kubernetes default service account with the Google service account.
kubectl annotate serviceaccount default iam.gke.io/gcp-service-account=cloudsqlsa@${PROJECT_ID}.iam.gserviceaccount.com
The output is similar to the following:
serviceaccount/default annotated
Prepare and apply the k8s.yaml file.
cp ~/duetaidev/k8s.yaml_tmpl ~/shipping/.
export INSTANCE_NAME=$(gcloud sql instances list --format='value(name)')
export INSTANCE_CONNECTION_NAME=$(gcloud sql instances describe ${INSTANCE_NAME} --format="value(connectionName)")
export IMAGE_REPO=us-central1-docker.pkg.dev/${PROJECT_ID}/shipping/shipping
envsubst < ~/shipping/k8s.yaml_tmpl > k8s.yaml
kubectl apply -f k8s.yaml
The output is similar to the following:
deployment.apps/shipping created service/shipping created
Wait until the Pods are running and the Service has an external load balancer IP address assigned.
kubectl get pods
kubectl get service shipping
The output is similar to the following:
# kubectl get pods
NAME                      READY   STATUS    RESTARTS   AGE
shipping-f5d6f8d5-56cvk   1/1     Running   0          4m47s
shipping-f5d6f8d5-cj4vv   1/1     Running   0          4m48s
shipping-f5d6f8d5-rrdj2   1/1     Running   0          4m47s

# kubectl get service shipping
NAME       TYPE           CLUSTER-IP       EXTERNAL-IP    PORT(S)        AGE
shipping   LoadBalancer   34.118.225.125   34.16.39.182   80:30076/TCP   5m41s
For GKE Autopilot clusters, wait a few moments until the resources are ready.
Access the service through the EXTERNAL-IP address.
export EXTERNAL_IP=$(kubectl get svc shipping --output jsonpath='{.status.loadBalancer.ingress[0].ip}')
curl http://${EXTERNAL_IP}/packages/1
The output is similar to the following:
{"depth":10.0,"height":10.0,"special_handling_instructions":"No special handling instructions.","weight":10.0,"width":10.0}
7. Extra Credit: Troubleshooting the application
Remove the Cloud SQL Client IAM role from the cloudsqlsa service account. This causes an error connecting to the CloudSQL database.
gcloud projects remove-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:cloudsqlsa@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client"
Restart the shipping Pod.
kubectl rollout restart deployment shipping
After the Pod restarts, try accessing the shipping service again.
export EXTERNAL_IP=$(kubectl get svc shipping --output jsonpath='{.status.loadBalancer.ingress[0].ip}')
curl http://${EXTERNAL_IP}/packages/1
The output is similar to the following:
...
<title>500 Internal Server Error</title>
<h1>Internal Server Error</h1>
<p>The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.</p>
Inspect the logs by navigating to Kubernetes Engine > Workloads. Click on the shipping deployment and then the Logs tab.
Click on the View in Log Explorer icon on the right side of the status bar. This opens a new Log Explorer window.
Click on one of the Traceback error entries, and then click Explain this Log Entry.
You can read the explanation of the error.
Next, let's get Duet AI to help troubleshoot the error.
Try the following prompt.
Prompt 1: Help me troubleshoot this error
Enter the error message in the prompt.
Prompt 2: Forbidden: Authenticated IAM principal does not seem authorized to make API request. Verify 'Cloud SQL Admin API' is enabled within your GCP project and 'Cloud SQL Client' role has been granted to IAM principal
And then:
Prompt 3: How do I assign the Cloud SQL Client role to a google service account using gcloud?
Assign the Cloud SQL Client role to the cloudsqlsa service account.
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:cloudsqlsa@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client"
Wait a few moments and try accessing the application again.
export EXTERNAL_IP=$(kubectl get svc shipping --output jsonpath='{.status.loadBalancer.ingress[0].ip}')
curl http://${EXTERNAL_IP}/packages/1
The output is similar to the following:
{"depth":10.0,"height":10.0,"special_handling_instructions":"No special handling instructions.","weight":10.0,"width":10.0}
You have successfully used Duet AI in Cloud Logging, Log Explorer and the Log Explainer feature to troubleshoot the issue.
8. Conclusion
Congratulations! You have successfully completed this codelab.
In this codelab, you learned the following:
- Activate Duet AI in your GCP project and configure it for use in an IDE and Cloud Console.
- Use Duet AI for code generation, completion and explanation.
- Use Duet AI to explain and troubleshoot an application issue.
- Explore Duet AI features such as IDE chat and multi-turn chat, chat versus inline code generation, and smart actions like code explanation and recitation acknowledgement.
9. Appendix
package-service.yaml
swagger: "2.0"
info:
  title: Shipping and Package Information API
  description: This API provides information about shipping and packages.
  version: 1.0.0
host: shipping.googleapis.com
schemes:
  - https
produces:
  - application/json
paths:
  /packages/{product_id}:
    get:
      summary: Get information about a package
      description: This method returns information about a package, including its height, width, depth, weight, and any special handling instructions.
      parameters:
        - name: product_id
          in: path
          required: true
          type: integer
          format: int64
      responses:
        "200":
          description: A successful response
          schema:
            type: object
            properties:
              height:
                type: integer
                format: int64
              width:
                type: integer
                format: int64
              depth:
                type: integer
                format: int64
              weight:
                type: integer
                format: int64
              special_handling_instructions:
                type: string
        "404":
          description: The product_id was not found
data_model.py
from sqlalchemy import Column, Integer, String, Float
from sqlalchemy.ext.declarative import declarative_base

from connect_connector import engine

Base = declarative_base()


class Package(Base):
    __tablename__ = 'packages'

    id = Column(Integer, primary_key=True)
    product_id = Column(Integer, nullable=False)
    height = Column(Float, nullable=False)
    width = Column(Float, nullable=False)
    depth = Column(Float, nullable=False)
    weight = Column(Float, nullable=False)
    special_handling_instructions = Column(String, nullable=True)


def create_tables():
    Base.metadata.create_all(engine)


if __name__ == '__main__':
    create_tables()
    print('Tables created successfully.')
connect_connector.py
import os

from google.cloud.sql.connector import Connector, IPTypes
import sqlalchemy

# You may need to manually import pg8000 and Base as follows
import pg8000
from sqlalchemy.ext.declarative import declarative_base


def connect_with_connector() -> sqlalchemy.engine.base.Engine:
    """Initializes a connection pool for a Cloud SQL instance of Postgres."""

    # Note: Saving credentials in environment variables is convenient, but not
    # secure - consider a more secure solution such as
    # Cloud Secret Manager (https://cloud.google.com/secret-manager) to help
    # keep secrets safe.
    instance_connection_name = os.environ[
        "INSTANCE_CONNECTION_NAME"
    ]  # e.g. 'project:region:instance'
    db_user = os.environ["DB_USER"]  # e.g. 'my-database-user'
    db_pass = os.environ["DB_PASS"]  # e.g. 'my-database-password'
    db_name = os.environ["DB_NAME"]  # e.g. 'my-database'

    ip_type = IPTypes.PRIVATE if os.environ.get("PRIVATE_IP") else IPTypes.PUBLIC

    connector = Connector()

    def getconn() -> sqlalchemy.engine.base.Engine:
        conn: sqlalchemy.engine.base.Engine = connector.connect(
            instance_connection_name,
            "pg8000",
            user=db_user,
            password=db_pass,
            db=db_name,
            ip_type=ip_type,
        )
        return conn

    pool = sqlalchemy.create_engine(
        "postgresql+pg8000://",
        creator=getconn,
        # ...
    )
    return pool


# Create a connection pool
engine = connect_with_connector()

# Create a sessionmaker class to create new sessions
SessionMaker = sqlalchemy.orm.sessionmaker(bind=engine)

# Create a Base class for ORM
# You may need to manually fix the following
Base = declarative_base()
db_init.py
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from connect_connector import engine
from data_model import Package


def create_packages():
    # Create a session
    session = sessionmaker(bind=engine)()

    # Create 10 sample packages
    for i in range(10):
        package = Package(
            product_id=i,
            height=10.0,
            width=10.0,
            depth=10.0,
            weight=10.0,
            special_handling_instructions="No special handling instructions."
        )

        # Add the package to the session
        session.add(package)

    # Commit the changes
    session.commit()


if __name__ == '__main__':
    create_packages()
    print('Packages created successfully.')
main.py
from flask import Flask, request, jsonify

from data_model import Package
from connect_connector import SessionMaker

app = Flask(__name__)

session_maker = SessionMaker()


@app.route("/packages/<int:product_id>", methods=["GET"])
def get_package(product_id):
    """Get information about a package."""

    session = session_maker

    package = session.query(Package).filter(Package.product_id == product_id).first()

    if package is None:
        return jsonify({"message": "Package not found."}), 404

    return jsonify(
        {
            "height": package.height,
            "width": package.width,
            "depth": package.depth,
            "weight": package.weight,
            "special_handling_instructions": package.special_handling_instructions,
        }
    ), 200


if __name__ == "__main__":
    app.run(host="0.0.0.0")
test.py
import unittest

from data_model import Package
from connect_connector import SessionMaker

from main import app


class TestPackage(unittest.TestCase):

    def setUp(self):
        self.session_maker = SessionMaker()

    def tearDown(self):
        self.session_maker.close()

    def test_get_package(self):
        """Test the `get_package()` function."""

        package = Package(
            product_id=11,  # Ensure that the product_id is different from the sample data
            height=10,
            width=10,
            depth=10,
            weight=10,
            special_handling_instructions="Fragile",
        )

        session = self.session_maker
        session.add(package)
        session.commit()

        response = app.test_client().get("/packages/11")

        self.assertEqual(response.status_code, 200)
        self.assertEqual(
            response.json,
            {
                "height": 10,
                "width": 10,
                "depth": 10,
                "weight": 10,
                "special_handling_instructions": "Fragile",
            },
        )


if __name__ == "__main__":
    unittest.main()
requirements.txt
cloud-sql-python-connector==1.2.4
sqlalchemy==1.4.36
pg8000==1.22.0
Flask==3.0.0
gunicorn==20.1.0
psycopg2-binary==2.9.3
Dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY . ./
RUN pip install -r requirements.txt
# Add these manually for your project
ENV INSTANCE_CONNECTION_NAME=YOUR_INSTANCE_CONNECTION_NAME
ENV DB_USER=evolution
ENV DB_PASS=evolution
ENV DB_NAME=product_details
CMD ["python", "main.py"]