About this codelab
1. Overview
This lab demonstrates features and capabilities designed to streamline the development workflow for software engineers tasked with developing Python applications in a containerized environment. Typical container development requires the user to understand details of containers and the container build process. Additionally, developers typically have to break their flow, moving out of their IDE to test and debug their applications in remote environments. With the tools and technologies mentioned in this tutorial, developers can work effectively with containerized applications without leaving their IDE.
What is Cloud Workstations?
Cloud Workstations provides managed development environments on Google Cloud with built-in security and pre-configured yet customizable development environments. Access Cloud Workstations through a browser-based IDE, from multiple local code editors (such as VS Code, or JetBrains IDEs such as IntelliJ IDEA Ultimate and PyCharm Professional), or through SSH.
Cloud Workstations uses the following resources:
- Administrators create workstation clusters.
- In each workstation cluster, administrators create one or more workstation configurations that act as templates for workstations.
- Developers create workstations: development environments that provide a Cloud IDE, language tooling, libraries, and more.
Cloud Workstations enables IT and security administrators to easily provision, scale, manage and secure their development environments and allows developers to access development environments with consistent configurations and customizable tooling.
Cloud Workstations helps with shifting security left by enhancing the security posture of your application development environments. It has security features such as VPC Service Controls, private ingress or egress, forced image update and Identity and Access Management access policies.
What is Cloud Code?
Cloud Code provides IDE support for the full development cycle of Kubernetes and Cloud Run applications, from creating and customizing a new application from sample templates to running your finished application. Cloud Code supports you along the way with run-ready samples, out-of-the-box configuration snippets, and a tailored debugging experience — making developing with Kubernetes and Cloud Run a whole lot easier!
Here are some of the Cloud Code features:
- Continuously build and run applications
- Debugging support for your Kubernetes application under development
- Log streaming and viewing
Learn more about other Cloud Code features.
What you will learn
In this lab you will learn methods for developing with containers in Google Cloud, including:
- Review Cloud Workstations
- Launch Workstation
- Review Cloud Code
- Debug on Kubernetes
2. Setup and Requirements
Self-paced environment setup
- Sign in to the Google Cloud Console and create a new project or reuse an existing one. If you don't already have a Gmail or Google Workspace account, you must create one.
- The Project name is the display name for this project's participants. It is a character string not used by Google APIs. You can update it at any time.
- The Project ID is unique across all Google Cloud projects and is immutable (cannot be changed after it has been set). The Cloud Console auto-generates a unique string; usually you don't care what it is. In most codelabs, you'll need to reference the Project ID (it is typically identified as PROJECT_ID). If you don't like the generated ID, you may generate another random one. Alternatively, you can try your own and see if it's available. It cannot be changed after this step and will remain for the duration of the project.
- For your information, there is a third value, a Project Number, which some APIs use. Learn more about all three of these values in the documentation.
- Next, you'll need to enable billing in the Cloud Console to use Cloud resources/APIs. Running through this codelab shouldn't cost much, if anything at all. To shut down resources so you don't incur billing beyond this tutorial, you can delete the resources you created or delete the whole project. New users of Google Cloud are eligible for the $300 USD Free Trial program.
Environment Setup
In Cloud Shell, set the project ID and project number for your project, and save them as the PROJECT_ID and PROJECT_NUMBER variables.
export PROJECT_ID=$(gcloud config get-value project)
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID \
--format='value(projectNumber)')
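A quick optional check that both variables are populated before you continue:
# Both values should print; if either is empty, re-run the export commands above.
echo "PROJECT_ID=${PROJECT_ID} PROJECT_NUMBER=${PROJECT_NUMBER}"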
In this lab you will deploy code to GKE. You will also use Cloud Workstations as the IDE.
The setup script below prepares this infrastructure for you.
- Download the setup script and make it executable.
wget https://raw.githubusercontent.com/GoogleCloudPlatform/container-developer-workshop/main/labs/python/setup_with_cw.sh
chmod +x setup_with_cw.sh
- Open the setup_with_cw.sh file and edit the password values that are currently set to CHANGEME.
- Run the setup script to stand up a GKE cluster that you will use in this lab. The setup will take about 20 minutes.
./setup_with_cw.sh &
- Open Cloud Workstations in the Cloud Console. Wait for the cluster to be in READY status before moving on to the next steps.
- If your Cloud Shell session was disconnected, click "Reconnect" and then run the gcloud CLI command below to set the project ID. Replace the sample project ID with your Qwiklabs project ID before running the command.
gcloud config set project qwiklabs-gcp-project-id
- Download and run the script below in the terminal to create Cloud Workstations configuration.
wget https://raw.githubusercontent.com/GoogleCloudPlatform/container-developer-workshop/main/labs/python/workstation_config_setup.sh
chmod +x workstation_config_setup.sh
./workstation_config_setup.sh
Cloud Workstations Cluster and Configuration
Open Cloud Workstations in the Cloud Console. Verify that the cluster is in READY status.
Verify the status of the existing configurations.
Create a new workstation. Change the name to my-workstation and select the existing configuration codeoss-python.
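If you prefer the command line, a roughly equivalent gcloud call is sketched below. The workstation name and the codeoss-python config come from this lab; the cluster name and region are placeholders that must match what the setup scripts created, and on older gcloud releases the command group may be gcloud beta workstations.
# Hypothetical CLI alternative to the console steps above.
# Replace CLUSTER_NAME and REGION with the values used by the setup scripts.
gcloud workstations create my-workstation \
  --cluster=CLUSTER_NAME \
  --config=codeoss-python \
  --region=REGION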
Launch Workstation
- Start and launch the workstation. It will take a few moments for the workstation to start.
- Allow third-party cookies by clicking the icon in the address bar.
- Click "Site not working?".
- Click "Allow cookies".
- Once the workstation launches, you will see the Code OSS IDE come up.
Click "Mark Done" on the Getting Started page in the workstation IDE.
3. Cloud Code Overview
Review different sections that are available in Cloud Code.
- Kubernetes development. Get a fully integrated Kubernetes development and debugging environment within your IDE. Create and manage clusters directly from within the IDE.
- Debug running applications. Debug the code within your IDEs using Cloud Code for VS Code and Cloud Code for IntelliJ by leveraging built-in IDE debugging features.
- Explore deployments. View underlying resources and metadata for your Kubernetes clusters and Cloud Run services. You can fetch a description, view logs, manage secrets, or get a terminal directly into a pod.
- Simplify Kubernetes local development. Under the covers, Cloud Code for IDEs uses popular tools such as Skaffold, Jib, and kubectl to provide continuous feedback on your code in real time.
Sign in to Google Cloud
- Click on Cloud Code icon and Select "Sign in to Google Cloud":
- Click "Proceed to sign in".
- Check the output in the Terminal and open the link:
- Log in with your Qwiklabs student credentials.
- Select "Allow":
- Copy verification code and return to the Workstation tab.
- Paste the verification code and hit Enter.
Click the "Allow" button if you see this message so that you can copy and paste into the workstation.
4. Create a new Python starter application
In this section you'll create a new Python application.
- Open a new Terminal.
- Make a new directory and open it as a workspace
mkdir music-service && cd music-service
code-oss-cloud-workstations -r --folder-uri="$PWD"
- Create a file called requirements.txt and copy the following contents into it:
Flask
gunicorn
ptvsd==4.3.2
- Create a file named app.py and paste the following code into it:
import os
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/")
def hello_world():
    message = "Hello, World!"
    return message

if __name__ == '__main__':
    server_port = os.environ.get('PORT', '8080')
    app.run(debug=False, port=server_port, host='0.0.0.0')
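Optionally, you can sanity-check the app directly in the workstation terminal before containerizing it. This is not a lab step, and it assumes the workstation's Python environment can install the requirements with pip.
# Optional local smoke test (assumes pip and python3 are available on the workstation).
pip install -r requirements.txt
FLASK_APP=app.py python3 -m flask run --port=8080 --host=0.0.0.0
# In a second terminal, curl http://localhost:8080 should return "Hello, World!"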
- Create a file named Dockerfile and paste the following into it:
FROM python:3.8
ARG FLASK_DEBUG=0
ENV FLASK_DEBUG=$FLASK_DEBUG
ENV FLASK_APP=app.py
WORKDIR /app
COPY requirements.txt .
RUN pip install --trusted-host pypi.python.org -r requirements.txt
COPY . .
ENTRYPOINT ["python3", "-m", "flask", "run", "--port=8080", "--host=0.0.0.0"]
Note: FLASK_DEBUG=1 enables auto-reloading of code changes in a Python Flask app. This Dockerfile lets you pass that value in as a build argument.
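For reference, outside of Skaffold the same build argument would be supplied with docker build's --build-arg flag. This is only a local sketch, not a lab step; in the next steps Skaffold passes this value for you via skaffold.yaml.
# Build the image locally with Flask auto-reload enabled (optional).
docker build --build-arg FLASK_DEBUG=1 -t python-app:dev .
# Run it and map the Flask port to the host.
docker run --rm -p 8080:8080 python-app:dev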
Generate Manifests
In your terminal, execute the following command to generate a default skaffold.yaml and deployment.yaml.
- Initialize Skaffold with the following command
skaffold init --generate-manifests
When prompted, use the arrows to move your cursor and the spacebar to select the options.
Choose:
- 8080 for the port
- y to save the configuration
Update Skaffold Configurations
- Change the default application name:
  - Open skaffold.yaml
  - Select the image name currently set as dockerfile-image
  - Right-click and choose Change All Occurrences
  - Type in the new name python-app
- Further edit the build section to:
  - add docker.buildArgs to pass FLASK_DEBUG=1
  - add sync settings to load any changes to *.py files from the IDE to the running container
After the edits, the build section in the skaffold.yaml file should look like the following:
build:
  artifacts:
  - image: python-app
    docker:
      buildArgs:
        FLASK_DEBUG: "1"
      dockerfile: Dockerfile
    sync:
      infer:
      - '**/*.py'
Modify Kubernetes Configuration File
- Change the default name:
  - Open the deployment.yaml file
  - Select the image name currently set as dockerfile-image
  - Right-click and choose Change All Occurrences
  - Type in the new name python-app
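To confirm the rename took effect in both generated files, you can optionally search them from the terminal; every match should now read python-app.
# Optional sanity check: no dockerfile-image references should remain.
grep -nE "python-app|dockerfile-image" skaffold.yaml deployment.yaml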
5. Walking through the development process
With the business logic added, you can now deploy and test your application. The following section will showcase the use of the Cloud Code plugin. Among other things, this plugin integrates with Skaffold to streamline your development process. When you deploy to GKE in the following steps, Cloud Code and Skaffold will automatically build your container image, push it to a container registry, and then deploy your application to GKE. This happens behind the scenes, abstracting the details away from the developer flow.
Add Kubernetes Cluster
- Add a Cluster
- Select Google Kubernetes Engine:
- Select project.
- Select "python-cluster" that was created in the initial setup.
- The cluster now shows up in the Kubernetes clusters list under Cloud Code. Navigate and explore the cluster from here.
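If the cluster does not appear, or you want kubectl access from the workstation terminal, you can fetch credentials manually. This is a hedged alternative to the UI flow above; the location flag must match wherever the setup script created python-cluster.
# Point kubectl (and Cloud Code) at the lab cluster.
# Replace REGION (or use --zone for a zonal cluster) to match the setup script.
gcloud container clusters get-credentials python-cluster --region=REGION
kubectl config current-context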
Deploy to Kubernetes
- In the pane at the bottom of the IDE, select Cloud Code.
- In the panel that appears at the top, select Run on Kubernetes.
If prompted, select Yes to use the current Kubernetes context.
This command starts a build of the source code and then runs the tests. The build and tests will take a few minutes to run. These tests include unit tests and a validation step that checks the rules that are set for the deployment environment. This validation step is already configured, and it ensures that you get warning of deployment issues even while you're still working in your development environment.
- The first time you run the command, a prompt will appear at the top of the screen asking if you want to use the current Kubernetes context. Select "Yes" to accept and use the current context.
- Next, a prompt will be displayed asking which container registry to use. Press Enter to accept the default value provided.
- Select the "Output" tab in the lower pane to view progress and notifications. Using dropdown select "Kubernetes: Run/Debug"
- Select "Kubernetes: Run/Debug - Detailed" in the channel drop down to the right to view additional details and logs streaming live from the containers
When the build and tests are done, the URL http://localhost:8080 is listed in the "Kubernetes: Run/Debug" view of the Output tab.
- In the Cloud Code terminal, hover over the first URL in the output (http://localhost:8080), and then in the tooltip that appears, select Open Web Preview.
- A new browser tab will open and display the message:
Hello, World!
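As an optional check, you can also hit the forwarded port from the workstation terminal while the Run on Kubernetes session is active.
# The forwarded service answers on localhost:8080.
curl http://localhost:8080
# Expected response: Hello, World!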
Hot Reload
- Open the app.py file
- Change the greeting message to Hello from Python
Notice immediately that in the Output window, Kubernetes: Run/Debug view, the watcher syncs the updated files with the container in Kubernetes:
Update initiated
Build started for artifact python-app
Build completed for artifact python-app
Deploy started
Deploy completed
Status check started
Resource pod/python-app-6f646ffcbb-tn7qd status updated to In Progress
Resource deployment/python-app status updated to In Progress
Resource deployment/python-app status completed successfully
Status check succeeded
...
- If you switch to the Kubernetes: Run/Debug - Detailed view, you will notice it recognizes the file changes, then builds and redeploys the app:
files modified: [app.py]
Syncing 1 files for gcr.io/veer-pylab-01/python-app:3c04f58-dirty@sha256:a42ca7250851c2f2570ff05209f108c5491d13d2b453bb9608c7b4af511109bd
Copying files: map[app.py:[/app/app.py]] to gcr.io/veer-pylab-01/python-app:3c04f58-dirty@sha256:a42ca7250851c2f2570ff05209f108c5491d13d2b453bb9608c7b4af511109bd
Watching for changes...
[python-app] * Detected change in '/app/app.py', reloading
[python-app] * Restarting with stat
[python-app] * Debugger is active!
[python-app] * Debugger PIN: 744-729-662
- Refresh your browser tab where you saw previous results to see the updated results.
Debugging
- Go to the Debug view and stop the current thread. If it asks, you can choose to clean up after each run.
- Click on Cloud Code in the bottom menu and select Debug on Kubernetes to run the application in debug mode.
- In the Kubernetes Run/Debug - Detailed view of the Output window, notice that Skaffold will deploy this application in debug mode.
- The first time this is run, a prompt will ask where the source is inside the container. This value corresponds to the WORKDIR directory in the Dockerfile.
Press Enter to accept the default.
It will take a couple of minutes for the application to build and deploy. If the debug session is disconnected, re-run steps to "Debug on Kubernetes" from the "Development Sessions" section.
- When the process completes, you'll notice a debugger is attached, and the Output tab says Attached debugger to container "python-app-8476f4bbc-h6dsl" successfully. The URL http://localhost:8080 is also listed.
Port forwarding pod/python-app-8bd64cf8b-cskfl in namespace default, remote port 5678 -> http://127.0.0.1:5678
- The bottom status bar changes its color from blue to orange indicating that it is in Debug mode.
- In the Kubernetes Run/Debug view, notice that a debuggable container is started:
**************URLs*****************
Forwarded URL from service python-app: http://localhost:8080
Debuggable container started pod/python-app-8bd64cf8b-cskfl:python-app (default)
Update succeeded
***********************************
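For context, the Debug on Kubernetes action is driven by Skaffold's debug mode. A rough CLI equivalent is sketched below; it is not a lab step, and the default repo shown is an assumption based on the registry used earlier in the output.
# Approximate CLI equivalent of Cloud Code's "Debug on Kubernetes".
# Assumes skaffold.yaml is in the current directory and kubectl targets python-cluster.
skaffold debug --port-forward --default-repo=gcr.io/${PROJECT_ID}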
Utilize Breakpoints
- Open the app.py file
- Locate the statement which reads return message
- Add a breakpoint to that line by clicking the blank space to the left of the line number. A red indicator will show to note the breakpoint is set
- Reload your browser and note the debugger stops the process at the breakpoint and allows you to investigate the variables and state of the application which is running remotely in GKE
- Click down into the VARIABLES section
- Click Locals; there you'll find the "message" variable.
- Double-click the variable name "message" and, in the popup, change the value to something different, like "Greetings from Python".
- Click the Continue button in the debug control panel
- Review the response in your browser which now shows the updated value you just entered.
- Stop the "Debug" mode by pressing the stop button, and remove the breakpoint by clicking on the breakpoint again.
6. Cleanup
Congratulations! In this lab you've created a new Python application from scratch and configured it to work effectively with containers. You then deployed your application to a remote GKE cluster and debugged it, following the same developer flow found in traditional application stacks.
To clean up after completing the lab:
- Delete the files used in the lab
cd ~ && rm -rf ~/music-service
- Delete the project to remove all related infrastructure and resources, as sketched below.
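If you created a dedicated project for this lab, the simplest cleanup is to delete the whole project from Cloud Shell. Double-check that PROJECT_ID points at the disposable lab project before running this.
# Permanently deletes the project and all resources in it.
gcloud projects delete ${PROJECT_ID}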
—
Last update: 3/22/23