1. Overview
In this codelab, you'll learn how to deploy the Pet Passport app, an AI agent that uses the Model Context Protocol (MCP) to combine data analysis and location services.
The app helps users plan a perfect day out with their dog based on breed popularity in New York City. The agent uses a "Macro-to-Micro" reasoning chain:
- Strategic Discovery (BigQuery): Identifies the NYC Zip Code with the highest population for a specific breed.
- Local Execution (Maps): Uses that Zip Code as a location bias to find "pet friendly cafes" and "dog parks".
- Itinerary Generation: Combines the data to create a "Pet Passport" itinerary with clickable links and images.
The agent is built using the google-adk framework and powered by Gemini.
Note: The complete project code, including the frontend UI, is available on GitHub. In this codelab, we will focus on the core agent logic and infrastructure setup.
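Before diving into setup, here is the Macro-to-Micro chain sketched as plain Python with stubbed data. This is purely illustrative: the function names and the popularity numbers are hypothetical, and in the real agent these questions are answered by the BigQuery and Maps MCP servers.

```python
# Hypothetical sketch of the Macro-to-Micro chain with stubbed data;
# the real agent delegates each step to an MCP server.
def find_top_zip(breed: str):
    # Strategic Discovery: stand-in for the BigQuery popularity query.
    popularity = {"Golden Retriever": {"10021": 412, "10024": 389}}
    counts = popularity.get(breed, {})
    return max(counts, key=counts.get) if counts else None

def find_places(zip_code: str):
    # Local Execution: stand-in for Maps searches biased to the zip code.
    return [f"pet friendly cafe near {zip_code}", f"dog park near {zip_code}"]

def build_itinerary(breed: str):
    # Itinerary Generation: combine both results into one "passport".
    zip_code = find_top_zip(breed)
    return {"breed": breed, "zip": zip_code, "stops": find_places(zip_code)}
```

The key design point is the ordering: the zip code produced by the first step is an input to the second, which is what makes the chain "Macro-to-Micro."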
2. Setup and requirements
First, let's make sure your development environment is set up correctly.
1. Authenticate with Google Cloud
Set your active Google Cloud project and authenticate. This is required for the agent to access BigQuery and other services.
gcloud config set project [YOUR-PROJECT-ID]
gcloud auth application-default login --project [YOUR-PROJECT-ID]
Note: If you encounter errors about a different project during authentication, you can bypass it by disabling the quota project and setting it manually:
gcloud auth application-default login --disable-quota-project
gcloud auth application-default set-quota-project [YOUR-PROJECT-ID]
2. Software Requirements
You need to have the following software installed on your local machine:
- Python (version 3.13 or higher is required)
- Git (to download the repository)
Download the Repository
The code for this project is available in the Google MCP repository. Clone the repository and navigate to the project folder:
git clone https://github.com/google/mcp.git
cd mcp/examples/petpassport
3. Installation
Now that you have the files, let's set up the Python environment.
- Create a virtual environment: This keeps your dependencies isolated.
python3 -m venv .venv
- Activate the virtual environment:
- On Linux/macOS:
source .venv/bin/activate
- On Windows:
.venv\Scripts\activate
- Install dependencies:
pip install google-adk==1.28.0 python-dotenv google-genai pillow uvicorn
Enable the Cloud APIs
Enable the following APIs in your project:
gcloud services enable \
  bigquery.googleapis.com \
  aiplatform.googleapis.com \
  artifactregistry.googleapis.com \
  cloudbuild.googleapis.com \
  run.googleapis.com \
  storage.googleapis.com
Choose a region
Set the region as an environment variable in your shell:
export REGION=us-central1
4. Acquire API Keys
To use the Maps and Gemini services, you need to acquire API keys and store them in a .env file in the project root.
1. Google Maps API Key
- Go to the Google Cloud Console.
- Navigate to APIs & Services > Credentials.
- Click Create Credentials > API key.
- Copy the generated key and add it to your `.env` file as `MAPS_API_KEY=[YOUR_KEY]`.
- (Recommended) Restrict the key to allow only the Maps APIs used by the MCP server.
2. Gemini API Key (AI Studio)
- Go to Google AI Studio.
- Click Get API key or navigate to the API keys section.
- Click Create API key.
- Copy the key and add it to your `.env` file as `GEMINI_API_KEY=[YOUR_KEY]`.
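After both steps, your `.env` file in the project root should contain entries of this shape (values are placeholders; `GOOGLE_CLOUD_PROJECT` is included because the tools code reads it, but locally it can also come from your gcloud configuration):

```
MAPS_API_KEY=[YOUR_KEY]
GEMINI_API_KEY=[YOUR_KEY]
GOOGLE_CLOUD_PROJECT=[YOUR-PROJECT-ID]
```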
5. Install Dependencies
Create a requirements.txt file in the petpassport/ folder. Note that uvicorn is included so the server can run both locally and in the Cloud Run container:
google-adk==1.28.0
python-dotenv
google-genai
pillow
uvicorn
6. Authenticate MCP Servers
This application relies on Model Context Protocol (MCP) servers to interact with Google Maps and BigQuery. To authenticate these servers, you need to configure the appropriate environment variables and headers.
- Google Maps MCP: Requires a valid Maps API key passed in the `X-Goog-Api-Key` header.
- BigQuery MCP: Requires OAuth credentials with access to the BigQuery service. The agent uses the default compute service account when running on Cloud Run, or your local credentials when running locally.
We provide a setup script setup/setup_env.sh in the repository that helps configure these variables in your .env file.
7. Creating the BigQuery Table
Before the agent can query dog license data, we need to create the dataset and table in BigQuery and load the data.
We provide a setup script setup/setup_bigquery.sh that performs the following steps:
- Creates a Cloud Storage bucket named `pet-passport-data-[PROJECT_ID]` to store the raw data.
- Downloads the public NYC Dog Licensing dataset (CSV).
- Uploads the CSV to the bucket.
- Creates a BigQuery dataset named `nyc_dogs`.
- Loads the data from the bucket into a table named `licenses` in the dataset.
To run the setup script, execute the following command in your terminal:
bash setup/setup_bigquery.sh
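To sanity-check the load, you can run a query of the shape the agent's Strategic Discovery step will use. The sketch below is a hypothetical helper: the table name comes from the setup script above, but the column names `BreedName` and `ZipCode` are assumptions based on the public NYC Dog Licensing CSV schema.

```python
# Hypothetical helper that builds the Strategic Discovery SQL.
# Table name comes from setup_bigquery.sh; column names (BreedName,
# ZipCode) are assumptions about the NYC Dog Licensing schema.
def top_zip_query(project_id: str, breed: str) -> str:
    return (
        "SELECT ZipCode, COUNT(*) AS licenses "
        f"FROM `{project_id}.nyc_dogs.licenses` "
        f"WHERE BreedName = '{breed}' "
        "GROUP BY ZipCode ORDER BY licenses DESC LIMIT 1"
    )
```

You can paste the generated SQL into the BigQuery console to confirm the table loaded and returns a sensible top zip code for a common breed.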
8. Connect to MCP Servers
A key part of this app is using MCP to connect to data and services. In this section, you'll configure the MCP toolsets for BigQuery and Google Maps in a file called petpassport/tools.py.
Complete tools.py Code
Here is the full implementation for tools.py, including MCP toolsets and custom tools for image and data persistence. We have optimized this code to reduce redundancies by moving the bucket resolution to the module level:
import os
import dotenv
import google.auth
import google.auth.transport.requests
import time
import datetime
from google.cloud import storage
from PIL import Image
from google import genai
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
from google.adk.tools.mcp_tool.mcp_session_manager import StreamableHTTPConnectionParams

MAPS_MCP_URL = "https://mapstools.googleapis.com/mcp"
BIGQUERY_MCP_URL = "https://bigquery.googleapis.com/mcp"
PROJECT_ID = os.getenv('GOOGLE_CLOUD_PROJECT', 'project_not_set')
BUCKET_NAME = f"pet-passport-data-{PROJECT_ID}"
def get_maps_mcp_toolset():
    dotenv.load_dotenv()
    maps_api_key = os.getenv('MAPS_API_KEY', 'no_api_found')
    tools = MCPToolset(
        connection_params=StreamableHTTPConnectionParams(
            url=MAPS_MCP_URL,
            headers={
                "X-Goog-Api-Key": maps_api_key
            },
            timeout=30.0,
            sse_read_timeout=300.0
        )
    )
    print("Maps MCP Toolset configured.")
    return tools

def get_bigquery_mcp_toolset():
    credentials, project_id = google.auth.default(
        scopes=["https://www.googleapis.com/auth/bigquery"]
    )
    # Refresh to obtain a short-lived OAuth access token for the headers.
    credentials.refresh(google.auth.transport.requests.Request())
    oauth_token = credentials.token
    HEADERS_WITH_OAUTH = {
        "Authorization": f"Bearer {oauth_token}",
        "x-goog-user-project": project_id
    }
    tools = MCPToolset(
        connection_params=StreamableHTTPConnectionParams(
            url=BIGQUERY_MCP_URL,
            headers=HEADERS_WITH_OAUTH,
            timeout=30.0,
            sse_read_timeout=300.0
        )
    )
    print("BigQuery MCP Toolset configured.")
    return tools
def generate_pet_passport_photo(prompt: str, image_path: str = None) -> str:
    """Generates an image using gemini-3.1-flash-image-preview based on a prompt and an optional reference image."""
    client = genai.Client()
    output_path = f"/tmp/pet_passport_{int(time.time())}.png"
    try:
        # Attach the reference image only when one was provided.
        contents = [prompt]
        if image_path:
            contents.append(Image.open(image_path))
        response = client.models.generate_content(
            model="gemini-3.1-flash-image-preview",
            contents=contents,
        )
        for part in response.parts:
            if part.inline_data is not None:
                generated_image = part.as_image()
                generated_image.save(output_path)
                # Upload to GCS and generate a signed URL so the frontend
                # can display the image even after a server restart.
                try:
                    storage_client = storage.Client()
                    bucket = storage_client.bucket(BUCKET_NAME)
                    blob_name = os.path.basename(output_path)
                    blob = bucket.blob(blob_name)
                    blob.upload_from_filename(output_path)
                    url = blob.generate_signed_url(
                        version="v4",
                        expiration=datetime.timedelta(hours=24),
                        method="GET",
                    )
                    return url
                except Exception as e:
                    print(f"Error uploading image to GCS: {e}")
                    return output_path
        raise ValueError("No image was returned by the model.")
    except Exception as e:
        print(f"Error generating image: {e}")
        raise
def save_pet_passport(user_id: str, breed: str, postal_code: str, route_details: str, image_paths: list[str] = None) -> str:
    """Appends the generated itinerary to the user's history in GCS."""
    try:
        storage_client = storage.Client()
        bucket = storage_client.bucket(BUCKET_NAME)
        blob = bucket.blob(f"user-{user_id}.json")
        # Download existing or start fresh
        # ... (Implementation details hidden for brevity) ...
        return "Success"
    except Exception as e:
        print(f"Error saving path: {e}")
        raise
Code Explanation: tools.py
- `get_maps_mcp_toolset` and `get_bigquery_mcp_toolset` configure the MCP clients with the correct endpoints and authentication headers.
- `generate_pet_passport_photo` uses Gemini to create a scene and uploads the result to Google Cloud Storage, returning a signed URL so the image link survives server restarts.
9. Creating the Agent
With your tools configured, it's time to build the agent's "brain." You will use the Agent Development Kit (ADK) to create an agent in a file called petpassport/agent.py.
Complete agent.py Code
Here is the full implementation for agent.py, where we define the agent and its instructions:
import os
import dotenv
import tools
from google.adk.agents import LlmAgent

dotenv.load_dotenv()
PROJECT_ID = os.getenv('GOOGLE_CLOUD_PROJECT', 'project_not_set')

maps_toolset = tools.get_maps_mcp_toolset()
bigquery_toolset = tools.get_bigquery_mcp_toolset()

root_agent = LlmAgent(
    model='gemini-2.5-pro',
    name='root_agent',
    instruction=f"""
You are the Pet Passport Agent. Your goal is to help users find a fun walking route for their dog in NYC.
When given a breed and a postal code, follow this flow:
1. **Strategic Discovery:** Use BigQuery to find the most popular neighborhood for that breed in NYC.
2. **Local Execution:** Use Maps to build a walking route with specific places (parks, cafes) in that area.
**NO DIRECTIONS LINKS:** You must NOT include a Google Maps directions link (e.g., `https://www.google.com/maps/dir/...`) in your final response. Only provide links to individual places.
After generating the itinerary, you MUST call the `save_pet_passport` tool to save this path to the user's profile. Pass a clean summary of the itinerary as `route_details`. The summary should include details (like rating, description from maps).
""",
    tools=[maps_toolset, bigquery_toolset, tools.generate_pet_passport_photo, tools.save_pet_passport]
)
Code Explanation: agent.py
- We import `tools` directly (flattened structure) to support the container environment.
- The agent is initialized with `gemini-2.5-pro`.
- The instructions define a strict multi-step chain of thought (BigQuery first, then Maps) and explicitly forbid directions links, which would clutter the response.
10. Running the Application Locally
Before deploying to Cloud Run, it's a good idea to test the application locally.
- Ensure you are in the project directory:
cd examples/petpassport
- Start the FastAPI server: We use `uvicorn` to run the app. The entry point is `main.py` inside the `petpassport` folder.
uvicorn petpassport.main:app --reload
- Open the UI: Navigate to `http://127.0.0.1:8000/ui/` in your browser to interact with the Pet Passport interface.
11. Deploying to Cloud Run
With your agent ready, it's time to deploy it to Cloud Run. We use the standard gcloud command directly to maintain strict control over the container environment.
From the project directory, run the following command:
gcloud run deploy petpassport \
  --source petpassport \
  --region $REGION \
  --allow-unauthenticated \
  --labels dev-tutorial=google-mcp
Configure Environment Variables
After deployment, navigate to the Cloud Run service in the Google Cloud Console and set the following environment variables under the Variables & Secrets tab:
- `MAPS_API_KEY`: Your Google Maps API key.
- `GOOGLE_CLOUD_PROJECT`: Your project ID.
- `PROJECT_ID`: Your project ID (redundancy supported for legacy modules).
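Why set the project ID twice? A module that must work with either variable can resolve it with a fallback chain like this sketch (`resolve_project_id` is a hypothetical helper name; the variable names and the `project_not_set` sentinel come from this codelab's tools.py):

```python
import os

# Prefer the standard GOOGLE_CLOUD_PROJECT variable, fall back to the
# legacy PROJECT_ID variable, then to a sentinel (mirroring tools.py).
def resolve_project_id() -> str:
    return (
        os.getenv("GOOGLE_CLOUD_PROJECT")
        or os.getenv("PROJECT_ID")
        or "project_not_set"
    )
```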
12. Sample Prompts
Try interacting with the deployed agent using these prompts:
- Standard: "I want to go for a walk with my Golden Retriever in NYC near 10021. Find a route for us that has a cafe."
- Different Breed: "I have a French Bulldog and we are in the Upper West Side (near 10024). Suggest a short walk that stops at a popular dog park."
- With Image: (Upload a photo of your dog) "Here is a photo of my Corgi! We are near 10013. Plan a perfect day out for us."
13. Clean up
To avoid incurring charges for the resources used in this tutorial:
- Delete the Cloud Run service:
gcloud run services delete petpassport --region=$REGION
- Delete the GCS bucket:
gcloud storage rm -r gs://pet-passport-data-$PROJECT_ID