Multi-Cloud Data Streaming

1. Introduction


Last Updated: 2022-02-08

Why multi-cloud streaming?

Businesses are actively looking to spread their workloads across multiple cloud platforms, since adopting multiple cloud providers into an operational workflow offers several advantages:

  • Avoiding vendor lock-in
  • Mitigating business continuity risk
  • Reducing dependency on a single technology stack

What you'll build

In this codelab, you're going to set up multi-cloud data streaming between AWS and GCP leveraging Confluent Cloud. You will:

  • Set up a Dedicated Kafka cluster in Confluent Cloud on GCP.
  • Set up VPC peering between Confluent Cloud and your GCP project so the cluster can be accessed over a private network.
  • Set up a VPN tunnel between your AWS and GCP projects so the resources on both hyperscalers can connect.
  • Access Confluent Kafka from AWS and set up a streaming pipeline from AWS to GCP leveraging Confluent Cloud.

What you'll learn

  • How to set up VPC peering between Confluent Cloud Kafka and your GCP project.
  • How to set up a VPN tunnel between AWS and GCP projects.
  • How to access GCP-hosted Confluent Cloud from your AWS project.

This codelab is focused on building multi-cloud data streaming. Non-relevant concepts and code blocks are glossed over and are provided for you to simply copy and paste.

What you'll need

  • AWS project access.
  • GCP project access.
  • Experience with GCP and AWS.
  • A Confluent Kafka subscription from the GCP Marketplace.

2. Getting set up

Set up the VPN connection between AWS and GCP

  • This demo uses the default VPC (asia-southeast1) in GCP and ng-vpc-103-mum (ap-south-1) in AWS.



  • On AWS, create the Customer Gateway and the Virtual Private Gateway (VPG), and attach the VPG to the AWS VPC.



  • Create the Site-to-Site VPN connection on AWS.

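The AWS console steps above can also be scripted. A minimal sketch with boto3 is shown below; the GCP gateway public IP (`203.0.113.10`), the VPC ID (`vpc-0abc1234`), and the BGP ASN are placeholders for your environment, and the function is only an illustration of the same sequence of calls, not part of the codelab itself.

```python
# Sketch of the AWS-side VPN setup with boto3 (pip install boto3); the
# public IP and VPC ID below are placeholders for your environment.

def vpn_connection_options(static_routes_only=True):
    """Options for a static-route Site-to-Site VPN connection."""
    return {"StaticRoutesOnly": static_routes_only}

def create_aws_vpn(gcp_gateway_ip="203.0.113.10", vpc_id="vpc-0abc1234"):
    """Call this with real values to run against your AWS account."""
    import boto3  # requires AWS credentials to be configured

    ec2 = boto3.client("ec2", region_name="ap-south-1")

    # 1. Customer Gateway pointing at the GCP VPN gateway's public IP.
    cgw = ec2.create_customer_gateway(
        Type="ipsec.1", PublicIp=gcp_gateway_ip, BgpAsn=65000
    )["CustomerGateway"]

    # 2. Virtual Private Gateway, attached to the AWS VPC.
    vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]
    ec2.attach_vpn_gateway(VpnGatewayId=vgw["VpnGatewayId"], VpcId=vpc_id)

    # 3. Site-to-Site VPN connection with static routing, as in this codelab.
    vpn = ec2.create_vpn_connection(
        Type="ipsec.1",
        CustomerGatewayId=cgw["CustomerGatewayId"],
        VpnGatewayId=vgw["VpnGatewayId"],
        Options=vpn_connection_options(),
    )
    return vpn["VpnConnection"]["VpnConnectionId"]
```

Static routing is used here to match the codelab's route-based tunnel; a BGP-based setup would omit `StaticRoutesOnly` and rely on the ASN exchange instead.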

  • Download the configuration file for the tunnel created in AWS, selecting:
  • Vendor: Generic
  • Platform: Generic
  • Software: Vendor Agnostic
  • IKE version: IKEv2
  • Create the VPN Gateway and the VPN Tunnel on GCP.


Provide the IP addresses and the IKEv2 pre-shared keys from the downloaded AWS configuration file for both tunnels.

  • Once complete, the tunnel should be up and running on both AWS and GCP.



The tunnel setup is now complete.

  • Select a CIDR block that will be used to configure Confluent Cloud, and add it to the VPN tunnel as a static route on AWS.
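Before committing to a CIDR block for Confluent Cloud, it is worth checking that it does not overlap any range already in use on either side of the tunnel. A small standard-library sketch (the candidate and in-use ranges below are illustrative, not values from this codelab):

```python
import ipaddress

def overlaps_any(candidate, in_use):
    """True if the candidate CIDR overlaps any range already in use."""
    cand = ipaddress.ip_network(candidate)
    return any(cand.overlaps(ipaddress.ip_network(block)) for block in in_use)

# Illustrative ranges: a GCP default-VPC subnet and a default AWS VPC CIDR.
existing = ["10.148.0.0/20", "172.31.0.0/16"]

print(overlaps_any("10.1.0.0/16", existing))    # False: safe to use
print(overlaps_any("172.31.64.0/24", existing)) # True: clashes with the AWS VPC
```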


Confluent Kafka on the GCP Marketplace

  • Subscribe to Confluent Kafka from the GCP Marketplace.


  • Log in to Confluent Cloud and create a Dedicated cluster.



  • Provide the GCP Project ID along with VPC Network details for peering.


  • Complete the VPC peering on the GCP side with the provided Confluent Cloud cluster network details.


  • The Confluent Cloud cluster is now active and peered with GCP.


  • Test the connectivity to the Confluent cluster from GCP.
  • Provision a GCE instance and install Python 3.
  • Generate the Python client API key in Confluent Cloud.


  • Run the test script "" from the GCP instance:


  • Now, download the example codebase on the AWS instance and execute the "" script to test the hybrid connectivity to Confluent Cloud from AWS.
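The connectivity test from both instances follows the same shape; since the example script itself is not reproduced here, below is a hedged sketch of what such a producer check might look like with the `confluent-kafka` Python package. The bootstrap server, topic name, and API key/secret are placeholders: substitute the values from your cluster and the client key generated earlier.

```python
# Hedged sketch of a connectivity-test producer; not the codelab's exact
# script. Bootstrap server, topic, and credentials are placeholders.

def client_config(bootstrap, api_key, api_secret):
    """Confluent Cloud client settings: SASL_SSL with PLAIN credentials."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": api_key,
        "sasl.password": api_secret,
    }

def produce_test_message(conf, topic="test-topic"):
    """Call this from the AWS (or GCP) instance to verify connectivity."""
    from confluent_kafka import Producer  # pip install confluent-kafka

    producer = Producer(conf)
    producer.produce(topic, key="k1", value="hello over the VPN")
    # flush() returns the number of still-undelivered messages; 0 means
    # every message reached the cluster within the timeout.
    return producer.flush(10)
```

If the VPN tunnel, static route, and VPC peering are all in place, the producer on the AWS instance reaches the Confluent Cloud brokers over their private endpoints; a timeout here usually points at a missing route or peering step.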


3. Congratulations

Congratulations, you've successfully built your multi-cloud cross-regional Confluent Kafka streaming platform between AWS & GCP.

Helpful Codelabs

Check out some of these codelabs...

Further reading