The Internet of Things (IoT), the embedding of connected computing into everyday devices, has already emerged as a major trend and is projected to continue expanding rapidly in the coming years. However, basic requirements such as maintaining security, managing a large-scale deployment and handling the data ingestion needs of many distributed devices remain significant challenges. Google Cloud's IoT Core simplifies these tasks, making it easier to focus on gaining value and insight from an IoT project.

What you will build

In this codelab, you are going to build a data pipeline that starts with an Internet of Things (IoT) device capturing heart rate, uses IoT Core to securely publish the data to a message queue, and then transports it into a data warehouse. A Raspberry Pi with a heart rate sensor will be used as the IoT device, and several components of the Google Cloud Platform will form the data pipeline. Building out the Raspberry Pi, while beneficial, is an optional portion of this codelab; the streaming data can instead be simulated with a script.

After completing the steps in this codelab, you will have a streaming data pipeline feeding a data warehouse where it can be retrieved to graphically display heart rate data over time.

What you'll learn

What you'll need

If you want to build the IoT sensor portion of this codelab instead of leveraging sample data and a script, you'll also need the following (which can be ordered as a complete kit or as individual parts here)...

In addition, having access to a computer monitor or TV with HDMI input, HDMI cable, keyboard and a mouse is assumed.

Self-paced environment setup

If you don't already have a Google account (Gmail or G Suite), you must create one. Either way, make sure to take advantage of the $300 free trial!

Sign in to the Google Cloud Platform console. You can use the default project ("My First Project") for this lab, or you can choose to create a new project via the Manage resources page. The project ID needs to be a unique name across all Google Cloud projects (the one shown below has already been taken and won't work for you). Take note of your project ID (i.e. Your project ID will be _____) as it will be needed later.

Running through this codelab shouldn't cost more than a few dollars, but it could be more if you decide to use more resources or if you leave them running. Make sure to go through the Cleanup section at the end of the codelab.

Enable APIs

In this codelab, we will be using Pub/Sub, Dataflow, Compute Engine and IoT Core, so we'll need to enable those APIs.

Select APIs & Services from the Cloud Console menu.

Click on Enable APIs and Services

In the Search Bar, type in PubSub

Click on the resulting tile that says "Cloud Pub/Sub API" and, on the following page, "Enable API"

Repeat this process for Dataflow, Compute Engine and IoT Core

BigQuery is a serverless, highly scalable, low-cost enterprise data warehouse, making it an ideal option for storing data streamed from IoT devices.

Let's create a table that will hold all of the IoT heart rate data. Select BigQuery from the Cloud console. This will open BigQuery in a new window (don't close the original window as you'll need to access it again).

Click on the down arrow icon next to your project name and then select "Create new dataset"

Enter "heartRateData" for the Dataset, select a location where it will be stored and Click "OK"

Click the "+" sign next to your Dataset to create a new table

For Source Data, select Create empty table. For Destination table name, enter heartRateDataTable. Under Schema, click the Add Field button until there are a total of 4 fields. Fill in the fields as shown below making sure to also select the appropriate Type for each field. When everything is complete, click on the Create Table button.
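The same table can be described in BigQuery's JSON schema form, which is handy if you later script the table creation. The sketch below is illustrative only: timecollected and heartrate match the columns queried later in this codelab, while the other field names are placeholders for the fields shown in the screenshot.

```python
import json

# Hypothetical schema for heartRateDataTable in BigQuery's JSON schema
# format. "sensorID" and "sensorType" are placeholder names; use the
# field names and types shown in the screenshot above.
schema = [
    {"name": "sensorID", "type": "STRING", "mode": "NULLABLE"},
    {"name": "sensorType", "type": "STRING", "mode": "NULLABLE"},
    {"name": "timecollected", "type": "TIMESTAMP", "mode": "NULLABLE"},
    {"name": "heartrate", "type": "FLOAT", "mode": "NULLABLE"},
]

print(json.dumps(schema, indent=2))
```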

You should see a result like this...

You now have a data warehouse set up to receive your heart rate data.

Cloud Pub/Sub is a simple, reliable, scalable foundation for streaming data and event-driven computing systems. As a result, it is perfect for handling incoming IoT messages and then allowing downstream systems to process them, which is why it is tightly linked with Cloud IoT Core.

If you are still in the window for BigQuery, switch back to the Cloud Console. If you closed the Cloud Console, reopen it in your browser.

From the Cloud Console, select Pub/Sub and then Topics.

If you see an Enable API prompt, click the Enable API button.

Click on the Create a topic button

Enter "heartratedata" as the topic name and then click Create

You should see the newly created topic

You now have a Pub/Sub topic to publish IoT messages to. Let's create a subscription to allow other Google Cloud services to access those messages by clicking on the three dot menu to the right of the topic that was created and then on New Subscription.

Enter "heartratedata" as the subscription name, leave the delivery type as "Pull" and then click Create

You now have a Pub/Sub subscription to pull IoT messages from.
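Each Pub/Sub message is just a byte payload. For this pipeline, the payload is a JSON object whose keys mirror the BigQuery column names, since the Dataflow Pub/Sub-to-BigQuery template (used in a later step) maps JSON keys to table columns. The sketch below shows one plausible message shape; the field names are assumptions matching the earlier schema example, not the exact output of the project's script.

```python
import json
from datetime import datetime, timezone

# Build a telemetry message whose JSON keys mirror the BigQuery columns
# (field names assumed for illustration).
message = {
    "sensorID": "raspberryHeartRate",  # hypothetical device name
    "timecollected": datetime.now(timezone.utc).isoformat(),
    "heartrate": 72.5,
}

# Pub/Sub payloads are raw bytes, so the JSON is encoded before publishing.
payload = json.dumps(message).encode("utf-8")
print(payload)
```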

Cloud computing has made fully managed models of computing possible, where processing logic can be created without provisioning servers. For this lab, a Dataflow template will be used to create a process that monitors a Pub/Sub topic for incoming messages and then moves them into BigQuery.

Dataflow will need a location to store temporary files, so we will provide a location in Google Cloud Storage. From the Cloud Console, select Storage and then Browser.

Click the Create Bucket button

Choose a name for the storage bucket (it must be a name that is globally unique across all of Google Cloud). Select Regional and make sure it is in the same region as the rest of your project's services. Click on the Create button

From the Cloud Console, select Dataflow

Click on Create Job from Template. Give the job a name and select the PubSub to BigQuery template. Make sure the Regional endpoint matches where the rest of your project resources are located. Fill in the remaining fields, making sure they align with the names of your storage bucket, Pub/Sub topic, and BigQuery dataset and table. Click on Run Job.
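Behind the Console form, the template run boils down to a few parameter values built from the resources created earlier. The sketch below shows those values using the parameter names documented for the Pub/Sub-to-BigQuery template ("myproject" and "my-bucket" are placeholders for your project ID and storage bucket).

```python
# Parameter values for the PubSub to BigQuery Dataflow template,
# assembled from the resources created in earlier steps.
# "myproject" and "my-bucket" are placeholders.
project = "myproject"

params = {
    # Full resource path of the Pub/Sub topic created earlier
    "inputTopic": f"projects/{project}/topics/heartratedata",
    # Destination table in <project>:<dataset>.<table> form
    "outputTableSpec": f"{project}:heartRateData.heartRateDataTable",
    # Temporary file location in the Cloud Storage bucket created above
    "tempLocation": "gs://my-bucket/tmp",
}

for name, value in params.items():
    print(f"{name} = {value}")
```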

Your Dataflow job has started.

Congratulations! You just connected Pub/Sub to BigQuery via Dataflow.

Cloud IoT Core is a fully managed service that allows you to easily and securely connect, manage, and ingest data from millions of globally dispersed devices. Cloud IoT Core is the central service that, when combined with other services on Google Cloud IoT platform, provides a complete solution for collecting, processing, analyzing, and visualizing IoT data in real time.

From the Cloud Console, select Cloud IoT Core.

If you see an Enable API prompt, click the Enable API button.

Click on Create a device registry.

Enter "heartrate" as the Registry ID, select a Region close to you, disable the HTTP protocol (since we are only using the MQTT protocol, limiting access is a good security practice) and select the Pub/Sub topic that was created in a previous step. Click on the Create button.
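Devices talk to this registry over the MQTT bridge using well-defined identifier formats: the MQTT client ID is the device's full resource path, and telemetry is published to /devices/<device-id>/events (these formats come from the Cloud IoT Core MQTT bridge documentation). A small sketch, with placeholder project and region values:

```python
def mqtt_paths(project_id, region, registry_id, device_id):
    """Build the MQTT identifiers Cloud IoT Core expects.

    The client ID is the device's full resource path; telemetry
    messages are published to the /devices/<id>/events topic.
    """
    client_id = (f"projects/{project_id}/locations/{region}"
                 f"/registries/{registry_id}/devices/{device_id}")
    telemetry_topic = f"/devices/{device_id}/events"
    return client_id, telemetry_topic

# "myproject" and "us-central1" are placeholders for your own values.
client_id, topic = mqtt_paths("myproject", "us-central1",
                              "heartrate", "raspberryHeartRate")
print(client_id)
print(topic)
```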

The registry is now ready for devices to be added.

Assemble the Raspberry Pi and heart rate receiver

If there are more than 3 pins, trim the header down to only 3 pins. Solder the header pins to the sensor board.

Carefully install the hammer header pins into the Raspberry Pi.

Format the SD card and install the NOOBS (New Out Of Box Software) installer by following the steps here. Insert the SD card into the Raspberry Pi and place the Raspberry Pi into its case.

Use the breadboard wires to connect the heart rate receiver to the Raspberry Pi according to the diagram below.

Raspberry Pi pin     Receiver connection
Pin 16 (GPIO23)      <not labelled>
Pin 17 (3V3)         <not labelled>
Pin 20 (GND)


Connect the monitor (using the mini-HDMI connector), keyboard/mouse (with the USB hub) and finally, power adapter.

Configure the Raspberry Pi

After the Raspberry Pi finishes booting up, select Raspbian for the desired operating system, make certain your desired language is correct and then click on Install (hard drive icon on the upper left portion of the window).

Click on the Wifi icon (top right of the screen) and select a network. If it is a secured network, enter the password (pre shared key).

Click on the raspberry icon (top left of the screen), select Preferences and then Raspberry Pi Configuration. From the Interfaces tab, enable I2C and SSH. From the Localisation tab, set the Locale and the Timezone. After setting the Timezone, allow the Raspberry Pi to reboot.

After the reboot has completed, click on the Terminal icon to open a terminal window.

Install the necessary software

Make sure that all the software on the Raspberry Pi is up to date and that needed packages are installed

  sudo apt-get update
  sudo apt-get install git

Clone the project code for the heart rate receiver

  git clone
  cd iotcore-heartrate

Make sure the required core packages are all installed

  chmod +x

Create a security certificate

In order to communicate with Google Cloud, a security certificate must be generated and then registered with IoT core.

  chmod +x

To transfer the public key to your computer so that it can later be registered with IoT Core, use SFTP (secure file transfer protocol).

You now have a completed IoT heart rate sensor.

From the Cloud Console, return to IoT Core

Click the heartrate registry

Click Add device

For the Device ID, enter raspberryHeartRate. For public key format, select ES256. For authentication, either choose manual and then copy/paste the value from the key file into the Public key value area or choose upload and then upload the key from your computer. Click Add.

IoT Core is now ready to receive communication from the Raspberry Pi.

Data streaming from a Raspberry Pi

If you constructed a Raspberry Pi IoT heart rate sensor, put on the heart rate strap and then start the script that will receive the heartbeat signals, calculate heart rate in beats per minute and push the data to Google Cloud.
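The beats-per-minute calculation the script performs can be sketched simply: average the intervals between consecutive heartbeat timestamps and convert that interval to a per-minute rate. This is a minimal illustration of the idea, not the project script itself.

```python
def bpm_from_beats(beat_times):
    """Estimate heart rate in beats per minute from a list of
    heartbeat timestamps (in seconds, ascending)."""
    # Intervals between consecutive beats
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    # One beat every avg_interval seconds -> 60/avg_interval per minute
    return 60.0 / avg_interval

# Beats arriving once per second correspond to 60 bpm
print(bpm_from_beats([0.0, 1.0, 2.0, 3.0]))  # 60.0
```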

If you aren't in the /home/pi/iotcore-heartrate directory on the Raspberry Pi, move there first

  cd /home/pi/iotcore-heartrate

Start the heart rate script by changing the following to match your project, registry and device

  python --project_id=myproject --registry_id=myregistry --device_id=mydevice

You should see the terminal window echo the heart rate data results about every 10 seconds. With data flowing, you can skip to the next section (Check that Data is Flowing).

You now have a completed IoT heart rate sensor that is streaming data to Google Cloud.

Simulated data streaming

If you didn't build the IoT heart rate sensor, you can simulate data streaming by using a sample dataset, reading it with a Python script running on a VM in Compute Engine and feeding it into IoT Core.
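The actual simulation script replays readings from a sample dataset, but the core idea can be sketched as a generator of plausible heart rate values that drift over time within physiological bounds:

```python
import random

def simulate_heartrate(n, seed=42):
    """Generate n plausible heart rate readings that drift gradually,
    clamped to a physiological range. (The codelab's script replays a
    sample dataset instead of generating random values.)"""
    rng = random.Random(seed)  # seeded for reproducibility
    rate = 75.0
    readings = []
    for _ in range(n):
        # Small random drift, clamped to 45-180 bpm
        rate = min(180.0, max(45.0, rate + rng.uniform(-2.0, 2.0)))
        readings.append(round(rate, 1))
    return readings

print(simulate_heartrate(5))
```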

From the Cloud Console menu, select Compute Engine.

Click on Create

Keep the default options, but make certain that the Region matches where you have chosen to place your other services.

Once the VM has started, click the SSH link to connect to the terminal of the virtual machine.

Install the necessary software

Make sure that all the software on the VM is up to date and that needed packages are installed

  sudo apt-get update
  sudo apt-get install git

Clone the project code for the heart rate receiver, which contains the script to simulate heart rate data along with the sample dataset

  git clone
  cd iotcore-heartrate

Make sure the required core packages are all installed

  chmod +x

Create a security certificate

In order to communicate with Google Cloud, a security certificate must be generated and then registered with IoT core.

  chmod +x

View the contents of the security key

  cat ../.ssh/ec_public.pem

Highlight the contents of the key -- this will automatically copy the contents.

Keep the VM terminal open. Return to the Cloud Console and go to IoT Core.

Click on the existing Registry.

Add a new device

Give it a name (e.g. myVM) and select ES256 for the public key format. Paste the value of the key into the Public key value window. Click the Add button.

You are now ready to receive data from the simulation script in the VM.

Run the simulation script

Return to the terminal window for your VM. Run the simulation script, but replace the values below with ones that match your project and IoT Core registry settings.

  python --project_id=myproject --registry_id=myregistry --device_id=myVM --private_key_file=../.ssh/ec_private.pem

You should see the data being sent via MQTT from your VM terminal.

BigQuery data

Check to make sure that data is flowing into the BigQuery table. From the Cloud Console, go to BigQuery.

Under the project name (on the left hand side of the window), click on the Dataset (heartRateData), then on the table (heartRateDataTable) and then click on the Query Table button

Add an asterisk to the SQL statement so it reads SELECT * FROM... then add "ORDER BY timecollected ASC" as shown below and then click the RUN QUERY button

If prompted, click on the Run query button

If you see results, then data is flowing properly.

With data flowing, you are now ready to visualize the data.

Google Sheets can be used directly from BigQuery to easily visualize query results. From your query results, click on Save to Google Sheets.

When the results have been saved to Google Sheets, click on the link that says Click to View.

The data will open in Google Sheets.

Highlight the two columns that contain the timecollected and the heartrate. Then select Insert and Chart from the top menu.

If the chart appears as a histogram, use the Chart Editor on the right side of the screen to select the drop down options for Chart type.

Then select a line graph (top left option).

The chart should now display a visualization of heart rate over time

You've created an entire data pipeline! In doing so, you've used Google IoT Core to secure IoT devices and to allow data to flow into Google Pub/Sub, deployed Dataflow from a template and pushed data into BigQuery and then used the integration with Google Sheets to perform a quick data visualization.


Once you are done experimenting with the heart rate data pipeline, you can remove the running resources.

If you built the IoT sensor, shut it down. Hit Ctrl-C in the terminal window to stop the script and then type the following to power down the Raspberry Pi

  sudo shutdown -h now

Go to Dataflow, click on the link to heartrate-streaming and then click on the Stop Job button followed by selecting Cancel and Stop Job

Go to Pub/Sub, click on Topic, click on the checkbox next to the heartratedata topic and then click on Delete

Go to Storage, click on the checkboxes next to the storage buckets and then click on Delete

Go to BigQuery, click the down arrow next to your project name, click the down arrow to the right of the heartRateData dataset and then click on Delete dataset.

When prompted, type in the dataset ID (heartRateData) in order to finish deleting the data.

Go to IoT Core and click on the registry. Click on each device that was added and then click on Delete. Once all devices have been deleted, delete the registry.