Using IoT Core to Stream Heart Rate Data

The Internet of Things (IoT), the embedding of connected computing into everyday devices, has already emerged as a major trend and is projected to expand rapidly in the coming years. However, basic requirements such as maintaining security, managing large-scale deployments and handling the data ingestion needs of many distributed devices remain significant challenges. Google Cloud's IoT Core simplifies these tasks, making it easier to focus on gaining value and insight from an IoT project.

What you will build

In this codelab, you are going to build a data pipeline that starts with an Internet of Things (IoT) device that captures heart rate and leverages IoT Core to securely publish the data to a message queue, from which it is then transported into a data warehouse. A Raspberry Pi with a heart rate sensor serves as the IoT device, and several components of the Google Cloud Platform form the data pipeline. Building out the Raspberry Pi, while beneficial, is an optional portion of this codelab; the streaming data can instead be simulated with a script.

5e7f8392c1231de9.png

After completing the steps in this codelab, you will have a streaming data pipeline feeding a data warehouse where it can be retrieved to graphically display heart rate data over time.

ec0b257abfa3c900.png

What you'll learn

  • How to use Google IoT Core
  • How to utilize Google Pub/Sub
  • How to create a data pipeline with zero code using a Google Cloud Dataflow template
  • How to leverage Google BigQuery
  • How to visualize data using the Google Sheets integration with BigQuery

What you'll need

  • A Google Cloud Platform account. New users of Google Cloud Platform are eligible for a $300 free trial.

If you want to build the IoT sensor portion of this codelab instead of leveraging sample data and a script, you'll also need the following (which can be ordered as a complete kit or as individual parts here):

  • Raspberry Pi Zero W with power supply, SD memory card and case
  • USB card reader
  • USB hub (to allow for connecting a keyboard and mouse into the sole USB port on the Raspberry Pi)
  • GPIO Hammer Headers
  • Female-to-male breadboard wires
  • Polar T34 Heart Rate Transmitter and Polar Heart Rate receiver
  • Soldering iron with solder

In addition, having access to a computer monitor or TV with HDMI input, HDMI cable, keyboard and a mouse is assumed.

Self-paced environment setup

If you don't already have a Google account (Gmail or G Suite), you must create one. Regardless of whether you already have a Google account or not, make sure to take advantage of the $300 free trial!

Sign in to the Google Cloud Platform console ( console.cloud.google.com). You can use the default project ("My First Project") for this lab, or you can choose to create a new project. If you'd like to create a new project, you can use the Manage resources page. The project ID must be a unique name across all Google Cloud projects (the one shown below has already been taken and won't work for you). Take note of your project ID (i.e. "Your project ID will be _____") as it will be needed later.

c9020c990a8098f4.png

38e3407fb21de524.png

Running through this codelab shouldn't cost more than a few dollars, but it could be more if you decide to use more resources or if you leave them running. Make sure to go through the Cleanup section at the end of the codelab.

Enable APIs

In this codelab, we will be using Pub/Sub, Dataflow, Compute Engine and IoT Core, so we'll need to enable those APIs.

Select APIs & Services from the Cloud Console menu.

5a7fc95fb6d9f3d6.png

Click on Enable APIs and Services

73b15205b6c06490.png

In the Search Bar, type in PubSub

cbe17163486224f2.png

Click on the resulting tile that says "Cloud Pub/Sub API" and, on the following page, "Enable API"

50306465bccfef6b.png

Repeat this process for Dataflow, Compute Engine and IoT Core

BigQuery is a serverless, highly scalable, low-cost enterprise data warehouse, making it an ideal option for storing data streamed from IoT devices.

Let's create a table that will hold all of the IoT heart rate data. Select BigQuery from the Cloud console. This will open BigQuery in a new window (don't close the original window as you'll need to access it again).

12a838f78a10144a.png

Click on the down arrow icon next to your project name and then select "Create new dataset"

16fbe809352b0a4e.png

Enter "heartRateData" for the Dataset, select a location where it will be stored and Click "OK"

4da211f3407d97df.png

Click the "+" sign next to your Dataset to create a new table

a315c93c1fe31cc1.png

For Source Data, select Create empty table. For Destination table name, enter heartRateDataTable. Under Schema, click the Add Field button until there are a total of 4 fields. Fill in the fields as shown below, making sure to also select the appropriate Type for each field. When everything is complete, click the Create Table button.

8f8f3232e0587fe9.png

You should see a result like this...

f8876750373f7e65.png

You now have a data warehouse set up to receive your heart rate data.
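Rows streamed into this table must use keys that match the column names you just defined. As a minimal sketch, here is what one row might look like; timecollected and heartrate are the fields the later query step relies on, while sensorID is an assumed name, so adjust the keys to match the four fields in your own schema.

```python
import datetime
import json

def make_heart_rate_row(sensor_id, heartrate):
    # Keys must match the BigQuery column names; sensorID is an assumed
    # name -- use whatever fields you defined in the schema above.
    return {
        "sensorID": sensor_id,
        "timecollected": datetime.datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S"),
        "heartrate": heartrate,
    }

row = make_heart_rate_row("raspberryHeartRate", 72.0)
print(json.dumps(row))
```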

Cloud Pub/Sub is a simple, reliable, scalable foundation for streaming data and event-driven computing systems. As a result, it is perfect for handling incoming IoT messages and then allowing downstream systems to process them, which is why it is tightly integrated with Cloud IoT Core.

If you are still in the window for BigQuery, switch back to the Cloud Console. If you closed the Cloud Console, go to https://console.cloud.google.com

From the Cloud Console, select Pub/Sub and then Topics.

331ad71e8a1ea7b.png

If you see an Enable API prompt, click the Enable API button.

9f6fca9dc8684801.png

Click on the Create a topic button

643670164e9fae12.png

Enter "heartratedata" as the topic name and then click Create

57bd1edeed73578b.png

You should see the newly created topic

90c77b7f986a6727.png

You now have a Pub/Sub topic to publish IoT messages to. Let's create a subscription that allows other Google Cloud services to access those messages: click the three-dot menu to the right of the topic that was created and then click New Subscription.

a33ad3dda71e1934.png

Enter "heartratedata" as the subscription name, leave the delivery type as "Pull" and then click Create

326dc3616f3d8e05.png

You now have a Pub/Sub subscription to pull IoT messages from.

376daccd2eeb253e.png

Cloud computing has made fully managed models of computing possible, where processing logic can be created without managing any servers. For this lab, a Dataflow template will be used to create a process that monitors a Pub/Sub topic for incoming messages and then moves them into BigQuery.
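At its core, the template's transform is simple: it decodes each Pub/Sub message payload as a JSON object whose keys match the BigQuery column names and writes it as a row. The following is only an illustrative sketch of that mapping, not the template's actual code (the real template also routes unparseable messages to a dead-letter table).

```python
import json

def message_to_row(message_data: bytes) -> dict:
    # The template expects each Pub/Sub message payload to be a JSON
    # object whose keys match the BigQuery column names.
    return json.loads(message_data.decode("utf-8"))

payload = b'{"sensorID": "sim", "timecollected": "2018-01-01 12:00:00", "heartrate": 72.0}'
print(message_to_row(payload))
```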

Dataflow will need a location to store temporary files, so we will provide a location in Google Cloud Storage. From the Cloud Console, select Storage and then Browser.

c4414fe61be320a9.png

Click the Create Bucket button

cde91311b267fc65.png

Choose a name for the storage bucket (it must be a name that is globally unique across all of Google Cloud). Select Regional and make sure it is in the same region as the rest of your project's services. Click on the Create button

f4757935da4a655a.png

From the Cloud Console, select Dataflow

76e997f4f5dfb28d.png

Click on Create Job from Template. Give the job a name and select the Pub/Sub Subscription to BigQuery template. Make sure the Regional endpoint matches where the rest of your project resources are located. Fill in the remaining fields, making sure that they align with the names of your storage bucket, Pub/Sub subscription, and BigQuery dataset and table. Click on Run Job.

2127c9a4bbc8eeb6.png

Your Dataflow job has started.

a175ff8628cb557b.png

Congratulations! You just connected Pub/Sub to BigQuery via Dataflow.

Cloud IoT Core is a fully managed service that allows you to easily and securely connect, manage, and ingest data from millions of globally dispersed devices. Cloud IoT Core is the central service that, when combined with other services on Google Cloud IoT platform, provides a complete solution for collecting, processing, analyzing, and visualizing IoT data in real time.

From the Cloud Console, select Cloud IoT Core.

5d4e072140ef895d.png

If you see an Enable API prompt, click the Enable API button.

fb7fafeb180e2a8d.png

Click on Create a device registry.

146e6c46f6fd7328.png

Enter "heartrate" as the Registry ID, select a Region close to you, disable the HTTP protocol (since we are only using the MQTT protocol, limiting access is a good security practice) and select the Pub/Sub topic that was created in a previous step. Click on the Create button.

f91bdf2ef5a2acc7.png

The registry is now ready for devices to be added.

12b6b867e73286a7.png

Assemble the Raspberry Pi and heart rate receiver

If there are more than 3 pins, trim the header down to only 3 pins. Solder the header pins to the sensor board.

9f1202d329ae90ae.jpeg

Carefully install the hammer header pins into the Raspberry Pi.

a3a697907fe3c9a9.png

Format the SD card and install the NOOBS (New Out Of Box Software) installer by following the steps here. Insert the SD card into the Raspberry Pi and place the Raspberry Pi into its case.

1e4e2459cd3333ec.png

Use the breadboard wires to connect the heart rate receiver to the Raspberry Pi according to the diagram below.

dd994c3256d0e486.png

  Raspberry Pi pin    Receiver connection
  Pin 16 (GPIO23)     <not labelled>
  Pin 17 (3V3)        <not labelled>
  Pin 20 (GND)        GND

ff973bcef73f5dde.jpeg

Connect the monitor (using the mini-HDMI connector), keyboard/mouse (with the USB hub) and finally, power adapter.

Configure the Raspberry Pi

After the Raspberry Pi finishes booting up, select Raspbian for the desired operating system, make certain your desired language is correct and then click on Install (hard drive icon on the upper left portion of the window).

a16f0da19b93126.png

Click on the Wifi icon (top right of the screen) and select a network. If it is a secured network, enter the password (pre-shared key).

17f380b2d41751a8.png

Click on the raspberry icon (top left of the screen), select Preferences and then Raspberry Pi Configuration. From the Interfaces tab, enable I2C and SSH. From the Localisation tab, set the Locale and the Timezone. After setting the Timezone, allow the Raspberry Pi to reboot.

14741a77fccdb7e7.png

After the reboot has completed, click on the Terminal icon to open a terminal window.

9df6f228f6a31601.png

Install the necessary software

Make sure that all the software on the Raspberry Pi is up to date and that needed packages are installed

  sudo apt-get update
  sudo apt-get install git

Clone the project code for the heart rate receiver

  git clone https://github.com/googlecodelabs/iotcore-heartrate
  cd iotcore-heartrate

Make sure the required core packages are all installed

  chmod +x initialsoftware.sh
  ./initialsoftware.sh

Create a security certificate

In order to communicate with Google Cloud, a security certificate must be generated and then registered with IoT Core.

  chmod +x generate_keys.sh
  ./generate_keys.sh

To transfer the public key to your computer so that it can later be registered with IoT Core, use SFTP (secure file transfer protocol).
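Note that the device never sends the certificate itself with its data; instead, it authenticates to the IoT Core MQTT bridge with a short-lived JSON Web Token signed by the private key, where the audience claim must be your GCP project ID. A minimal sketch of the claim set is below; the actual ES256 signing with ec_private.pem is handled by a JWT library in the project code and is not shown here.

```python
import datetime

def make_jwt_claims(project_id, minutes_valid=60):
    # Claim set for a Cloud IoT Core device JWT; the audience must be
    # the GCP project ID. Signing (ES256 with ec_private.pem) is left
    # to a JWT library such as PyJWT.
    now = datetime.datetime.utcnow()
    return {
        "iat": now,                                             # issued at
        "exp": now + datetime.timedelta(minutes=minutes_valid), # expiry
        "aud": project_id,                                      # GCP project ID
    }

claims = make_jwt_claims("myproject")
print(claims["aud"])
```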

You now have a completed IoT heart rate sensor.

From the Cloud Console, return to IoT Core

5d4e072140ef895d.png

Click the heartrate registry

bba0a215c912e13f.png

Click Add device

f2aa2f47b6d53205.png

For the Device ID, enter raspberryHeartRate. For public key format, select ES256. For authentication, either choose manual and then copy/paste the value from the key file into the Public key value area or choose upload and then upload the key from your computer. Click Add.

26c474608e3d3fc8.png

IoT Core is now ready to receive communication from the Raspberry Pi.

b1608d63aef63e60.png

Data streaming from a Raspberry Pi

If you constructed a Raspberry Pi IoT heart rate sensor, put on the heart rate strap and then start the script that will receive the heartbeat signals, calculate heart rate in beats per minute and push the data to Google Cloud.

If you aren't in the /home/pi/iotcore-heartrate directory on the Raspberry Pi, move there first

  cd /home/pi/iotcore-heartrate

Start the heart rate script by changing the following to match your project, registry and device

  python checkHeartRate.py --project_id=myproject --registry_id=myregistry --device_id=mydevice

You should see the terminal window echo the heart rate data results about every 10 seconds. With data flowing, you can skip to the next section (Check that Data is Flowing).
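The script's beats-per-minute calculation can be sketched as follows. This is a hypothetical simplification, assuming the receiver raises one pulse per detected heartbeat; checkHeartRate.py's actual logic may differ.

```python
def bpm_from_intervals(beat_times):
    # Estimate beats per minute from a list of beat timestamps in seconds.
    # Hypothetical helper; the real script's calculation may differ.
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg = sum(intervals) / len(intervals)
    return 60.0 / avg

# Ten beats spaced 0.8 s apart corresponds to roughly 75 BPM
print(bpm_from_intervals([i * 0.8 for i in range(10)]))
```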

You now have a completed IoT heart rate sensor that is streaming data to Google Cloud.

Simulated data streaming

If you didn't build the IoT heart rate sensor, you can simulate data streaming by using a sample dataset, reading it with a Python script running on a VM in Compute Engine and feeding it into IoT Core.

From the Cloud Console menu, select Compute Engine.

b8d59fc781d7a811.png

Click on Create

df27458f07b61631.png

Keep the default options, but make certain that the Region matches where you have chosen to place your other services.

42bf2bd4a4578940.png

Once the VM has started, click the SSH link to connect to the terminal of the virtual machine.

32eee20eb9a559a5.png

Install the necessary software

Make sure that all the software on the VM is up to date and that needed packages are installed

  sudo apt-get update
  sudo apt-get install git

Clone the project code for the heart rate receiver, which contains the script that simulates heart rate data along with the sample dataset

  git clone https://github.com/googlecodelabs/iotcore-heartrate
  cd iotcore-heartrate

Make sure the required core packages are all installed

  chmod +x initialsoftware.sh
  ./initialsoftware.sh

Create a security certificate

In order to communicate with Google Cloud, a security certificate must be generated and then registered with IoT Core.

  chmod +x generate_keys.sh
  ./generate_keys.sh

View the contents of the security key

  cat ../.ssh/ec_public.pem

Highlight the contents of the key; in the browser-based SSH window, highlighting automatically copies the text.

Keep the VM terminal open. Return to the Cloud Console and go to IoT Core.

5d4e072140ef895d.png

Click on the existing Registry.

7be848fc031cd7aa.png

Add a new device

6e5778ab7f5e14d1.png

Give it a name (e.g. myVM) and select ES256 for the public key format. Paste the value of the key into the Public key value window. Click the Add button.

206697e2805c6e9c.png

You are now ready to receive data from the simulation script in the VM.

d88564dfa88ef9a9.png

Run the simulation script

Return to the terminal window for your VM. Run the simulation script, but replace the values below with ones that match your project and IoT Core registry settings.

  python heartrateSimulator.py --project_id=myproject --registry_id=myregistry --device_id=myVM --private_key_file=../.ssh/ec_private.pem

You should see the data being sent via MQTT from your VM terminal.
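Both the Raspberry Pi script and the simulator connect to the Cloud IoT Core MQTT bridge using a client ID and telemetry topic that follow a fixed format derived from your project, registry and device IDs. A small sketch of how those paths are assembled:

```python
def mqtt_paths(project_id, region, registry_id, device_id):
    # Client ID and telemetry topic formats used by the Cloud IoT Core
    # MQTT bridge; published events land on the registry's Pub/Sub topic.
    client_id = (f"projects/{project_id}/locations/{region}"
                 f"/registries/{registry_id}/devices/{device_id}")
    telemetry_topic = f"/devices/{device_id}/events"
    return client_id, telemetry_topic

cid, topic = mqtt_paths("myproject", "us-central1", "heartrate", "myVM")
print(cid)
print(topic)
```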

BigQuery data

Check to make sure that data is flowing into the BigQuery table. From the Cloud Console, go to BigQuery (bigquery.cloud.google.com).

85627127d58f1d2e.png

Under the project name (on the left hand side of the window), click on the Dataset (heartRateData), then on the table (heartRateDataTable) and then click on the Query Table button

442618dc9aec1426.png

Add an asterisk to the SQL statement so it reads SELECT * FROM..., append "ORDER BY timecollected ASC" as shown below, and then click the RUN QUERY button

65995d4bd90f9074.png

If prompted, click on the Run query button

2c894d091b789ca3.png

If you see results, then data is flowing properly.

760d48e185671591.png

With data flowing, you are now ready to visualize the data.

Google Sheets can be used directly from BigQuery to easily visualize query results. From your query results, click on Save to Google Sheets.

760d48e185671591.png

When the results have been saved to Google Sheets, click on the link that says Click to View.

17f86cb3e17782fe.png

The data will open in Google Sheets.

7b4dc8352d15ca0f.png

Highlight the two columns that contain timecollected and heartrate, then select Insert and then Chart from the top menu.

f758e1fb8f3e1e8.png

If the chart appears as a histogram, use the Chart Editor on the right side of the screen to open the drop-down options for Chart type.

20dd33df18d29986.png

Then select a line graph (top left option).

86fc1021e840bd0.png

The chart should now display a visualization of heart rate over time

44e598f3de161fd1.png

You've created an entire data pipeline! In doing so, you've used Google IoT Core to secure IoT devices and to allow data to flow into Google Pub/Sub, deployed Dataflow from a template and pushed data into BigQuery and then used the integration with Google Sheets to perform a quick data visualization.

5e7f8392c1231de9.png

Clean-up

Once you are done experimenting with the heart rate data pipeline, you can remove the running resources.

If you built the IoT sensor, shut it down. Press Ctrl-C in the terminal window to stop the script and then type the following to power down the Raspberry Pi

  sudo shutdown -h now

Go to Dataflow, click on the link to heartrate-streaming, click on the Stop Job button, select Cancel, and then click Stop Job

6e5c9397ef1ee6a5.png

Go to Pub/Sub, click on Topic, click on the checkbox next to the heartratedata topic and then click on Delete

741b36a4e450c5b5.png

Go to Storage, click on the checkboxes next to the storage buckets and then click on Delete

436bdc979b24dff1.png

Go to BigQuery, click the down arrow next to your project name, click the down arrow to the right of the heartRateData dataset and then click on Delete dataset.

d13e26aa9c71b102.png

When prompted, type in the dataset ID (heartRateData) in order to finish deleting the data.

d6b4154700212483.png

Go to IoT Core and click on the registry. Click on each device that was added and then click on Delete. Once all devices have been deleted, delete the registry.