Cloud Datalab is an interactive tool for data analysis, visualization, and machine learning. It lets you author and run Python code in the form of notebooks. Notebooks bring together code, execution results (including visualizations), and documentation in a single file. They also capture a history of executions, so you can iteratively refine your data analysis by building on previous results.
Cloud Machine Learning (ML) Engine is a managed service for training and serving TensorFlow-based models in a distributed fashion. It also lets you run training locally (for example, on the VM running Datalab) so you can validate your model against a small sample of data before submitting a long-running training job.
If you don't already have a Google Account (Gmail or Google Apps), you must create one.
Sign in to the Google Cloud Platform console (console.cloud.google.com) and create a new project:
Remember the project ID, a name that is unique across all Google Cloud projects (the name above has already been taken and will not work for you, sorry!). It will be referred to later in this codelab as PROJECT_ID.
Next, you'll need to enable billing in the Developers Console in order to use Google Cloud resources like Cloud Datastore and Cloud Storage.
Running through this codelab shouldn't cost you more than a few dollars, but it could be more if you decide to use more resources or if you leave them running (see "cleanup" section at the end of this document).
New users of Google Cloud Platform are eligible for a $300 free trial.
We're going to make use of a new feature of Google Cloud Platform called Google Cloud Shell, an interactive shell that can be used to manage your Cloud Resources and to do development work directly from the Google Developers Console.
Google Cloud Shell provides command-line access to computing resources hosted on Google Cloud Platform and is available in the Google Cloud Platform Console. Cloud Shell makes it easy to manage your Cloud Platform Console projects and resources without installing the Google Cloud SDK and other tools on your own system. With Cloud Shell, the gcloud command and the other utilities you need are always available when you need them. It also comes preinstalled with commonly used tools such as git, Maven, a Java virtual machine (JVM), Node.js, Python, and npm.
To get started:
A Cloud Shell session opens inside a new frame at the bottom of the console and displays a command-line prompt.
Datalab runs on a Google Compute Engine (GCE) VM, so we need to specify the project and the zone where the VM will be created. Typically Datalab is set up from a client machine (desktop or laptop) with the Cloud SDK installed; here we will use Cloud Shell as the client to run the installation commands.
Copy the project ID from the left pane in Qwiklabs using the icon next to the text, and paste it in place of PROJECT_ID below if you haven't already done so in a previous section of this lab.
$ gcloud config set core/project PROJECT_ID
You can use the zone us-central1-f as specified below. If you don't specify one, the subsequent command will list the available zones and prompt you to pick one.
$ gcloud config set compute/zone us-central1-f
Now we can create a Datalab instance on a VM in the project and zone specified above. The code in the following sections will run on that VM in Google Cloud. In the create command below, regression is used as the name of both the VM and the Datalab instance. For this lab we don't need a source repository to commit files to, and the temporary account doesn't have permission to create one, so we turn that off.
$ datalab create --no-create-repository regression
The previous command creates a connection to your instance. Use that connection to open your browser to the Cloud Datalab notebook listing page by selecting Cloud Shell Web preview→Change port→Port 8081.
You will need the following command only if you lose connection to Datalab for some reason.
$ datalab connect regression
In this step, you launched Cloud Shell and ran a few simple gcloud commands to set up a Datalab instance.
Is this your first time using Datalab? (If you're an experienced Datalab user or just finished trying out Datalab in another codelab, you can skip to the next section, titled "Regression".)
Here are a few tips to help you get started:
When you start Datalab, you are instructed to create a new notebook within your lab environment. You will then copy and paste code into the notebook and run it. To write code in your new IPython notebook:
On the notebook listing page, navigate to the docs folder. Click on the notebook file named Hello World.ipynb. It will open up in a new tab.
In the notebook, click the cell with the Python code that prints hello world. Run the cell by pressing Shift+Enter or by clicking the Run button in the menu bar at the top. You will see the printed text, and a new, empty code cell will be created just below it.
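The cell in Hello World.ipynb contains a single print statement along these lines (the exact wording in the sample notebook may differ):

```python
# A minimal "hello world" notebook cell:
print('Hello World')
```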
Click the empty cell and type the following code to see how visualization works. This code takes static values of numbers and their squares and plots a line chart.
import matplotlib.pyplot as plt
plt.plot([0, 1, 2, 3, 4], [0, 1, 4, 9, 16])
Press Shift+Enter to run the code cell. Observe the output chart.
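If you want to label the chart, the same cell can be expanded slightly; the labels and title below are just illustrative, not part of the lab's required code:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend for running outside a notebook; in Datalab the chart renders inline
import matplotlib.pyplot as plt

xs = list(range(10))
ys = [x * x for x in xs]  # the squares plotted in the previous cell
plt.plot(xs, ys)
plt.xlabel('n')
plt.ylabel('n squared')
plt.title('Squares')
```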
Now double-click the text "Untitled Notebook" at the top of the notebook editing area. You will see the markdown corresponding to the formatted text. Replace the text "Untitled Notebook" with the text "Hello".
Run the markdown cell by pressing Shift+Enter. You will see the formatted text again with the updated title.
On the Cloud Datalab notebook listing page, click the Home icon, then navigate to datalab/docs/samples/ML Toolbox/Regression/Census. View the list of notebooks, then open the notebook titled "1 Local End to End.ipynb".
You will see a notebook with markdown (documentation) and code cells. Code cells are followed by execution results.
In Datalab, click Clear | All Cells. Now read the documentation and code in the notebook and execute each cell in turn, checking the output. Some cells may take a while to execute if they do a sizeable amount of data processing; you will see a progress bar while execution is in progress.
All the cells in this notebook used a local version of the Cloud ML Engine service. This lets you iterate on preprocessing and model development using a sample of the data, and then submit jobs to the service with the full, unsampled data. Essentially the same code can be executed with the service version instead of the local version by specifying a different parameter for training and prediction in the toolbox API.
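The local-versus-service switch described above can be pictured with a small sketch. This is an illustration of the pattern only, not the actual toolbox API; the function name, parameters, and file names are made up:

```python
def train(train_data, config, cloud=False):
    """Illustrative only: one entry point, two backends.

    cloud=False runs training locally on the Datalab VM against a sample;
    cloud=True would submit the same job to the Cloud ML Engine service.
    """
    if cloud:
        return 'submitted training job to Cloud ML Engine'
    return 'ran training locally on the Datalab VM'

# Iterate locally on a sample first, then make the identical call against the service:
print(train('census_sample.csv', config={'hidden_units': [10, 10]}))
print(train('census_full.csv', config={'hidden_units': [10, 10]}, cloud=True))
```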
Congratulations! You completed a regression model by preprocessing Census data, training a neural net using Datalab toolbox APIs that in turn use TensorFlow. You also tested the model and evaluated the results by using online and batch prediction.
Go back to the notebook listing tab and open the notebooks with "Service" in their names. These notebooks let you perform the same end-to-end steps in stages using the Cloud ML Engine service, so you can scale to large amounts of data. Read the notebooks and compare their contents with the previously executed "Local End to End" notebook in a different browser tab. Executing the code cells in these notebooks is not recommended for a short lab: it is likely to take time and will require more computing resources than your project allows, especially if you are using a free trial account.