In this lab, you will create and interact with Cloud Storage buckets and carry out typical management tasks such as bucket creation, file transfers, ACL permissions, and IAM configuration.

What you need

To complete this lab, you need:

What you learn

In this lab, you:

In this lab you will use the Google Cloud Console UI and the command line to interact with GCS (Google Cloud Storage). You will then use the transfer service to migrate data to your new storage buckets, configure ACL permissions on buckets and files, set up lifecycle management policies, and set up GCS IAM permissions.

Create your first storage bucket:

Step 1

From the GCP console menu (three horizontal bars), select Storage. (Optional) Select the 'pin' next to Storage to pin the storage icon to the top of your cloud console page for quicker access in the future.

Step 2

Click on Create Bucket

Step 3

Name your bucket; the name must be unique across all of GCP, not just your project. Then select Regional as the default storage class. Choose us-central as your region.

Step 4

You have now created your first bucket! Let's upload some files; image files are recommended, as they will be used later in the lab.

(Optional) If you need image files, you can download some sample files from (You must be logged into your Google account to access this bucket)

Click Upload to upload new files

Choose the files you wish to upload

Step 5

Now let's work with the different permissions available for your files and GCS bucket. Choose a file you wish to share and click the checkbox under Share Publicly to make the file publicly accessible.

Step 6

Share your file with a specific user. Under the Entity column, choose User. Under the Name column, enter the email address you wish to share the file with. Under the Access column, leave the default permission as Reader (unless you wish to give that person more privileges to your file).
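The same sharing can be done from the command line with gsutil's acl ch subcommand. This is a sketch with placeholder bucket, object, and email values; substitute your own before running.

```shell
# Placeholder names -- substitute your own bucket, object, and email.
# Make an object publicly readable (equivalent of the Share Publicly checkbox):
gsutil acl ch -u AllUsers:R gs://<bucket-name>/<file-name>

# Grant a specific user read access to the same object:
gsutil acl ch -u someone@example.com:R gs://<bucket-name>/<file-name>
```

Both commands require the Google Cloud SDK to be authenticated against your project.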

Step 7

Now that you know how to share your files, let's share your newly created bucket with other Googlers. Click on Browser from the left menu. You should now see the bucket you created, along with any other buckets you've created.

Edit the permissions of your bucket by clicking on the 3 dots to the right of the bucket name.

Under the Entity column, choose Domain. Under the Name column, enter the domain you wish to share with. Under the Access column, select Reader.

Notice the URL at the top which you will need to share with individuals who want to access the files in your bucket.

These permission changes will grant read access to your bucket to all users within the domain. The URL is highlighted because you will need to email it to users yourself; sharing in GCS doesn't send email notifications the way Google Drive does.
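Public objects are served from a predictable URL, which is what you would paste into that email. A minimal sketch, assuming hypothetical bucket and object names:

```shell
# Hypothetical names -- substitute your own bucket and object.
BUCKET="my-lab-bucket"
OBJECT="photo.jpg"

# Publicly shared objects are reachable at storage.googleapis.com/<bucket>/<object>:
URL="https://storage.googleapis.com/${BUCKET}/${OBJECT}"
echo "$URL"
```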

To create a second GCS bucket using the gsutil command, use your local terminal environment with the GCP SDK installed, or use the Google Cloud Shell, which has all of the SDK tools pre-installed.

Step 1

Open the Google Cloud Shell

Step 2

Use gsutil to create a new regional bucket in europe-west1

gsutil mb -c regional -l europe-west1 gs://<bucket-name>

NOTE: If you have just installed the GCP SDK, you may not have set your default project ID. When running the above command just add -p <your project id> to create the bucket under your project.

Quick Tip: Your project ID can always be found by clicking on Home -> Project Info Box
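If you'd rather not pass -p on every command, you can set a default project once. A sketch, with a placeholder project ID:

```shell
# Set the default project for gcloud and gsutil (placeholder project ID):
gcloud config set project <your-project-id>

# Verify the active project setting:
gcloud config list project
```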

NOTE: When working on the command line, you may not know the exact names of the available regions. As the screenshot shows, you can run commands in the Cloud Shell while keeping the Create Bucket screen open in the UI, and scroll through the names of the available regions there.

If using the command line, you can get a complete list of regions by running the following:

gcloud compute regions list

Step 3

Let's populate this bucket with the files we uploaded to the U.S.-based bucket. Run the following command to copy the contents of your first bucket to this new bucket.

If you forgot the names of your buckets, list them first so you can avoid typos later.

gsutil ls

Copy all the files from your U.S. based bucket to your EU based bucket.

gsutil cp -r -p gs://<US Bucket Name>/* gs://<EU Bucket Name>/

While the files are copying, let's take this time to explain why the -r and -p options were used.

-r will recursively copy all files and directories from one bucket to another. This just makes life easy when you want to mirror the contents of one bucket to another.

-p will preserve the same permissions (ACLs). Remember how we shared a file and the bucket with external users? This option preserves those shared permissions.

Looking at the files in the EU bucket, the file you originally shared publicly in the US bucket should now be publicly shared in the EU bucket as well.

Lifecycle management allows you to specify how long files live in a bucket before a specific action is triggered, such as deletion. This feature is not available through the UI, so this section of the lab will briefly cover how to set files in the EU bucket to be deleted after 7 days.

Step 1

Check for any existing lifecycle policy on the EU bucket

gsutil lifecycle get gs://<EU Bucket Name>

Step 2

Using your favorite text editor, create a JSON file specifying the lifecycle configuration for your bucket. In this scenario, we will set the files in the bucket to be deleted after 7 days by creating a JSON file with the following contents:





{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 7}
    }
  ]
}




Name the file lifecycle.json

vi lifecycle.json
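If you prefer not to use an editor, a heredoc can create the same file in one step. This sketch writes the delete-after-7-days policy (wrapped in the "rule" list that gsutil expects) and validates the JSON locally before you apply it:

```shell
# Write the delete-after-7-days lifecycle policy to lifecycle.json:
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 7}
    }
  ]
}
EOF

# Sanity-check that the file parses as valid JSON before applying it:
python3 -m json.tool lifecycle.json
```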

Step 3

Set the lifecycle management policy for the EU bucket specifying the policy created in the lifecycle.json file

gsutil lifecycle set lifecycle.json gs://<EU Bucket Name>

Step 4

Finally, let's verify the lifecycle settings of the EU bucket have changed.

gsutil lifecycle get gs://<EU Bucket Name>

Congrats! Even if you forget to delete your files, they will be deleted in 7 days!

The file transfer service lets you create a transfer job that will migrate large amounts of data from another GCS bucket, an AWS S3 bucket, or other object-based storage from providers such as Rackspace.

For this lab, we will use our existing GCS buckets for simplicity and ease of billing, since migrating from S3 would incur costs outside of Google. The process is nearly identical, though.

Step 1

Create a new bucket in the Asia region to receive files from the US. This time, we will make it a Nearline bucket. A Nearline bucket costs half the price of a Regional bucket and performs the same, but Nearline buckets have an additional retrieval cost that Regional buckets do not.

Further reading on storage pricing can be found here.

gsutil mb -c nearline -l asia gs://<Asia Bucket Name>

Step 2

Verify the creation of your Asia bucket

gsutil ls -L -b gs://<Asia Bucket Name>

In this command we added the -L and -b parameters.

The -L parameter gives us more detailed information about the bucket, such as the storage class and details on the bucket permissions.

The -b parameter lets us specify a single bucket instead of listing all of the buckets in our project.

With this command we are able to verify that our bucket was created, and that it was created with the Nearline storage class.

Step 3

We are now ready to set up our file transfer. Start by clicking on the Transfer button from within the UI.

Next, click on Create Transfer

Step 4

Ensure Google Storage Bucket is selected and then click on Browse to choose the "source" bucket for the transfer. We will be copying all of the files from this bucket to the newly created bucket in Asia.

Choose your US based bucket and click on Select

Step 5

You should now see a green checkbox next to your bucket name. If so, click on Select to continue.

Step 6

Click on Browse to find your Asia bucket as the intended destination

Highlight the bucket you created for Asia and click on Select

Step 7

If done properly, your Asia bucket should also appear with a green checkbox next to it. If so, click on Continue.

Step 8

You are now ready to start your transfer. With the Run Now radio button selected, click on Create.

Step 9

Depending on how many files you have in your bucket, you should see a status indicator showing something like this:

Once the transfer is complete, you should see a full bar under the Status column

Step 10

Finally, let's validate that our files made it over by listing the bucket contents using the gsutil command.

gsutil ls gs://<Asia Bucket Name>
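As a quick extra check (with the same placeholder bucket names as above), you can compare object counts between the source and destination buckets; the two counts should match once the transfer is complete:

```shell
# Placeholder bucket names -- substitute your own.
# The ** wildcard lists every object recursively; counts should match:
gsutil ls gs://<US Bucket Name>/** | wc -l
gsutil ls gs://<Asia Bucket Name>/** | wc -l
```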


For the final task, let's grant your Gmail account, and not just your Google account, permission to access and manage your GCS buckets. IAM (Identity and Access Management) permissions are different from GCS bucket permissions: IAM permissions give you access to the GCS resources themselves. In addition, depending on how they are configured, any new buckets or objects will inherit the permissions created as part of this lab, instead of requiring permissions to be explicitly defined on a per-bucket or per-object basis.

Step 1

From the Google Cloud Console, click on the IAM Menu item.

Step 2

Click on Add to add a new user account to the IAM policies.

Step 3

Now, enter your personal Gmail address in the Members box and, under the Roles drop-down, select Storage -> Storage Admin.

Step 4

Click ADD to confirm the assignment of the Storage Admin role to your Gmail account.
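The same role assignment can be scripted with gcloud. A sketch, with placeholder project ID and email address:

```shell
# Grant the Storage Admin role to a personal account (placeholders below):
gcloud projects add-iam-policy-binding <your-project-id> \
  --member="user:your-name@gmail.com" \
  --role="roles/storage.admin"
```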

Step 5

To validate the newly assigned permissions, open an incognito window in Chrome by pressing Ctrl+Shift+N on a PC or Cmd+Shift+N on a Mac. Now open the Cloud Console and log in with your Gmail account.

Step 6

While logged in with your Gmail account, choose the project you used for the lab from the project drop-down list.

Step 7

Click on the Storage icon to browse for the buckets you created earlier in the lab.

Step 8

You should now be able to browse the buckets you created earlier and their contents, now with your Gmail account as well.

Congratulations on finishing the codelab! Hopefully you've walked away with an understanding of how to get around GCS and manage 95% of what most customers use GCS for. To ensure you no longer incur billing for the items you created, let's walk through the clean up process.

Step 1

Remove the IAM role you granted to your Gmail account.

From the cloud console, click on IAM & Admin -> IAM and click the trashcan next to the email account you chose for the storage IAM role.

Step 2

Remove the Storage Buckets you created.

From the cloud console, click on Storage -> Browser and click the checkboxes next to the buckets you created as part of the lab. Then click on DELETE to remove your buckets.
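The same cleanup can be done from Cloud Shell. A sketch, with placeholder bucket names:

```shell
# Delete each lab bucket and everything in it (placeholders below).
# -r removes the bucket contents recursively, then the bucket itself:
gsutil rm -r gs://<US Bucket Name> gs://<EU Bucket Name> gs://<Asia Bucket Name>
```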

You have now removed all items created as part of this lab!

©Google, Inc. or its affiliates. All rights reserved. Do not distribute.