HashiCorp Vault is a popular open source tool for secrets management that codifies many of the associated best practices, such as time-based access control, encryption, dynamic credentials, and much more. The GCP secrets engine allows Vault users to generate IAM service account credentials with a given set of permissions and a set lifetime, without needing a service account of their own. This codelab teaches you how to use the secrets engine to generate credentials that authorize a call to GCP.

What you'll learn

- How to install and verify the HashiCorp Vault CLI
- How to run a local, development Vault server
- How to create a GCP service account and IAM role for Vault to use
- How to configure the Vault GCP secrets engine and a roleset
- How to generate dynamic service account keys and use them to call GCP

Self-paced environment setup

If you don't already have a Google Account (Gmail or Google Apps), you must create one. Sign in to the Google Cloud Platform Console (console.cloud.google.com) and create a new project:

Remember the project ID: it must be unique across all Google Cloud projects, so an ID that has already been taken will not work for you. It will be referred to later in this codelab as PROJECT_ID.

Next, you'll need to enable billing in the Cloud Console in order to use Google Cloud resources.

Running through this codelab shouldn't cost you more than a few dollars, but it could be more if you decide to use more resources or if you leave them running (see the "Clean up" section at the end of this document).

New users of Google Cloud Platform are eligible for a $300 free trial.

Start Cloud Shell

While Google Cloud can be operated remotely from your laptop, in this codelab you will be using Google Cloud Shell, a command line environment running in the Cloud.

From the GCP Console, click the Cloud Shell icon on the top-right toolbar:

It should only take a few moments to provision and connect to the environment. When it is finished, you should see something like this:

This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory and runs on Google Cloud, greatly enhancing network performance and authentication. All of your work in this codelab can be done with just a browser.

First, install Vault locally. This gives you the vault CLI, which you will use to interact with the Vault server throughout the rest of this codelab.

You could browse to the Vault website, but this section will teach you how to download, verify, and install Vault securely. Even though Vault is downloaded over a TLS connection, it may still be possible for a skilled attacker to compromise the underlying storage system or network transport. For that reason, in addition to serving the binaries over TLS, HashiCorp also signs the checksums of each release with their private key. Thus, to verify the integrity of a download, we must:

  1. Import and trust HashiCorp's GPG public key
  2. Download the Vault binary
  3. Download the Vault checksums
  4. Download the Vault checksum signature
  5. Verify the signature of the checksums file against HashiCorp's GPG key
  6. Verify the checksum of the downloaded binary against the checksums file

This way, even if an attacker were able to compromise the network transport and underlying storage component, they wouldn't be able to sign the checksums with HashiCorp's GPG key. If this operation is successful, we have an extremely high degree of confidence that the software is untainted.
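
If you ever want to perform those steps by hand, the process looks roughly like the following. This is only a sketch: the version number is illustrative, and hashicorp.asc stands in for HashiCorp's public GPG key, which you would download from their security page first.

$ export VAULT_VERSION=1.0.0
$ curl -sO https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_linux_amd64.zip
$ curl -sO https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_SHA256SUMS
$ curl -sO https://releases.hashicorp.com/vault/${VAULT_VERSION}/vault_${VAULT_VERSION}_SHA256SUMS.sig
$ gpg --import hashicorp.asc    # import and trust HashiCorp's public key
$ gpg --verify vault_${VAULT_VERSION}_SHA256SUMS.sig vault_${VAULT_VERSION}_SHA256SUMS
$ grep linux_amd64 vault_${VAULT_VERSION}_SHA256SUMS | sha256sum -c -
$ unzip vault_${VAULT_VERSION}_linux_amd64.zip -d $HOME/bin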

Since that process can be tedious, we will leverage a Docker container to do it for us. Execute the following command to install Vault locally. We install Vault into $HOME/bin because that directory persists between Cloud Shell restarts.

$ docker run -v $HOME/bin:/software sethvargo/hashicorp-installer vault 1.0.0-beta1
$ sudo chown -R $(whoami):$(whoami) $HOME/bin/

Add the bin directory to your PATH:

$ export PATH=$HOME/bin:$PATH

Finally, you can optionally explore the Vault CLI help. Most Vault commands will not work yet because there is no Vault server running. Do not start a Vault server yet.

$ vault -h

Start a local, development Vault server. This Vault server runs entirely in memory and does not represent a best practices installation. However, it is useful for getting started quickly and exploring Vault's functionality. We will also create an initial token in Vault with the value of "root", which will be used to authenticate to the Vault server.

$ export VAULT_ADDR=http://127.0.0.1:8200
$ export VAULT_DEV_ROOT_TOKEN_ID=root
$ vault server -dev &> vault.log &

Vault is now running in the background. You can query its status to verify:

$ vault status

Key             Value
---             -----
Seal Type       shamir
Initialized     true
Sealed          false
# ...
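
You can also query Vault's HTTP API directly; for example, the sys/health endpoint returns similar status information as JSON:

$ curl -s ${VAULT_ADDR}/v1/sys/health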

Because Vault will be managing IAM service accounts and IAM permissions across your GCP resources, you will need to give Vault a service account with the superset of all permissions it may be granting.

First, make sure both the Google Cloud IAM API and the Cloud Resource Manager API are enabled. This only needs to be done once per project to make the APIs accessible.

$ gcloud services enable \
    iam.googleapis.com \
    cloudresourcemanager.googleapis.com
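
If you'd like to confirm the APIs are active, you can list the project's enabled services (an optional check):

$ gcloud services list --enabled | grep -E 'iam\.googleapis\.com|cloudresourcemanager\.googleapis\.com'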

Create a service account and credentials key file for that account.

$ gcloud iam service-accounts create vault-secrets
$ export VAULT_SA_EMAIL="vault-secrets@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com"
$ gcloud iam service-accounts keys create vault-key.json \
    --iam-account=${VAULT_SA_EMAIL}
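
Optionally, verify that the account and its new key exist:

$ gcloud iam service-accounts keys list --iam-account=${VAULT_SA_EMAIL}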

Give this service account permissions to create and manage service accounts and keys in GCP.

$ gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
    --member="serviceAccount:${VAULT_SA_EMAIL}" \
    --role="roles/iam.serviceAccountAdmin"
$ gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
    --member="serviceAccount:${VAULT_SA_EMAIL}" \
    --role="roles/iam.serviceAccountKeyAdmin"

Later, you'll declare a set of IAM roles applied to specific GCP resources. Vault will need permission to set IAM policies on those resources. For this codelab, we'll grant Vault that permission at the project level. To be as restrictive as possible, create a custom role that grants only the getIamPolicy and setIamPolicy permissions on projects.

$ cat > vaultRole.yaml <<EOF
title: "vaultProjectPolicyAdmin"
description: |
  Role for Vault secrets engine codelab to manage project IAM policy
stage: "GA"
includedPermissions:
- resourcemanager.projects.getIamPolicy
- resourcemanager.projects.setIamPolicy
EOF
$ gcloud iam roles create vaultSecretsAdmin \
    --quiet \
    --project ${GOOGLE_CLOUD_PROJECT} \
    --file vaultRole.yaml

Add this final policy binding to your project for Vault's service account.

$ gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
    --member="serviceAccount:${VAULT_SA_EMAIL}" \
    --role="projects/${GOOGLE_CLOUD_PROJECT}/roles/vaultSecretsAdmin"
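
To double-check the setup, you can list every role bound to Vault's service account (the flags below are standard gcloud filtering options; the exact output formatting may vary by gcloud version):

$ gcloud projects get-iam-policy ${GOOGLE_CLOUD_PROJECT} \
    --flatten="bindings[].members" \
    --filter="bindings.members:${VAULT_SA_EMAIL}" \
    --format="table(bindings.role)"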

You have successfully created a service account with the proper IAM permissions. In the next section, we will give this service account's credentials to Vault.

We can now set up the Vault Google Cloud secrets engine.

Enable the GCP secrets engine in Vault.

$ vault secrets enable gcp

Configure Vault to use the service account you created earlier by saving its credentials key file in the secrets engine's configuration.

$ vault write gcp/config credentials=@vault-key.json

Add a roleset to the GCP secrets engine. Credentials are generated under a roleset, which defines the IAM permissions the credentials will have. In this example, we are creating a roleset named "my-project-viewer" that grants the project Viewer role on the current project.

$ vault write gcp/roleset/my-project-viewer \
    project="${GOOGLE_CLOUD_PROJECT}" \
    secret_type="service_account_key"  \
    bindings=-<<EOF
resource "//cloudresourcemanager.googleapis.com/projects/${GOOGLE_CLOUD_PROJECT}" {
  roles = ["roles/viewer"]
}
EOF

As the bindings above show, the generated credentials will have the project Viewer role on your codelab project. When a user or machine reads from this endpoint, Vault automatically generates a new credential with the specified IAM permissions.
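
If you want to see how Vault stored the roleset, you can read it back (the exact fields in the output may differ between Vault versions):

$ vault read gcp/roleset/my-project-viewer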

That's it! Vault is now set up to generate credentials.

Up until this point, you have acted as the Vault root sysadmin, setting up Vault for future users. We will now switch roles and act as a regular user of Vault.

Generate a service account key using the secrets engine.

$ vault read gcp/key/my-project-viewer

Key                 Value
---                 -----
lease_id            gcp/key/my-project-viewer/2HJiqcQnb...
lease_duration      768h
lease_renewable     true
key_algorithm       KEY_ALG_RSA_2048
key_type            TYPE_GOOGLE_CREDENTIALS_FILE
private_key_data    ewogICJ0eXBlIjogInNlcnZpY2VfYWNjb3VudCIsC...

Success! The private_key_data field is a base64-encoded credentials file. You can base64-decode this value and use it to authenticate to other Google Cloud APIs via HTTP or the client libraries. This credential has the project Viewer role, as defined in the bindings in the previous step.

Each time you read from this endpoint, Vault will generate a new, unique credentials file.
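
You can see this for yourself by extracting the private_key_id field (a standard field in a GCP service account key file) from two consecutive reads; the two values will differ. Keep in mind that each read creates a real key and a lease that you can revoke later.

$ vault read -field=private_key_data gcp/key/my-project-viewer | base64 --decode | jq -r .private_key_id
$ vault read -field=private_key_data gcp/key/my-project-viewer | base64 --decode | jq -r .private_key_id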

As an example, activate these service account credentials in your session:

$ export GOOGLE_CREDENTIALS=$(vault read -field=private_key_data gcp/key/my-project-viewer | base64 --decode)

Extract the service account email from the decoded credentials and activate them using gcloud.

$ export SA_EMAIL=$(echo $GOOGLE_CREDENTIALS | jq -r .client_email)

$ gcloud auth activate-service-account ${SA_EMAIL} \
    --key-file=-<<<$(echo $GOOGLE_CREDENTIALS)

Activated service account credentials for: [vaultviewer-...@....iam.gserviceaccount.com]
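
To confirm the generated credentials work, try a read-only call as the activated service account; with the project Viewer role it should be able to describe the project (newly granted IAM bindings can take a minute to propagate):

$ gcloud projects describe ${GOOGLE_CLOUD_PROJECT}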

You learned how to run and configure HashiCorp Vault on Google Cloud to generate dynamic service account keys.

Clean up

If you are done exploring, please consider deleting your project.
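
You can also revoke the leases you created, which deletes the generated service account keys, and then delete the project itself. Both commands below are standard CLI operations; project deletion is irreversible, so only run it if you created a throwaway project for this codelab.

$ vault lease revoke -prefix gcp/key/my-project-viewer
$ gcloud projects delete ${GOOGLE_CLOUD_PROJECT}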

Learn More

License

This work is licensed under a Creative Commons Attribution 2.0 Generic License.