Google Data Studio lets you build live, interactive dashboards with beautiful data visualizations, for free. You can fetch your data from a variety of sources and create unlimited reports in Data Studio, with full editing and sharing capabilities. Here's a screenshot of an example Data Studio dashboard:

(Click here to view this example report in Data Studio)

Community Connectors are a feature of Data Studio that lets you use Apps Script to build connectors to any internet-accessible data source. Community Connectors are built by the Data Studio community, which means anyone can build one. You can also share Community Connectors with other people so they can access their own data from within Data Studio.

You can use Community Connectors in different use cases:

What you'll learn

What you'll need


Data Studio Community Connectors enable direct connections from Data Studio to any internet-accessible data source. You can connect to commercial platforms, public datasets, or your own on-premises private data. Community Connectors can fetch data through Web APIs, JDBC APIs, flat files (CSV, JSON, XML), and Apps Script Services.
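For instance, a connector that reads a flat CSV file might fetch and parse it with Apps Script services roughly like the sketch below (the URL is a placeholder, not a real data source):

function fetchCsvRows() {
  // Sketch only: fetch a CSV flat file and parse it into a 2D array of strings.
  // 'https://example.com/data.csv' is a placeholder URL.
  var response = UrlFetchApp.fetch('https://example.com/data.csv');
  return Utilities.parseCsv(response.getContentText());
}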

Let's assume you have published a package on npm and you want to track the download count of the package over time by day. In this codelab, you will build a Community Connector that fetches this data from the npm package download counts API. Then you can use that Community Connector in Data Studio to build a dashboard visualizing this data.
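For illustration, daily download counts for a package over a date range are available at a URL of this form (this is the endpoint you will build in code later in this codelab):

https://api.npmjs.org/downloads/range/2017-07-16:2017-07-18/lighthouse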

In a basic Community Connector, you'll define four functions:

- getAuthType(): tells Data Studio how users authenticate with the data source.
- getConfig(): defines the configuration options shown to the user.
- getSchema(): lists the fields the connector can provide.
- getData(): fetches and returns the actual data.

Depending on the current step of the workflow, Data Studio executes these connector functions and uses the response in the subsequent steps. The simplified workflow below gives an overview of:

There's no need to memorize this workflow; just skim it to get a sense of what happens in a connector. You can always come back to this diagram.

In the next step, you will start creating your connector in Google Apps Script. You will have to switch back and forth between the Apps Script UI and this codelab.

Step 1: Visit Google Apps Script.

Step 2: Create a new Apps Script project by clicking "+ New script" in the top-left section.

You will see a shell project with a blank myFunction function in the Code.gs file.

Step 3: Delete the myFunction function.

Step 4: Give the project a name by:

  1. clicking on Untitled project in the top-left of the page and
  2. entering an appropriate name.

Now you will start writing your connector code in the Code.gs file.

Data Studio calls the getAuthType() function when it needs to know the authentication method used by the connector. This function should return the authentication method the connector requires to authorize with the third-party service.

For the npm download connector you are building, you do not need to authenticate with any third-party service, since the API you are using does not require authentication. Copy the following code and add it to your Code.gs file:

Code.gs

function getAuthType() {
  var response = { type: 'NONE' };
  return response;
}

Here, you are indicating that your connector does not require any third-party authentication (type: 'NONE'). To see all supported authentication methods, view the AuthType reference.
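For comparison, a connector that does need authentication would return a different type. Here is a minimal sketch, not needed for this codelab, of a key-based response built with the Community Connector service:

function getAuthType() {
  // Sketch only: a connector that requires an API key (not used in this codelab).
  var cc = DataStudioApp.createCommunityConnector();
  return cc.newAuthTypeResponse()
    .setAuthType(cc.AuthType.KEY)
    .setHelpUrl('https://www.example.com/connector-auth-help')
    .build();
}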

Users of your connector will need to configure the connector before they can start using it. With the getConfig() function response, you can define what configuration options users will see. Data Studio calls the getConfig() function to get the connector's configuration details. Based on the response provided by getConfig(), Data Studio will render the connector configuration screen and change certain connector behavior.

In the configuration screen, you can provide information or get user input using the following form elements:

TEXTINPUT (input element): A single-line text box.
TEXTAREA (input element): A multi-line text area.
SELECT_SINGLE (input element): A dropdown for single-select options.
SELECT_MULTIPLE (input element): A dropdown for multi-select options.
CHECKBOX (input element): A single checkbox that can be used to capture boolean values.
INFO (display element): A static plain-text box that can be used to provide instructions or information to the user.

Use the INFO element to provide user instructions and a TEXTINPUT element to get the npm package name from the user. In the getConfig() response, you will group these form elements under the configParams key.

Since the API you are connecting to requires a date range as a parameter, you will set dateRangeRequired to true in the getConfig() response. This tells Data Studio to provide date ranges with all data requests. If your data source does not require a date range, you can omit this.

This is what your getConfig() function will look like. Add the following code to your Code.gs file, which already has the code for getAuthType():

Code.gs

function getConfig(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var config = cc.getConfig();
  
  config.newInfo()
    .setId('instructions')
    .setText('Enter npm package names to fetch their download count.');
  
  config.newTextInput()
    .setId('package')
    .setName('Enter a single package name')
    .setHelpText('e.g. googleapis or lighthouse')
    .setPlaceholder('googleapis');
  
  config.setDateRangeRequired(true);
  
  return config.build();
}

Based on these configParams, when you use the connector in Data Studio, you can expect to see a configuration screen like the following. But more on that later.

Let's move on to the next function - getSchema().

Data Studio calls the getSchema() function to get the schema associated with the selected configuration for the connector. Based on the response provided by getSchema(), Data Studio will show the fields screen to the user listing all the fields in the connector.

For any specific configuration of your connector, the schema is the list of all fields for which the connector can provide data. For the same connector, you might have to return different schemas with different fields based on different configurations. The schema can contain fields that you fetch from your API source, fields that you calculate in Apps Script, and fields that are calculated in Data Studio using calculated field formulas. You will provide certain metadata about each field in the schema, including:

You can review the getSchema() and Field reference later to learn more about this.

Depending on how you are getting the data for your connector and the configuration provided by the user, the schema may be fixed, or you may have to create it dynamically when getSchema() is called. Configuration parameters from getConfig() that are defined by the user will be provided in the request argument to the getSchema() function.

For this codelab, you do not need to access the request argument. You will learn more about the request argument when you write code for the getData() function in the next segment.
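For reference only, a connector with a configuration-dependent schema might branch on request.configParams roughly as sketched below. This is hypothetical; the includeDayOfWeek parameter does not exist in the connector you are building:

function getSchema(request) {
  // Hypothetical sketch of a dynamic schema; not part of this codelab's connector.
  var cc = DataStudioApp.createCommunityConnector();
  var fields = cc.getFields();
  var types = cc.FieldType;

  fields.newMetric()
    .setId('downloads')
    .setType(types.NUMBER);

  // Expose an extra dimension only if the user enabled a (hypothetical)
  // checkbox with id 'includeDayOfWeek' in getConfig().
  if (request.configParams && request.configParams.includeDayOfWeek) {
    fields.newDimension()
      .setId('dayOfWeek')
      .setType(types.TEXT);
  }

  return { schema: fields.build() };
}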

For your connector, the schema is fixed and contains these three fields:

packageName: Name of the npm package that the user provides.
downloads: Download count of the npm package.
day: Date of the download count.

This is what the getSchema() code will look like for your connector. You are defining a separate getFields() function since the field definitions need to be accessed by both getSchema() and getData(). Add the following code to your Code.gs file:

Code.gs

function getFields(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var fields = cc.getFields();
  var types = cc.FieldType;
  var aggregations = cc.AggregationType;
  
  fields.newDimension()
    .setId('packageName')
    .setType(types.TEXT);
  
  fields.newMetric()
    .setId('downloads')
    .setType(types.NUMBER)
    .setAggregation(aggregations.SUM);
  
  fields.newDimension()
    .setId('day')
    .setType(types.YEAR_MONTH_DAY);
  
  return fields;
}

function getSchema(request) {
  var fields = getFields(request).build();
  return { schema: fields };
}

Based on this schema, you can expect to see the following fields in the Data Studio fields screen when you use the connector in Data Studio. But more on that later when you test your connector.

Let's move on to our last function - getData().

Data Studio calls the getData() function any time it needs to fetch data for the fields from the connector. Based on the response provided by getData(), Data Studio will render and update charts in the dashboard. getData() might be called during these events:

There's no need to copy any code from this page since you will copy the completed getData() code in a later page.

Understanding the request object

Data Studio passes the request object with each getData() call. Review the structure of the request object below. This will help you to write the code for your getData() function.

request object structure

{
  configParams: object,
  scriptParams: object,
  dateRange: {
    startDate: string,
    endDate: string
  },
  fields: [
    {
      name: Field.name
    }
  ]
}

For your connector, an example request to the getData() function might look like this:

{
  configParams: {
    package: 'jquery'
  },
  dateRange: {
    startDate: '2017-07-16',
    endDate: '2017-07-18'
  },
  fields: [
    {
      name: 'day',
    },
    {
      name: 'downloads',
    }
  ]
}

For this getData() call, only two fields are requested in the above request even though the connector schema has additional fields. The next page contains the example response for this getData() call as well as the general getData() response structure.

In the getData() response, you will need to provide both the schema and the data for the requested fields. You will divide the code into three segments:

1. Create the schema for the requested fields.
2. Fetch and parse the data from the API.
3. Transform the parsed data and filter it for the requested fields.

There's no need to copy any code from this page since you will copy the completed getData() code in the next page.

This is the structure of getData() for your connector.

function getData(request) {

  // TODO: Create schema for requested fields
  
  // TODO: Fetch and parse data from API
  
  // TODO: Transform parsed data and filter for requested fields

  return {
    schema: <filtered schema>,
    rows: <transformed and filtered data>
  };
}

Create schema for requested fields

This is how you will subset the schema for the requested fields. Here, you can see that request.fields contains the list of field names (field.name) for the requested fields.

// Create schema for requested fields
  var requestedFieldIds = request.fields.map(function(field) {
    return field.name;
  });
  var requestedFields = getFields().forIds(requestedFieldIds);
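For example, with the sample request shown earlier (which asks for day and downloads), requestedFieldIds would be ['day', 'downloads'], and requestedFields would then contain only those two field definitions from getFields().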

Fetch and parse data from API

The npm API URL will be in this format:

https://api.npmjs.org/downloads/range/{start_date}:{end_date}/{package}

You will first create the URL for the API using the request.dateRange.startDate, request.dateRange.endDate, and request.configParams.package values provided by Data Studio. Then you will fetch the data from the API using UrlFetchApp (an Apps Script class; see the reference). Finally, you will parse the fetched response.

  // Fetch and parse data from API
  var url = [
    'https://api.npmjs.org/downloads/range/',
    request.dateRange.startDate,
    ':',
    request.dateRange.endDate,
    '/',
    request.configParams.package
  ];
  var response = UrlFetchApp.fetch(url.join(''));
  var parsedResponse = JSON.parse(response).downloads;
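The codelab keeps the fetch deliberately simple. In a production connector you would typically guard against API failures and surface a friendly message. Here is a minimal sketch, assuming you extract the fetch into a helper (fetchDownloads is not part of the codelab code):

function fetchDownloads(url) {
  // Sketch only: fetch the npm API and surface a user-friendly error on failure.
  var cc = DataStudioApp.createCommunityConnector();
  try {
    var response = UrlFetchApp.fetch(url);
    return JSON.parse(response).downloads;
  } catch (e) {
    cc.newUserError()
      .setDebugText('Error fetching data from the npm API: ' + e)
      .setText('There was an error fetching data from the npm API. Please try again later.')
      .throwException();
  }
}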

Transform parsed data and filter for requested fields

The response from the npm API will be in the following format:

{
  downloads: [
    {
      day: '2014-02-27',
      downloads: 1904088
    },
    ...
    {
      day: '2014-03-04',
      downloads: 7904294
    }
  ],
  start: '2014-02-25',
  end: '2014-03-04',
  package: 'somepackage'
}

You will transform the response from the npm API and provide the getData() response in the following format. If this format is unclear, have a look at the example response in the following paragraph.

{
  schema: [
    {
      object(Field)
    }
  ],
  rows: [
    {
      values: [string]
    }
  ]
}

In the response, you will return the schema for only the requested fields using the schema property. You will return the data using the rows property as a list of rows. For each row, the sequence of fields in values must match the sequence of fields in schema. Based on the earlier example request, this is what the response for getData() will look like:

{
  schema: requestedFields.build(),
  rows: [
    {
      values: [ 38949, '20170716']
    },
    {
      values: [ 165314, '20170717']
    },
    {
      values: [ 180124, '20170718']
    },
  ]
}

You have already created the subset of the schema for the requested fields. You will use the following function to transform the parsed data and filter it down to the requested fields.

function responseToRows(requestedFields, response, packageName) {
  // Transform parsed data and filter for requested fields
  return response.map(function(dailyDownload) {
    var row = [];
    requestedFields.asArray().forEach(function (field) {
      switch (field.getId()) {
        case 'day':
          return row.push(dailyDownload.day.replace(/-/g, ''));
        case 'downloads':
          return row.push(dailyDownload.downloads);
        case 'packageName':
          return row.push(packageName);
        default:
          return row.push('');
      }
    });
    return { values: row };
  });
}

Your combined getData() code will look like the following. Add this code to your Code.gs file:

Code.gs

function responseToRows(requestedFields, response, packageName) {
  // Transform parsed data and filter for requested fields
  return response.map(function(dailyDownload) {
    var row = [];
    requestedFields.asArray().forEach(function (field) {
      switch (field.getId()) {
        case 'day':
          return row.push(dailyDownload.day.replace(/-/g, ''));
        case 'downloads':
          return row.push(dailyDownload.downloads);
        case 'packageName':
          return row.push(packageName);
        default:
          return row.push('');
      }
    });
    return { values: row };
  });
}

function getData(request) {
  // Create schema for requested fields
  var requestedFieldIds = request.fields.map(function(field) {
    return field.name;
  });
  var requestedFields = getFields().forIds(requestedFieldIds);

  // Fetch and parse data from API
  var url = [
    'https://api.npmjs.org/downloads/range/',
    request.dateRange.startDate,
    ':',
    request.dateRange.endDate,
    '/',
    request.configParams.package
  ];
  var response = UrlFetchApp.fetch(url.join(''));
  var parsedResponse = JSON.parse(response).downloads;
  var rows = responseToRows(requestedFields, parsedResponse, request.configParams.package);

  return {
    schema: requestedFields.build(),
    rows: rows
  };
}
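Before moving on, you can optionally sanity-check getData() from the Apps Script editor with a small throwaway function like the sketch below. It simply reuses the example request structure from earlier (you will be prompted to authorize the UrlFetchApp scope the first time you run it):

function testGetData() {
  // Throwaway test harness, not part of the connector: logs the getData()
  // response for a sample request mirroring the earlier example.
  var request = {
    configParams: { package: 'lighthouse' },
    dateRange: { startDate: '2017-07-16', endDate: '2017-07-18' },
    fields: [{ name: 'day' }, { name: 'downloads' }]
  };
  Logger.log(getData(request));
}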

You are done with the Code.gs file! Next, you will update the manifest.

In the Apps Script editor, select View > Show manifest file.

This will reveal the appsscript.json manifest file.

Replace the contents of your appsscript.json file with the following:

appsscript.json

{
  "dataStudio": {
    "name": "npm Downloads - From Codelab",
    "logoUrl": "https://raw.githubusercontent.com/npm/logos/master/%22npm%22%20lockup/npm-logo-simplifed-with-white-space.png",
    "company": "Codelab user",
    "companyUrl": "https://developers.google.com/datastudio/",
    "addonUrl": "https://github.com/google/datastudio/tree/master/community-connectors/npm-downloads",
    "supportUrl": "https://github.com/google/datastudio/issues",
    "description": "Get npm package download counts.",
    "sources": ["npm"]
  }
}

Save the Apps Script project.

Congratulations! You have built your first Community Connector, and it is ready for a test drive!

Use the deployment

Step 1: In the Apps Script development environment, click Publish > Deploy from manifest to open the Deployments screen.

The default deployment, Latest Version (Head), will be listed here.

Step 2: Click the deployment name or the Data Studio icon next to the Get ID link. This reveals the direct link to use this connector in Data Studio.

Step 3: Click on the newly available connector link. It will open up a new window in Data Studio.

Authorize the connector

First-time Data Studio users: If you have not used Data Studio before, you will be asked to authorize Data Studio for your account and agree to the terms and conditions. Complete the authorization process. You may also see a pop-up asking you to update your marketing preferences; sign up for product announcements if you want to hear about the latest features and updates by email.

Now, you will see a prompt to authorize the new connector.

Click Authorize and provide the required authorization to the connector.

Configure the connector

Once authorization is complete, you will see the configuration screen. Type "lighthouse" in the text input and click Connect in the top right.

Confirm the schema

You will see the fields screen. Click the Create Report button in the top right.

Create your dashboard

You will be in the Data Studio dashboard environment. Click the Add to Report button.

In Data Studio, every time a user accesses a connector and adds a new configuration, a new data source is created in the user's Data Studio account. You can think of a data source as an instantiation of the connector based on a specific configuration. Based on the connector and the configuration the user selected, a data source returns a data table with a specific set of fields. Users can create multiple data sources from the same connector. A data source can be used in multiple reports, and the same report can use multiple data sources.

Time to add a time series chart! In the menu, click Insert > Time Series, then draw a rectangle on the canvas. You should see a time series chart of the npm download count for the selected package. You can then add a date filter control and view the dashboard as shown below.

That's it! You just tested your first community connector! This brings you to the end of this codelab. Now, let's see what next steps you can take.

Improve the connector you built

Make improvements to the connector you just built:
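For example, you could show a friendly error when the user leaves the package name blank. Here is a minimal sketch (the validateConfig helper is hypothetical, not part of the codelab code) that you would call at the top of getData():

function validateConfig(configParams) {
  // Hypothetical helper: throw a user-facing error if the package name is blank.
  if (!configParams || !configParams.package || configParams.package.trim() === '') {
    DataStudioApp.createCommunityConnector()
      .newUserError()
      .setText('Please enter an npm package name in the connector configuration.')
      .throwException();
  }
}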

Do more with Community Connectors

Additional resources

Below are resources to help you dig deeper into the material covered in this codelab.

Resource Type | User Features | Developer Features
Documentation | Help Center | Developer Documentation
News & Updates | Sign up in Data Studio > User Settings | Developer Mailing List
Ask Questions | User Forum | Stack Overflow [google-data-studio]
Videos | Data Studio on YouTube | Coming Soon!
Examples | Report Gallery | Open Source Repository, Connector Gallery