In this lab you learn how to use Workbox and IndexedDB together to make an offline-first, data-driven Progressive Web App (PWA). You also use Background Sync to sync your app with the server even when your web app is closed.

What you'll learn

What you should already know

What you will need

This lab requires Node.js. Install the latest long term support (LTS) version if you have not already.

Clone the starter code from GitHub with the following command:

git clone https://github.com/googlecodelabs/workbox-indexeddb.git

Alternatively, you can download the code from the GitHub repository as a zip file.

Navigate to the project directory via the command line:

cd workbox-indexeddb/project/

Then run the following commands to install the project dependencies and start the Node.js server:

npm install
npm run --silent start

Explanation

In this step, the project dependencies are installed based on the configuration in project/package.json. Open package.json and examine its contents. There are multiple dependencies, many of which are for the local development server (you can ignore these). The important packages are gulp, a build tool, and workbox-build, a Workbox module used to build the production app and its service worker.

npm run start runs three commands: the gulp build, a gulp watch task that rebuilds the app whenever files in the app directory change, and the local Node.js server.

Open the app

Open the workbox-indexeddb/project folder in your text editor. The project folder is where you will be building the lab.

Open Chrome and navigate to localhost:8081 where you can see the starting app, an events dashboard. Click Allow on the notification permissions pop-up.

We use notifications in the section on background sync to notify the user that the app has been updated in the background. Examine the app and try adding a new test event using the form at the bottom of the page.

Explanation

The goal of this codelab is to modify an event calendar app to store its event data and its app shell locally so that the app works offline. The app currently lets you create new events and save them on the server. Look at the loadContentNetworkFirst function in app/js/main.js to get an idea of the app's current flow: it fetches the data from the server and, if the request succeeds, updates the page. This lab adds functionality to cache the server data and serve the cached data when the network is unavailable.

Write the source service worker

For an app to work offline, it needs a service worker. Let's build one now.

Add the following code to app/sw.js:

app/sw.js

importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.5.0/workbox-sw.js');

if (workbox) {
  console.log(`Yay! Workbox is loaded 🎉`);
  workbox.precaching.precacheAndRoute([]);
} else {
  console.log(`Boo! Workbox didn't load 😬`);
}

Save the file.

Explanation

At the top of the service worker file we include the workbox-sw.js library. This library abstracts common service worker patterns and contains methods to precache files and add routes to the service worker.

The precacheAndRoute method takes a list of files to precache, called a "precache manifest", and caches these files when the service worker is installed. precacheAndRoute also sets up a cache-first response for the precached files, so we don't have to write any logic to serve them from the cache. Notice that the list is currently an empty array. We use workbox-build in the next step to generate the manifest.

Build the production service worker

The recommended approach to precaching your project files is to use one of the Workbox build modules, such as workbox-build, to generate the list of files to precache.

Let's create a gulp task called "service-worker" to generate our precache manifest. Add the following code to project/gulpfile.js, before the build task:

gulpfile.js

const serviceWorker = () => {
  return workboxBuild.injectManifest({
    swSrc: 'app/sw.js',
    swDest: 'build/sw.js',
    globDirectory: 'build',
    globPatterns: [
      'style/main.css',
      'index.html',
      'js/idb-promised.js',
      'js/main.js',
      'images/**/*.*',
      'manifest.json'
    ]
  }).then(resources => {
    console.log(`Injected ${resources.count} resources for precaching, ` +
        `totaling ${resources.size} bytes.`);
  }).catch(err => {
    console.log('Uh oh 😬', err);
  });
}
gulp.task('service-worker', serviceWorker);

Now, add the require statement to import the workbox-build package at the top of gulpfile.js:

gulpfile.js

const workboxBuild = require('workbox-build');

Then add the service-worker task to the end of our build task series. The updated build task should look like this:

gulpfile.js

const build = gulp.series('clean', 'copy', 'service-worker');
gulp.task('build', build);

Save the file. For changes in the gulpfile to take effect, we need to stop and restart the gulp process. At the command line, press Ctrl+C to stop the process (you can safely ignore any errors thrown when it is stopped). Then restart the process by running npm run --silent start. Examine the service worker file that was generated at build/sw.js.

Explanation

In this step, we incorporate the workbox-build library into our gulp-based build process. Now our build process is using workbox-build to generate a production service worker at swDest (build/sw.js), from swSrc (app/sw.js). The globPatterns and globDirectory fields tell workbox-build which files to inject into the precacheAndRoute method call in build/sw.js. workbox-build also generates revision hashes for each file, which workbox-sw.js uses to keep the precached files up-to-date in the browser.

Since we've configured workbox-build as part of our build process, build/sw.js will regenerate each time we change our app's source code in the app directory.
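
For reference, the injected call in the generated build/sw.js will look roughly like the snippet below. The exact entries depend on your globPatterns, and the revision hashes are generated from each file's contents, so your values will differ:

build/sw.js (generated, illustrative)

workbox.precaching.precacheAndRoute([
  {
    "url": "index.html",
    "revision": "f2b1a87b6c0e4d9a9d3c5e7f1a2b3c4d"
  },
  {
    "url": "js/main.js",
    "revision": "0a1b2c3d4e5f67890123456789abcdef"
  }
  // ...one entry per precached file
]);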

Register and test the service worker

Add the service worker registration code to the top of app/js/main.js:

main.js

if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/sw.js')
      .then(registration => {
        console.log(`Service Worker registered! Scope: ${registration.scope}`);
      })
      .catch(err => {
        console.log(`Service Worker registration failed: ${err}`);
      });
  });
}

Save the file and refresh the page to install the service worker (you should see the appropriate console logs). Stop the server by pressing Ctrl+C in the command line to simulate going offline. Then return to the browser and refresh the page. The app shell should load offline!

Explanation

When we register and install the generated production service worker (build/sw.js), precacheAndRoute precaches all of the resources in the precache manifest. As mentioned earlier, precacheAndRoute also sets up an implicit cache-first strategy for those resources. Now the app serves its "shell" directly from the cache, which is not only very fast (no network round-trip required), but also works offline.
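
If you want to confirm what was precached, you can run a quick check from the DevTools console while on localhost:8081 (the cache name is generated by Workbox, so it will vary):

// Paste into the DevTools console.
// Lists the caches Workbox created and the URLs stored in the first one.
caches.keys().then(async (names) => {
  console.log('Caches:', names);
  const cache = await caches.open(names[0]);
  const requests = await cache.keys();
  console.log(requests.map((request) => request.url));
});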

At this point, the app doesn't store any data in the browser and can't display data when the user is offline. We'll use IndexedDB to store the app's data. Let's create a database named dashboardr.

Add the following function to app/js/main.js:

main.js

function createIndexedDB() {
  if (!('indexedDB' in window)) {return null;}
  return idb.open('dashboardr', 1, function(upgradeDb) {
    if (!upgradeDb.objectStoreNames.contains('events')) {
      const eventsOS = upgradeDb.createObjectStore('events', {keyPath: 'id'});
    }
  });
}

Create the IndexedDB database instance by adding the following code near the top of main.js, below the corresponding TODO comment:

main.js

// TODO - create indexedDB database
const dbPromise = createIndexedDB();

Save the file.

Restart the server by running the following command to take the app back online:

npm run --silent start

Return to the browser and refresh the page. Then, activate the new service worker with skipWaiting and refresh the page again. In Chrome, you can use skipWaiting in Developer Tools by selecting Service Workers in the Application tab and clicking skipWaiting.

Refresh the page again. Then, use developer tools to check that the dashboardr database exists. In Chrome, you can check the existing databases in the Application tab by clicking on IndexedDB. Open the dashboardr database and check that the events object store exists.

Explanation

In the above code, we create a dashboardr database and give it a version number of 1. Inside the upgrade callback function (the only place where IndexedDB lets you create or change the structure of the database), the code first checks that the events object store doesn't already exist before creating it. This check avoids potential errors if we later upgrade the version of the database. We give the events object store a key path of 'id', meaning that every object added to this store must have an id property with a unique value.
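
To see why the check matters, here is a hedged sketch of what a future upgrade could look like: bumping the version to 2 and adding a purely hypothetical index on an event's date. It assumes the bundled js/idb-promised.js exposes the version-change transaction as upgradeDb.transaction, as the idb library does. The existing events store is left alone because contains() returns true for it:

main.js (hypothetical future upgrade, not part of this lab)

function createIndexedDB() {
  if (!('indexedDB' in window)) {return null;}
  return idb.open('dashboardr', 2, function(upgradeDb) {
    if (!upgradeDb.objectStoreNames.contains('events')) {
      upgradeDb.createObjectStore('events', {keyPath: 'id'});
    }
    // Hypothetical version 2 change: index events by date.
    // Assumes upgradeDb.transaction is the version-change transaction,
    // as in the idb library this project's wrapper is based on.
    const store = upgradeDb.transaction.objectStore('events');
    if (!store.indexNames.contains('date')) {
      store.createIndex('date', 'date');
    }
  });
}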

Because we modified the app/js/main.js file, the gulp watch task rebuilt the service worker. Workbox automatically updated the revision hash for this file, and then intelligently updated main.js in the cache!

Now let's store our data in the newly created dashboardr database and events object store.

Add the following function to app/js/main.js:

main.js

function saveEventDataLocally(events) {
  if (!('indexedDB' in window)) {return null;}
  return dbPromise.then(db => {
    const tx = db.transaction('events', 'readwrite');
    const store = tx.objectStore('events');
    return Promise.all(events.map(event => store.put(event)))
    .catch(() => {
      tx.abort();
      throw Error('Events were not added to the store');
    });
  });
}

Then update the loadContentNetworkFirst function. The whole function should look like this:

main.js

function loadContentNetworkFirst() {
  getServerData()
  .then(dataFromNetwork => {
    updateUI(dataFromNetwork);
    saveEventDataLocally(dataFromNetwork)
    .then(() => {
      setLastUpdated(new Date());
      messageDataSaved();
    }).catch(err => {
      messageSaveError(); 
      console.warn(err);
    });
  }).catch(err => { // if we can't connect to the server...
    console.log('Network requests have failed, this is expected if offline');
  });
}

Call the newly defined saveEventDataLocally function inside the addAndPostEvent function:

main.js

function addAndPostEvent() {
  // ...
  // TODO - save event data locally
  saveEventDataLocally([data]);
  // ...
}

Save the file. Refresh the page and activate the new service worker. Refresh the page again and check the events object store to see if the network data was stored (remember that you might need to refresh IndexedDB in the developer tools).

Explanation

The saveEventDataLocally function takes an array of objects and adds each object to the IndexedDB database. The store.put operations happen inside a Promise.all. This way if any of the put operations fail, we can catch the error and abort the transaction. Aborting the transaction rolls back all the changes that happened in the transaction so that if any of the events fail to put, none of them will be added to the object store.
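
An alternative sketch, assuming js/idb-promised.js exposes the transaction's completion promise as tx.complete the way the idb library does, is to wait for the whole transaction to commit instead of collecting the individual put promises:

main.js (alternative sketch, not required for this lab)

function saveEventDataLocally(events) {
  if (!('indexedDB' in window)) {return null;}
  return dbPromise.then(db => {
    const tx = db.transaction('events', 'readwrite');
    const store = tx.objectStore('events');
    events.forEach(event => store.put(event));
    // tx.complete resolves once the transaction commits, and rejects if
    // any put fails (the failed transaction is rolled back automatically).
    return tx.complete;
  });
}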

In loadContentNetworkFirst, once the server data is received, IndexedDB and the page are updated. Then, when the data is successfully saved, a timestamp is stored and the user is notified that the data is available for offline use.

The call to saveEventDataLocally in addAndPostEvent keeps the local data up-to-date when the user adds new events.

At this point we are storing the data locally, but to use the data offline we must first retrieve it.

Add the following function to app/js/main.js:

main.js

function getLocalEventData() {
  if (!('indexedDB' in window)) {return null;}
  return dbPromise.then(db => {
    const tx = db.transaction('events', 'readonly');
    const store = tx.objectStore('events');
    return store.getAll();
  });
}

Then update the loadContentNetworkFirst function. The whole function should look like this:

main.js

function loadContentNetworkFirst() {
  getServerData()
  .then(dataFromNetwork => {
    updateUI(dataFromNetwork);
    saveEventDataLocally(dataFromNetwork)
    .then(() => {
      setLastUpdated(new Date());
      messageDataSaved();
    }).catch(err => {
      messageSaveError();
      console.warn(err);
    });
  }).catch(err => {
    console.log('Network requests have failed, this is expected if offline');
    getLocalEventData()
    .then(offlineData => {
      if (!offlineData.length) {
        messageNoData();
      } else {
        messageOffline();
        updateUI(offlineData); 
      }
    });
  });
}

Save the file. Refresh the page and activate the updated service worker. Now take the app offline by stopping the server in the command line with Ctrl+C. Return to the browser and refresh the page. The app shell and all the app's data should load offline!

Explanation

If there is no network available when loadContentNetworkFirst is called, the promise returned by getServerData rejects and the catch handler takes over. There, the getLocalEventData function retrieves the local data from IndexedDB. If no local data has been saved, the user is alerted by messageNoData. Otherwise, the local data is displayed on the page with updateUI and a message informs the user that the data might be outdated.

Now we're storing and retrieving local data so the app can be viewed offline. We can further enhance this app by using the Workbox Background Sync module so the user can save their offline changes to the server (in the background) when they come back online.

Add background sync

Add the following code to app/sw.js beneath the call to precacheAndRoute:

app/sw.js

const bgSyncPlugin = new workbox.backgroundSync.Plugin('dashboardr-queue');

const networkWithBackgroundSync = workbox.strategies.networkOnly({
  plugins: [bgSyncPlugin],
});

workbox.routing.registerRoute(
  /\/api\/add/,
  networkWithBackgroundSync,
  'POST'
);

Save the file. Return online by running the following command:

npm run --silent start

Refresh the page in the browser. Activate the updated service worker and refresh the page again.

Stop the server again with Ctrl+C to take the app back offline.

Add an event to the list using the form. Check IndexedDB to see that the workbox-background-sync queue was created and contains the requests object store (you may need to refresh IndexedDB to see the changes). Open the object store and confirm that the request to /api/add was added to the store.

Explanation

The first step when adding background sync functionality is to initialize the backgroundSync plugin. The plugin creates a Queue (dashboardr-queue), backed by an IndexedDB database, that is used to store failed HTTP requests. The plugin is then added to the configuration of a handler, networkWithBackgroundSync. Finally, a Route is created and registered with this handler for POST requests to the app's /api/add endpoint, which the app uses for adding events. If a user tries to add an event while offline, the failed /api/add request is saved in the background sync queue. When the user comes back online, the queued requests are re-sent even if the web app is closed!

In the next section we will add notifications to clarify to the user (and developers!) when background sync occurs.

Learn more

Workbox Background Sync

Add notifications to background sync

Replace the bgSyncPlugin instantiation with the following code:

app/sw.js

const showNotification = () => {
  self.registration.showNotification('Background sync success!', {
    body: '🎉🎉🎉'
  });
};

const bgSyncPlugin = new workbox.backgroundSync.Plugin(
  'dashboardr-queue',
  {
    callbacks: {
      queueDidReplay: showNotification
      // other types of callbacks could go here
    }
  }
);

Save the file. Restart the server and refresh the page in the browser. Activate the updated service worker and refresh the page again. Take the app offline by stopping the server in the command line with Ctrl+C.

Turn off your computer's WiFi. Background sync replays queued requests whenever the browser has an internet connection, so just stopping the server isn't enough to test this section.

Add an event to the list using the form. Then, check the requests object store in the workbox-background-sync database in IndexedDB and confirm that the request to /api/add was added to the store.

Restart the server with npm run --silent start.

After the server has started, turn on your computer's WiFi. You should get a notification that says background sync was a success! Check the requests object store again and confirm that the request has been removed.

Explanation

The backgroundSync plugin can be configured with different lifecycle callbacks. The queueDidReplay callback fires after the requests in the queue have been successfully replayed to the server. Here it shows a notification letting the user know that their offline requests have been synced.
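
Workbox v3's background sync queue supports other lifecycle callbacks as well, such as requestWillEnqueue, which fires when a failed request is added to the queue. As a hedged sketch (the notification text is our own, and you should confirm the callback names against the Workbox version you're using), you could also tell the user when a request has been queued for later:

app/sw.js (sketch)

const bgSyncPlugin = new workbox.backgroundSync.Plugin(
  'dashboardr-queue',
  {
    callbacks: {
      // Fires when a failed request is stored in the queue (while offline).
      requestWillEnqueue: () => {
        self.registration.showNotification('Saved for later', {
          body: 'Your change will sync when you are back online.'
        });
      },
      // Fires after the queued requests have been replayed to the server.
      queueDidReplay: showNotification
    }
  }
);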

As a challenge to do on your own, try adding the ability for the user to delete events while online and offline. The supplied server supports a /api/delete route, similar to /api/add. POSTing to /api/delete with the id of an event in the POST body deletes the event from the server data. Remember to delete the data from IndexedDB as well!
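
As a starting point, here is a hedged sketch. The deleteEvent helper and its wiring into the UI are our own invention (you would still need a delete button that calls it), and it assumes you register an /api/delete background sync route in sw.js just like the /api/add one:

app/sw.js (sketch)

workbox.routing.registerRoute(
  /\/api\/delete/,
  workbox.strategies.networkOnly({plugins: [bgSyncPlugin]}),
  'POST'
);

main.js (sketch)

function deleteEvent(id) {
  // POST the id to the server; if this fails while offline, the
  // background sync route above queues the request for later.
  fetch('/api/delete', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({id: id})
  }).catch(err => {
    console.log('Delete request failed, this is expected if offline', err);
  });
  // Remove the event from IndexedDB so the local copy stays in sync
  // (assumes tx.complete is available, as in the earlier sketch).
  return dbPromise.then(db => {
    const tx = db.transaction('events', 'readwrite');
    tx.objectStore('events').delete(id);
    return tx.complete;
  });
}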

You've built an offline-first, data-driven PWA using Workbox and IndexedDB. You also learned how to use background sync to update the client and server even when the web app is closed!

What we've covered

Resources