Measure performance with web-vitals.js, Google Analytics and BigQuery

1. Before you begin

What you'll do

In this codelab, you're going to:

  • Link your Google Analytics 4 Property to BigQuery.
  • Add the web-vitals library to a web page.
  • Prepare and send web-vitals data to Google Analytics.
  • Query your Core Web Vitals data in BigQuery.
  • Build a dashboard in Google Data Studio to visualize your Core Web Vitals data.

What you'll need

  • A Google Analytics account with a GA4 property.
  • A Google Cloud account.
  • A Chromium-based web browser, such as Google Chrome or Microsoft Edge. (For more information about why you need a Chromium-based web browser, see Browser Support.)
  • A text editor of your choice, such as Sublime Text or Visual Studio Code.
  • Somewhere to host your test pages to see how the web-vitals library works. (You could use a local server to deliver static web pages, or host your test pages on GitHub.)
  • A public site where you can deploy your analytics code. (Getting your code into production makes the BigQuery and Data Studio examples at the end of this Codelab more comprehensible.)
  • Knowledge of HTML, CSS, JavaScript, and Chrome DevTools.

Before you start

First, link Google Analytics 4 to BigQuery, to ensure that you can start analyzing performance as soon as your code goes live.

Follow the steps in the Google Analytics Help Center to link your GA4 property to BigQuery.

Now that your Google Analytics property is ready to export event data to BigQuery, integrate the web-vitals library on your site.

2. Add the web-vitals library and gtag to a web page

First, add the web-vitals library to a web page.

  1. Open a page template where you want to add the web-vitals library. For this example, we're going to use a simple page:

basic.html

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Web Vitals Test</title>
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
  <p><img style="max-width: 360px" src="https://placekitten.com/g/3840/2160" alt="Kitten" /></p>
  <p>Text below image</p>
</body>
</html>
  2. Paste the source code into a blank file in your text editor.
  3. Save the file locally as basic.html.
  4. Copy this module script, and paste it just before the closing </body> tag. This script loads the web-vitals library from a content delivery network.

basic.html

<script type="module">
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals?module';

  getCLS(console.log);
  getFID(console.log);
  getLCP(console.log);
</script>

The resulting code should look like this.

basic.html

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Web Vitals Test</title>
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
  <p><img style="max-width: 360px" src="https://placekitten.com/g/3840/2160" alt="Kitten" /></p>
  <p>Text below image</p>

<script type="module">
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals?module';

  getCLS(console.log);
  getFID(console.log);
  getLCP(console.log);
</script>
</body>
</html>
  5. Save the file.

You added the web-vitals library to the web page.
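
The section title also mentions gtag: the sendToGoogleAnalytics() function you'll write in a later step calls gtag(), so the page also needs the Google tag (gtag.js) snippet for your GA4 property. If the page doesn't already include it, paste the standard snippet into the <head>, replacing G-XXXXXXXXXX (a placeholder here) with your own GA4 measurement ID:

basic.html

<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>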

3. Measure the web page's Core Web Vitals

Core Web Vitals are a measure of real-user experiences, as captured through Chrome or the web-vitals library on Chromium browsers. When you release web-vitals to production, you see a wide range of results based on users' connection speeds, device power, and how they interact with your site. To demonstrate the web-vitals library's capabilities, we're going to simulate a user experience with a slow connection.

  1. Open the saved file in your web browser.
  2. Right-click the web page.
  3. Click Inspect to open Google Chrome Developer Tools.

  4. Click the Console tab > Console settings.

  5. Select the Preserve log checkbox to make logs persist when you refresh the web page.

  6. Click the Network tab > Online > Slow 3G to simulate a slow network connection.

  7. Click the Console tab.
  8. Click anywhere on the web page to force the metrics for Largest Contentful Paint (LCP) and First Input Delay (FID) to print.
  9. Click Reload this page to force the metric for Cumulative Layout Shift (CLS) to print.

  10. Click the Network tab > Online > Fast 3G to simulate a fast network connection.
  11. Click the Console tab.
  12. Click anywhere on the web page to force the metrics for LCP and FID to print again.

  13. Click Reload this page to force the metric for CLS to print again.

That's it! You measured the web page's Core Web Vitals.

4. Explore web-vitals data in more detail

For each of the Core Web Vitals events you're measuring, there is a host of information available in the data returned that you can use to debug performance bottlenecks. Each web-vitals event contains an entries array, with information about the events contributing to the current metric value.

CLS entries

Expanding the entries property of the object logged by getCLS() shows you a list of LayoutShift entries. Each LayoutShift contains a value property reflecting the layout shift score, and a sources array that we can use to see which elements were shifted.

In this example, two layout shifts occurred, both moving an h1 element on the page. The currentRect property tells us where the element is now, and the previousRect property tells us where it was before.
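
If you'd rather log this information programmatically than expand objects in the console, a minimal sketch (placed inside the module script above, where getCLS() is imported) could look like this:

getCLS(({value, entries}) => {
  for (const entry of entries) {
    // Each LayoutShift entry has its own score and a list of shifted elements.
    for (const source of entry.sources || []) {
      console.log(entry.value, source.node, source.previousRect, source.currentRect);
    }
  }
  console.log('Current CLS:', value);
});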

LCP entries

Expanding the entries property of the object logged by getLCP() shows us which elements were candidates for Largest Contentful Paint before the final value was reported.

In this example, the entries array contains a list of all the LCP candidates in chronological order. In this case, an h1 element was rendered first, followed by an img element. The img was the Largest Contentful Paint. The reported LCP element is always the last item in the array.
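
A similar sketch pulls the final LCP element straight out of the getLCP() callback (again inside the module script where getLCP() is imported):

getLCP(({value, entries}) => {
  // The reported LCP element is always the last entry in the array.
  const lastEntry = entries[entries.length - 1];
  console.log('LCP element:', lastEntry.element, 'painted at', lastEntry.startTime, 'ms');
  console.log('Reported LCP value:', value, 'ms');
});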

FID entries

When you expand the entries property of the object logged by getFID(), it shows an array containing the PerformanceEventTiming entry for the first user input on the page.

The name property tells you which type of user input triggered the measurement. The value that web-vitals reports is the delay between the entry's startTime and processingStart timestamps, in milliseconds; that is, how long the input had to wait before the browser's main thread could begin processing it. In this case, the measured FID is 2 milliseconds.
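
The same information can be surfaced from the getFID() callback; a minimal sketch:

getFID(({value, entries}) => {
  const firstInput = entries[0];
  // FID is the gap between when the input occurred and when the browser
  // was able to start processing its event handlers.
  console.log('Input type:', firstInput.name);
  console.log('FID:', value, 'ms (processingStart - startTime =',
      firstInput.processingStart - firstInput.startTime, 'ms)');
});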

5. Prepare and send web-vitals data to Google Analytics 4

Before you can send web-vitals data to Google Analytics 4, it needs to be converted into a format that GA4 can receive. You'll also add some useful functions that pull out valuable diagnostic information.

Generate a selector to help identify the entry target node

First, add a function to the script block that generates a string representation of the node and its place in the DOM, in a format similar to a CSS selector. The output of this function helps identify which elements in the page are responsible for your CWV values.

diagnostics.html

function getSelector(node, maxLen = 100) {
 let sel = '';
 try {
   while (node && node.nodeType !== 9) {
     const part = node.id ? '#' + node.id : node.nodeName.toLowerCase() + (
       (node.className && node.className.length) ?
       '.' + Array.from(node.classList.values()).join('.') : '');
     if (sel.length + part.length > maxLen - 1) return sel || part;
     sel = sel ? part + '>' + sel : part;
     if (node.id) break;
     node = node.parentNode;
   }
 } catch (err) {
   // Do nothing...
 }
 return sel;
}
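
For example, on the test page above (where the img element has no id or class), calling getSelector() on that image would return a string like this:

// Hypothetical usage in the DevTools console on basic.html:
getSelector(document.querySelector('img'));
// -> 'html>body>p>img'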

Retrieve LayoutShift information

Logging every layout shift that occurs would generate an excessive amount of data. Instead, the functions below focus only on the largest LayoutShift entry, and the largest LayoutShiftSource within it. This lets you focus your optimizations on the most significant causes of layout shifts on your site. As you identify the causes of layout shifts and find ways to minimize them, the layout shift source you see in your reports changes to show the new worst offender.

diagnostics.html

function getLargestLayoutShiftEntry(entries) {
 return entries.reduce((a, b) => a && a.value > b.value ? a : b);
}

function getLargestLayoutShiftSource(sources) {
 return sources.reduce((a, b) => {
   return a.node && a.previousRect.width * a.previousRect.height >
       b.previousRect.width * b.previousRect.height ? a : b;
 });
}
  • getLargestLayoutShiftEntry() returns only the largest layout shift entry over the lifecycle of the page view.
  • getLargestLayoutShiftSource() returns only the largest layout shift source within that entry.

Determine if FID happened before or after DOMContentLoaded

The DOMContentLoaded event takes place after the page's HTML has completely loaded and parsed, which includes waiting for any synchronous, deferred, or module scripts (including all statically imported modules) to load. This function returns true if the first user input happened before DOMContentLoaded, or false if it happened after.

diagnostics.html

function wasFIDBeforeDCL(fidEntry) {
 const navEntry = performance.getEntriesByType('navigation')[0];
 return navEntry && fidEntry.startTime < navEntry.domContentLoadedEventStart;
}

Identify the FID target element

Another potentially useful debug signal is the element that was interacted with. While the interaction with the element itself does not contribute to FID (remember FID is just the delay portion of the total event latency), knowing which elements your users are interacting with may be useful to determine how best to improve FID.

To get the element associated with the first input event, reference the first-input entry's target property:

diagnostics.html

function getFIDDebugTarget(entries) {
  return entries[0].target;
}

Identify the FID input event type

It may also be useful to capture which type of event triggered the FID measurement to identify how users are interacting with your pages.

diagnostics.html

function getFIDEventType(entries) {
  return entries[0].name;
}

Structure the debug information for each CWV

The last step before sending this code to Google Analytics is to structure the information from the entries, including the information returned by the above functions.

diagnostics.html

function getDebugInfo(name, entries = []) {
  // In some cases there won't be any entries (e.g. if CLS is 0,
  // or for LCP after a bfcache restore), so we have to check first.
  if (entries.length) {
    if (name === 'LCP') {
      const lastEntry = entries[entries.length - 1];
      return {
        debug_target: getSelector(lastEntry.element),
        event_time: lastEntry.startTime,
      };
    } else if (name === 'FID') {
      const firstEntry = entries[0];
      return {
        debug_target: getSelector(firstEntry.target),
        debug_event: firstEntry.name,
        debug_timing: wasFIDBeforeDCL(firstEntry) ? 'pre_dcl' : 'post_dcl',
        event_time: firstEntry.startTime,
      };
    } else if (name === 'CLS') {
      const largestEntry = getLargestLayoutShiftEntry(entries);
      if (largestEntry && largestEntry.sources && largestEntry.sources.length) {
        const largestSource = getLargestLayoutShiftSource(largestEntry.sources);
        if (largestSource) {
          return {
            debug_target: getSelector(largestSource.node),
            event_time: largestEntry.startTime,
          };
        }
      }
    }
  }
  // Return default/empty params in case there are no entries.
  return {
    debug_target: '(not set)',
  };
}

Send the data to Google Analytics

Finally, create a function that takes parameters from the web-vitals event and passes them to Google Analytics.

diagnostics.html

function sendToGoogleAnalytics({ name, delta, value, id, entries }) {
  gtag('event', name, {
    // Built-in params:
    value: delta, // Use `delta` so the value can be summed.
    // Custom params:
    metric_id: id, // Needed to aggregate events.
    metric_value: value, // Value for querying in BQ
    metric_delta: delta, // Delta for querying in BQ
    // Send the returned values from getDebugInfo() as custom parameters.
    ...getDebugInfo(name, entries)
  });
}

Register sendToGoogleAnalytics() with each of the web-vitals functions so that it fires when the browser is ready to report each metric:

diagnostics.html

getLCP(sendToGoogleAnalytics);
getFID(sendToGoogleAnalytics);
getCLS(sendToGoogleAnalytics);

Well done! You are now sending web-vitals events to Google Analytics.

6. Check that the web-vitals data populates in Google Analytics

To ensure that your events are recorded by your Google Analytics 4 property:

  1. Open your Google Analytics 4 property and navigate to Reports.

  2. Select Realtime.

  3. Refresh your test page a few times and make sure to click on the page between refreshes to trigger FID events.
  4. Look for the Event count by Event name section of the Realtime overview UI. You should see LCP, FID, and CLS events.

  5. Click on any of the event names to see the parameters passed with those events.

  6. Click on those parameter keys to see a summary of the values Google Analytics received.

You might want to add other data to your debug info, like page template names or the other page events relevant to FID discussed earlier in this Codelab. Simply modify the return statements in the getDebugInfo() function.
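
For example, here's a minimal sketch of the LCP branch with a hypothetical page_template parameter added (the data-template attribute is an assumption for illustration, not something web-vitals or GA4 provides):

if (name === 'LCP') {
  const lastEntry = entries[entries.length - 1];
  return {
    debug_target: getSelector(lastEntry.element),
    event_time: lastEntry.startTime,
    // Hypothetical extra parameter read from a data attribute on the page.
    page_template: document.body.dataset.template || '(not set)',
  };
}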

Once you're happy with the data coming from your test pages, deploy your new GA code to production on your site and move on to the next step.

7. Query your data in BigQuery

Once your Google Analytics code has been live for a few days, you can start querying the data in BigQuery. First, check that the data is being transferred to BigQuery.

  1. Open the Google Cloud Console and select your project from the drop down menu at the top of the screen.
  2. From the navigation menu 3cbb0e5fcc230aef.png at the top left of the screen, click on BigQuery under the Analytics header.
  3. In the Explorer pane, expand your project to see your Google Analytics dataset. The name of the dataset is analytics_ followed by your Google Analytics 4 property ID (e.g., analytics_229787100).
  4. Expand the dataset and you should see an events_ table. The number in parentheses is the number of days available to query.

Subquery to select only CWV events

To query a dataset that includes only your CWV events, start with a subquery that selects the last 28 days of LCP, CLS, and FID events. It keeps only the last reported value for each web-vitals event, identified by the metric_id key, so that the same CWV event is never counted more than once.

# Subquery all Web Vitals events from the last 28 days
WITH web_vitals_events AS (
 SELECT event_name as metric_name, * EXCEPT(event_name, is_last_received_value) FROM
 (
   SELECT *
   , IF (ROW_NUMBER() OVER (
     PARTITION BY (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'metric_id')
     ORDER BY (SELECT COALESCE(value.double_value, value.int_value) FROM UNNEST(event_params) WHERE key = 'metric_value') DESC
   ) = 1, true, false) AS is_last_received_value
   # Make sure to update your project ID and GA4 property ID here!
   FROM `YOUR_PROJECT_ID.analytics_YOUR_GA_PROPERTY_ID.events_*`
   WHERE event_name in ('CLS', 'FID', 'LCP') AND
     _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE, INTERVAL 28 DAY)) AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE, INTERVAL 1 DAY))
  )
  WHERE is_last_received_value
)

This forms the basis of all of your queries against this dataset. Your main query will run against the web_vitals_events common table expression (CTE).

How GA4 events are structured

In the Google Analytics 4 export, each event's parameters are held in the event_params column as an array of STRUCTs. Each event parameter you pass to GA4 on your site is represented by its key, and its value is a STRUCT with a field for each possible data type. In the example above, the metric_value key could hold an int_value or a double_value, so the COALESCE() function is used. To get the debug_target you passed earlier, select the string_value field of the debug_target parameter.

...
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = "debug_target") as debug_target
...

Find your worst performing pages and elements

The debug_target is a CSS selector string that corresponds to the element on the page that is most relevant to the metric value.

With CLS, the debug_target represents the largest element from the largest layout shift that contributed to the CLS value. If no elements were shifted, the debug_target is reported as (not set).

The following query lists pages from worst to best by their CLS at the 75th percentile, grouped by debug_target:

# Main query logic
SELECT
  page_path,
  debug_target,
  APPROX_QUANTILES(metric_value, 100)[OFFSET(75)] AS metric_p75,
  COUNT(1) as page_views
FROM (
  SELECT
    REGEXP_SUBSTR((SELECT value.string_value FROM UNNEST(event_params) WHERE key = "page_location"), r'\.com(\/[^?]*)') AS page_path,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = "debug_target") as debug_target,
    ROUND((SELECT COALESCE(value.double_value, value.int_value) FROM UNNEST(event_params) WHERE key = "metric_value"), 3) AS metric_value,
    *
  FROM web_vitals_events
  WHERE metric_name = 'CLS'
)
GROUP BY 1, 2
# OPTIONAL: You may want to limit your calculations to pages with a 
# minimum number of pageviews to reduce noise in your reports. 
# HAVING page_views > 50
ORDER BY metric_p75 DESC

Knowing which elements on the page are shifting makes it much easier to identify and fix the root cause of the problem.

Keep in mind that the elements reported here might not be the same elements that you see shifting when you debug your pages locally, which is why it's so important to capture this data in the first place. It's very hard to fix things that you don't realize are problems!

Debug other metrics

The above query shows the results for the CLS metric, but the exact same technique can be used to report on the debug targets for LCP and FID. Just replace the WHERE clause with the relevant metric to debug:

# Replace:
# WHERE metric_name = 'CLS'
# With:
WHERE metric_name = 'LCP'

8. Visualize query results in Data Studio

BigQuery provides a quick way to visualize any query results through Data Studio. Data Studio is a data visualization and dashboarding tool that is free to use. To visualize your query results after running your query in the BigQuery UI, click Explore Data and select Explore with Data Studio.

Explore with Data Studio option in BigQuery

This creates a direct link from BigQuery into Data Studio in the explore view. In this view, you can select the fields you want to visualize, choose chart types, set up filters, and create ad hoc charts for quick visual analysis. From the above query results, you can create this line chart to see the trend of LCP values over time:

Line chart of daily LCP values in Data Studio

With this direct link between BigQuery and Data Studio, you can create quick charts from any of your queries and do visual analysis. However, if you want to do additional analysis, you might want to look at several charts in an interactive dashboard to get a more holistic view or to be able to drill down into the data. Having a handy dashboard means you don't have to write queries and generate charts manually every time you want to analyze your metrics.

You can create a dashboard in Data Studio using the native BigQuery connector. To do so, navigate to datastudio.google.com, create a new data source, select the BigQuery connector, and choose the dataset you want to work with:

Using the BigQuery native connector in Data Studio

9. Materialize Web Vitals data

When creating dashboards of the Web Vitals event data as described above, it's not efficient to use the Google Analytics 4 export dataset directly. Due to the structure of the GA4 data and the preprocessing required for the Web Vitals metrics, parts of your query end up running multiple times. This creates two problems: dashboard performance and BigQuery costs.

You can use BigQuery's sandbox mode for free. With BigQuery's free usage tier, the first 1 TB of query data processed per month is free. For the analysis methods discussed in this codelab, unless you have a significantly large dataset or query it heavily and regularly, you should be able to stay within this free limit every month. But if you have a high-traffic website and want to regularly monitor different metrics using a fast interactive dashboard, we suggest preprocessing and materializing your Web Vitals data while making use of BigQuery efficiency features like partitioning, clustering, and caching.

The following script preprocesses your BigQuery data (source table) and creates a materialized table (target table).

# Materialize Web Vitals metrics from GA4 event export data

# Replace target table name
CREATE OR REPLACE TABLE YOUR_PROJECT_ID.analytics_YOUR_GA_PROPERTY_ID.web_vitals_summary
  PARTITION BY DATE(event_timestamp)
  CLUSTER BY metric_name
AS
SELECT
  ga_session_id,
  IF(
    EXISTS(SELECT 1 FROM UNNEST(events) AS e WHERE e.event_name = 'first_visit'),
    'New user',
    'Returning user') AS user_type,
  IF(
    (SELECT MAX(session_engaged) FROM UNNEST(events)) > 0, 'Engaged', 'Not engaged')
    AS session_engagement,
  evt.* EXCEPT (session_engaged, event_name),
  event_name AS metric_name,
  FORMAT_TIMESTAMP('%Y%m%d', event_timestamp) AS event_date
FROM
  (
    SELECT
      ga_session_id,
      ARRAY_AGG(custom_event) AS events
    FROM
      (
        SELECT
          ga_session_id,
          STRUCT(
            country,
            device_category,
            device_os,
            traffic_medium,
            traffic_name,
            traffic_source,
            page_path,
            debug_target,
            event_timestamp,
            event_name,
            metric_id,
            IF(event_name = 'LCP', metric_value / 1000, metric_value) AS metric_value,
            user_pseudo_id,
            session_engaged,
            session_revenue) AS custom_event
        FROM
          (
            SELECT
              (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id')
                AS ga_session_id,
              (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'metric_id')
                AS metric_id,
              ANY_VALUE(device.category) AS device_category,
              ANY_VALUE(device.operating_system) AS device_os,
              ANY_VALUE(traffic_source.medium) AS traffic_medium,
              ANY_VALUE(traffic_source.name) AS traffic_name,
              ANY_VALUE(traffic_source.source) AS traffic_source,
              ANY_VALUE(
                REGEXP_SUBSTR(
                  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location'),
                  r'^[^?]+')) AS page_path,
              ANY_VALUE(
                (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'debug_target'))
                AS debug_target,
              ANY_VALUE(user_pseudo_id) AS user_pseudo_id,
              ANY_VALUE(geo.country) AS country,
              ANY_VALUE(event_name) AS event_name,
              SUM(ecommerce.purchase_revenue) AS session_revenue,
              MAX(
                (
                  SELECT
                    COALESCE(
                      value.double_value, value.int_value, CAST(value.string_value AS NUMERIC))
                  FROM UNNEST(event_params)
                  WHERE key = 'session_engaged'
                )) AS session_engaged,
              TIMESTAMP_MICROS(MAX(event_timestamp)) AS event_timestamp,
              MAX(
                (
                  SELECT COALESCE(value.double_value, value.int_value)
                  FROM UNNEST(event_params)
                  WHERE key = 'metric_value'
                )) AS metric_value,
            FROM
              # Replace source table name
              `YOUR_PROJECT_ID.analytics_YOUR_GA_PROPERTY_ID.events_*`
            WHERE
              event_name IN ('LCP', 'FID', 'CLS', 'first_visit', 'purchase')
            GROUP BY
              1, 2
          )
      )
    WHERE
      ga_session_id IS NOT NULL
    GROUP BY ga_session_id
  )
CROSS JOIN UNNEST(events) AS evt
WHERE evt.event_name NOT IN ('first_visit', 'purchase');

This materialized dataset has several advantages:

  • The data structure is flattened and easier to query.
  • It retains only the Web Vitals events from the original GA4 dataset.
  • Session ID, user type (new vs returning), and session engagement information is directly available in columns.
  • The table is partitioned by date and clustered by metric name. This usually reduces the amount of data processed for each query.
  • Since you don't need to use wildcards to query this table, query results can get cached for up to 24 hours. This reduces costs from repeating the same query.
  • If you use the BigQuery BI Engine, you can run optimized SQL functions and operators on this table.

You can query this materialized table directly from the BigQuery UI, or use it in Data Studio via the BigQuery connector.
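
For example, here's a sketch of a p75 query against the materialized table, assuming the table and column names created by the script above:

SELECT
  page_path,
  APPROX_QUANTILES(metric_value, 100)[OFFSET(75)] AS metric_p75,
  COUNT(1) AS events
FROM `YOUR_PROJECT_ID.analytics_YOUR_GA_PROPERTY_ID.web_vitals_summary`
WHERE metric_name = 'LCP'
GROUP BY page_path
ORDER BY metric_p75 DESC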

Run regular materialize jobs

If you run the query above without a date range, it runs on your entire Google Analytics dataset. You want to avoid doing this every day, as it reprocesses large amounts of historical data. You can update the query to append only the last day's data by removing the CREATE OR REPLACE TABLE statement at the beginning of the query and adding an additional criterion to the WHERE clause of the subquery against the events_intraday_ table:

FROM
  # Replace source table name
  `YOUR_PROJECT_ID.analytics_YOUR_GA_PROPERTY_ID.events_intraday_*`
WHERE
  event_name IN ('LCP', 'FID', 'CLS', 'first_visit', 'purchase')
  # The _TABLE_SUFFIX replaces the asterisk (*) in the table name
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE, INTERVAL 1 DAY))

This query returns only data from yesterday. You can then use the BigQuery Console to schedule your query to run on a daily basis.

10. Visualize the data in Google Data Studio

Google Data Studio natively supports reading data from Google BigQuery. Now that you have web-vitals data from Google Analytics 4 populating in BigQuery, you can use the Data Studio BigQuery connector to directly read your materialized table.

Use the Web Vitals Connector

Since making a dashboard from scratch is time-consuming, we developed a packaged solution that creates a template dashboard for you. First, make sure that you have materialized your Web Vitals table using the query above. Then access the Web Vitals connector for Data Studio using this link: goo.gle/web-vitals-connector

After providing one-time authorization, you should see the following configuration screen:

Web Vitals Connector authorization screen

Provide the materialized BigQuery table ID (that is, the target table) and your BigQuery billing project ID. After you click CONNECT, Data Studio creates a new templated dashboard and associates your data with it. You can edit, modify, and share the dashboard as you like. Once you've created a dashboard, you don't need to visit the connector link again unless you want to create additional dashboards from different datasets.

As you navigate the dashboard, the Summary tab shows the daily trends of the Web Vitals metrics along with usage information for your website, such as users and sessions.

In the User Analysis tab, you can select a metric and get a breakdown of its percentile value, as well as user count, by different usage and business dimensions.

The Page Path Analysis tab helps you identify problem areas on your website. Here you can pick a metric to see its overview, along with a scatter map of all page paths, with the percentile value on the y-axis and record count on the x-axis. The scatter map can help you identify pages with worse-than-expected metric values. Once you select pages, you can drill down further on the problem area with the Page path table, or by viewing the Debug Target table.

The Revenue Analysis tab is an example of how you can monitor your business and performance metrics in the same place. This section plots all sessions where the user made a purchase. You can compare the revenue earned versus user experience during a specific session.

11. Other resources

Well done on completing this Codelab! You should now be able to keep track of your Core Web Vitals performance across your site with a high level of granularity. You should also be able to identify the specific page types and elements on your site that are causing poor Core Web Vitals values so you can focus your optimizations.

Further reading

web.dev has a host of articles and case studies with strategies for improving Core Web Vitals. Start with the optimize articles for each metric:

Reference docs