Lessons learned debugging Interaction to Next Paint (INP)

Interaction to Next Paint (INP) is a new performance metric that can affect your ranking in search engine results pages (SERPs). But how does it work, and how can we measure and debug it to improve our website's INP?

Core Web Vitals is an initiative by Google that uses a set of metrics to measure user experience on a website. As of March 2024, Interaction to Next Paint (INP) has replaced First Input Delay (FID) as a Core Web Vital to measure the responsiveness of a website. Where FID measures the input delay of the first interaction with a page, INP considers all interactions within a user session.

Below is an image from Google illustrating what an “interaction” and a “paint” is, along with a video demonstrating examples of good and poor responsiveness:

A diagram of the interaction phases: blocking tasks, input delay, processing time, render, paint, frame presented
The lifecycle of an interaction leading eventually to a frame being painted. Source: web.dev

INP is designed to be a more reliable metric for capturing a website’s responsiveness when a user interacts with it. Given that INP is based on real user interactions, I wanted to set up real user monitoring (RUM) around INP. Here’s what I discovered.

How Interaction to Next Paint is different from other metrics

The other Core Web Vitals, such as Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS), are primarily first-load metrics, meaning they measure the initial page impression. Since INP measures real user interactions during a session, lab measurement becomes much more challenging. INP relies on user interaction, so users must interact with the website to measure INP. It is difficult to trigger all possible interactions in synthetic tests, and manually triggering every potential interaction is time-consuming. Therefore, using RUM data to monitor INP values makes sense. You can then attempt to reproduce those slow interactions in a controlled environment.

While there are excellent tools available for performance monitoring that also measure INP in the field, my goal was to build my own to gain a deeper understanding of INP and how to measure it.

How to measure INP

INP is measured using the Event Timing API and specifically the Performance Event Timing interface.

We can load a script on every page with something like this:

// Track the largest interaction duration seen so far as a rough INP proxy.
let largestINPValue = 0;

new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // Only entries with an interactionId belong to real user interactions.
    if (!entry.interactionId) continue;
    if (entry.duration > largestINPValue) {
      largestINPValue = entry.duration;
      console.log(
        `[INP] duration: ${entry.duration}, type: ${entry.name}`,
        entry
      );
    }
  }
}).observe({
  type: 'event',
  // Lower the threshold to include shorter interactions (the default is 104 ms).
  durationThreshold: 16,
  buffered: true,
});

Now, an interaction's duration will be logged to the console whenever it exceeds the largest value recorded so far. We do this to avoid overwhelming ourselves with an entry for every interaction. You could also implement a system that only reports values rated "poor" or "needs improvement."
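Google's published thresholds for INP are 200 ms for "good" and 500 ms for "poor", so a small helper could bucket values before deciding whether to report them. A minimal sketch (the function name is our own):

```javascript
// Bucket an INP value (in milliseconds) into Google's rating categories:
// good <= 200 ms, needs-improvement <= 500 ms, poor > 500 ms.
function rateINP(value) {
  if (value <= 200) return 'good';
  if (value <= 500) return 'needs-improvement';
  return 'poor';
}
```

Inside the observer callback above, you could then skip reporting whenever `rateINP(entry.duration)` returns `'good'`.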

Below is a screen recording showing INP values being logged to the console:

Another useful tool for tracking down and resolving INP issues is the Long Animation Frames API (LoAF). LoAF can provide detailed information on how the time was spent and which script was responsible for the slow interaction. It looks something like this:

const observer = new PerformanceObserver((list) => {
  console.log(list.getEntries());
});

observer.observe({ type: "long-animation-frame", buffered: true });

The challenge here is that we need to measure both INP and long animation frames to match the INP entry with the LoAF entry and obtain attribution data for the INP event.
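One way to match the two is by overlapping time ranges: the long animation frame that contains the interaction is the one whose attribution data we want. A sketch, assuming both entry lists have already been collected from their respective observers (the function name is our own):

```javascript
// Given an INP event entry and a list of long-animation-frame entries,
// find the frame whose time range overlaps the interaction's time range.
// Entries are treated as plain { startTime, duration } objects here; in
// the browser they come from the PerformanceObserver callbacks.
function findLoAFForINP(inpEntry, loafEntries) {
  const inpEnd = inpEntry.startTime + inpEntry.duration;
  return loafEntries.find(
    (loaf) =>
      loaf.startTime <= inpEnd &&
      loaf.startTime + loaf.duration >= inpEntry.startTime
  );
}
```

The matched entry's `scripts` array (when present) then points at the code responsible for the slow frame.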

If you prefer not to delve into the underlying web APIs and just want a working solution, the npm package web-vitals simplifies the implementation. In particular, connecting the INP data to its attribution can be cumbersome, so web-vitals is highly beneficial here. With attribution, it would look something like this:

import { onINP } from 'web-vitals/attribution';

onINP((data) => {
  console.log(
    `[INP] duration: ${data.value}, type: ${data.entries[0].name}`,
    data.entries[0]
  );
}, { reportAllChanges: true });

Gathering Real User Metrics (RUM) data

Instead of logging the data to the console, we can send it to our analytics provider or wherever you prefer to store the information. Since we want to collect INP values per page and per user, we’ll send the page URL along with a unique identifier for the user. These will be useful when processing the data later.

Our `reportINP` implementation looks something like this:

import { onINP } from 'web-vitals/attribution';

// Give each session a stable identifier so INP values can be grouped later.
let sessionId = sessionStorage.getItem('session-id');
if (!sessionId) {
  sessionId = crypto.randomUUID();
  sessionStorage.setItem('session-id', sessionId);
}

onINP(({ value, attribution }) => {
  reportINP({
    value,
    attribution,
    sessionId,
  });
}, { reportAllChanges: true });
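The `reportINP` helper itself could look something like the sketch below. The endpoint `/analytics/inp` and the payload field names are our own convention, not part of any library:

```javascript
// Serialise a report into the payload shape our backend expects.
function buildINPPayload({ value, attribution, sessionId, url }) {
  return JSON.stringify({
    metric: 'INP',
    value,
    // Attribution fields vary between web-vitals versions; pick what you need.
    target: attribution?.interactionTarget,
    url,
    sessionId,
  });
}

function reportINP(data) {
  // sendBeacon survives page unloads, which matters because the final
  // INP value is often reported late in the page lifecycle.
  navigator.sendBeacon(
    '/analytics/inp',
    buildINPPayload({ ...data, url: location.pathname })
  );
}
```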

All looks great, but as always, there is a catch...

Considerations

While starting to gather RUM data, I wanted to implement this on our own website. While setting it up and logging values to the console, I found a couple of things to consider:

Browser support

Measuring INP requires the PerformanceEventTiming API, which is not supported in Safari, meaning there is no way to measure INP there. In the Netherlands, that excludes around 24% of your users from your results. I do think that if you collect enough data, you will still capture a good understanding of your website's overall responsiveness.
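Given the support gap, it is worth feature-detecting the Event Timing API before wiring up any INP reporting, so unsupported browsers simply skip it. A sketch (the global object is a parameter only to make the helper easy to test; the function name is our own):

```javascript
// Return true when the browser exposes PerformanceEventTiming with
// interaction support, i.e. when INP can actually be measured.
function supportsEventTiming(globalObj = globalThis) {
  const PET = globalObj.PerformanceEventTiming;
  return Boolean(PET && 'interactionId' in PET.prototype);
}
```

In the browser you would call `supportsEventTiming()` with no arguments and only register the observer when it returns `true`.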

It's also worth noting that the other Core Web Vitals, such as Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS), rely on web APIs that Safari does not support either. As a result, collecting RUM data for these metrics via these APIs is largely limited to Chromium-based browsers.

A note on single-page applications or hybrid applications

Our website is built with Nuxt, and some of you may be familiar with Next.js, which is wildly popular among web developers. Frameworks like these typically serve a fully rendered HTML page on first load, but then function as a single-page application (SPA) for subsequent pages. This behaviour is known as soft navigation: the application only loads the additional resources needed for the next page and rewrites the URL in the browser. All performance metrics are measured relative to the top-level page navigation, so with soft navigations the metrics won't reset per page, making it challenging to obtain Core Web Vitals RUM data per page.

The future looks bright, though! To solve these challenges, a soft navigation specification is being worked on. Additionally, it will soon be possible to report soft navigations using performance observers. It will look something like this, kindly borrowed from Experimenting with measuring soft navigations:

const observer = new PerformanceObserver(console.log);
observer.observe({ type: "soft-navigation", buffered: true });

Note that there is also a `soft-navs` branch in the web-vitals repo that already implements reporting soft navigations.

Conclusion

At the time of writing, collecting RUM data with the web APIs intended for it is harder than you might think, especially when using popular hybrid-rendering web frameworks. It is also important to note, when looking at RUM data collected with these APIs, that it does not tell the whole story: it probably does not take client-side routed page visits into account.

If you wish to collect RUM data around INP today and your application uses soft navigations, I recommend not focusing on INP at the page level. Instead, define a user journey on your website that is crucial to both you and your users and involves extensive JavaScript. Measure INP for that user journey instead. This approach shifts the focus from Core Web Vitals scores to the most critical user journeys within your web application, which, in my opinion, is a better strategy anyway. Concentrating on uninterrupted user journeys will naturally lead to improved Core Web Vitals, and now we know how to measure them effectively!
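To make the journey-focused approach concrete, one option is to only report INP while the user is on a page belonging to the journey. A hypothetical sketch; the journey name and page paths below are made-up examples:

```javascript
// Pages that make up the critical journey we want to monitor.
const CHECKOUT_JOURNEY = ['/cart', '/checkout', '/payment'];

// Decide whether a given pathname belongs to the journey.
function inJourney(pathname, journeyPages = CHECKOUT_JOURNEY) {
  return journeyPages.some((page) => pathname.startsWith(page));
}

// Wired into the earlier onINP callback, it would look like:
// onINP(({ value, attribution }) => {
//   if (inJourney(location.pathname)) {
//     reportINP({ value, attribution, sessionId, journey: 'checkout' });
//   }
// }, { reportAllChanges: true });
```

Aggregating by the journey tag instead of by URL then gives you a responsiveness score for the flow you actually care about.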
