Website performance is important for several reasons, from user experience to search ranking. Being able to measure, collect, and review meaningful performance metrics is the first step in verifying that your website is performing as expected, and can help you figure out if you are providing a good user experience to your site visitors.
Before getting into the details about how and what to measure, make sure that you have familiarized yourself with the basics of website performance.
Before talking about the different types of performance and UX metrics that can be measured and collected for websites, it's important to differentiate between the various sources of performance data.
We will group these into two categories:

- Lab data: metrics collected by synthetic testing tools, under controlled conditions
- Field data: metrics collected from real users as they visit your site
Which one(s) should you use? We answer that in a bit, after we talk about the types of performance metrics.
There are many, many metrics and attributes relating to page loading and website performance that can be measured. Here's a partial list:
- DNS Lookup time
- TCP Connection time
- TLS Handshake
- Server Response time
- Time to First Byte (TTFB)
- Download time
- Network connection speed
- Time to Last Byte
- Start Render time
- Page Load Event
- Speed Index
- First Input Delay
- Total Blocking Time
- Number of Requests
- Largest Contentful Paint
- DOM Content Loaded
- First Contentful Paint
- Cumulative Layout Shift
- Bytes Downloaded
- Browser cache hit %
- DOM Interactive
- First Paint
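Several of the network-level metrics in this list can be derived directly from the browser's Navigation Timing API. Here is a minimal sketch; the field names follow the standard `PerformanceNavigationTiming` entry, but the helper function and the way the numbers are grouped are just one possible breakdown:

```javascript
// Derive a handful of the metrics above from a PerformanceNavigationTiming
// entry. All values are milliseconds relative to navigation start.
function navigationBreakdown(nav) {
  return {
    dns: nav.domainLookupEnd - nav.domainLookupStart,
    tcp: nav.connectEnd - nav.connectStart,
    // secureConnectionStart is 0 for plain-HTTP pages, so guard against it
    tls: nav.secureConnectionStart > 0
      ? nav.connectEnd - nav.secureConnectionStart
      : 0,
    ttfb: nav.responseStart - nav.startTime,       // Time to First Byte
    download: nav.responseEnd - nav.responseStart, // Download time
    domContentLoaded: nav.domContentLoadedEventEnd - nav.startTime,
    pageLoad: nav.loadEventEnd - nav.startTime,    // Page Load Event
  };
}

// In a browser you would feed it the real entry for the current page:
// const [nav] = performance.getEntriesByType('navigation');
// console.log(navigationBreakdown(nav));
```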
But which ones are important?
If we focus our concern on actual human visitors and user experience, we can distill this list down quite a bit.
Google has defined a list of web vitals on their web.dev site, including a set of Core Web Vitals.
We will cover web vitals and Core Web Vitals in more detail in another post. But next, let's continue our discussion regarding lab data vs. field data.
Google's web.dev site has a page about this called Getting started with measuring Web Vitals.
As we discussed in our post covering website performance basics, you almost always want to focus on performance and speed as it relates to actual site visitors.
In this regard, field data, or metrics collected from real users, should be used to help you understand what kind of performance actual visitors to your site are likely experiencing.
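Individual field samples only become meaningful in aggregate. Google evaluates Core Web Vitals at the 75th percentile of page loads, so a minimal aggregation step over collected field data looks something like this (the LCP sample values below are invented for illustration):

```javascript
// Compute the p-th percentile of a set of field samples.
// Google assesses Core Web Vitals at p = 75.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, idx)];
}

// Invented LCP samples (ms) as they might arrive from real visitors
const lcpSamples = [1200, 1800, 2100, 2600, 3400, 4100, 1500, 2200];
console.log('p75 LCP:', percentile(lcpSamples, 75), 'ms');
```

The point of the p75 cut is that a metric has to be good for the *majority* of visits, not just for a lucky fast one.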
Synthetic testing tool results should never be used as a goal or benchmark in themselves.
Starting with a tool like GTmetrix, making the scores better, and then considering yourself "done"
=
Declaring premature victory, while worshiping a false God
Testing tools are meant to be used to understand page performance for a specific number of page loads, under very specific conditions.
Before you can use the tools effectively, you must first understand the basics of website performance thoroughly.
The Chrome UX Report, also known as CrUX for short, is "a dataset that reflects how real-world Chrome users experience popular destinations on the web".
Google has a lot of useful information right on this one page:
Understanding what the Chrome UX Report is, what it contains, and how the data gets there is one of the most important pieces for understanding how Google actually collects and uses Core Web Vitals as part of its Page Experience ranking signal.
If someone makes claims about Core Web Vitals (CWV) or how Google views website performance, checking whether they know about and understand the Chrome UX Report is one of the easiest ways to verify that they actually know what they are talking about.
PageSpeed Insights, SiteDistrict Web Vitals dashboard, and more ...