
Lighthouse

6 mins read

Why Do Lighthouse and PageSpeed Insights Performance Scores Keep Changing? Understanding Score Variance

Shubham Saurabh
May 06, 2023

The Overview

I have often seen PMs and devs chase a perfect Lighthouse score, and I have heard many devs say:

  • The Lighthouse performance score is different every time; there is no way to debug it because there is no consistency.
  • PageSpeed Insights and Lighthouse performance scores are not the same.
  • My webpage's Lighthouse score has changed even though I have not written a single line of code.
  • Different platforms like Auditzy, GTmetrix, WebPageTest, etc. give different results; I am not sure which one to trust.

Do any of the above statements sound familiar? If yes, do not worry; in this blog I will explain, at a core level, why Lighthouse performance scores vary.

As per Google, 

“Lighthouse performance scores tend to change due to the inherent variability in web and network technologies, even if there hasn't been a change to the page.”

Now let's dive deep to understand the above statement in detail.

There are two ways to run a website performance audit:

  1. Google Lighthouse (for Lab Data)
  2. PageSpeed Insights (for Field & Lab Data)

Google Lighthouse is widely used in two ways:

  • Use Lighthouse as a Chrome/Firefox DevTools extension to extract website performance scores. The catch is that Lighthouse will use your device's computation power, your current network, and your current location to establish the connection with the server hosting the URL you want to audit. This leads to varying performance scores across multiple audit runs, because your network conditions and device load can change between runs. You also cannot save your data programmatically, and repeated manual runs quickly become frustrating.
  • Use the Node CLI to run headless Chrome and Lighthouse. This is the most recommended way to perform audits. But there is a catch: this route will cost you money, as you will need to set up your own compute server (8 GB RAM, 4-core CPU recommended). Once your VM is set up, you should be able to extract information with minimal performance variability compared to the DevTools Lighthouse extension. Here you can run audits with your chosen simulated/emulated device and network conditions, and the test location will be the location of the hosted compute server (e.g., Mumbai, Texas, London); a minimal programmatic sketch follows the note below.

Please note that at any given time you should run only one audit per machine; running multiple audits in parallel is not recommended because it will skew the results for all the URLs.
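To make the Node CLI route concrete, here is a minimal sketch of a programmatic run using the lighthouse and chrome-launcher npm packages. The option names and result fields below reflect my understanding of their Node API; treat the exact shapes as assumptions and verify them against the docs for your installed version.

```ts
// Minimal sketch: one Lighthouse performance audit via headless Chrome on a dedicated server.
// Assumes the `lighthouse` and `chrome-launcher` npm packages are installed.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function audit(url: string): Promise<number | null> {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  try {
    const result = await lighthouse(url, {
      port: chrome.port,
      output: 'json',
      onlyCategories: ['performance'],
    });
    // Lighthouse reports the category score on a 0-1 scale.
    return result?.lhr.categories.performance.score ?? null;
  } finally {
    await chrome.kill();
  }
}

// One audit at a time; never run audits in parallel on the same machine.
audit('https://example.com').then((score) =>
  console.log(`Performance score: ${score === null ? 'n/a' : Math.round(score * 100)}`)
);
```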

In-Depth Understanding of PageSpeed Insights

On the other hand, PSI gives you Field Data (Core Web Vitals) and Lab Data for any URL.

In the background, PSI uses the same Node CLI headless Chrome Lighthouse flow to extract the Lab Data for any given URL, in both Mobile and Desktop mode.

As of 5th May 2023, the configuration for Mobile audits is:

  • Device: Moto G Power
  • Network: Slow 4G (150 ms TCP RTT, 1.6 Mbps simulated)
  • Location: Asia (this is a pain point)

For Desktop, the configuration is:

  • Device: Simulated Desktop (Screen emulation: 1350x940, DPR 1)
  • Network: Broadband 4G (40 ms TCP RTT, 10 Mbps throughput, Simulated)
  • Location: Asia (this is a pain point)
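If you run Lighthouse yourself, you can approximate PSI's mobile profile with settings like the sketch below. The rttMs and throughputKbps values mirror the numbers listed above; the CPU slowdown multiplier and the exact field names are assumptions based on Lighthouse's simulated-throttling settings, so verify them against your Lighthouse version.

```ts
// Sketch: Lighthouse settings approximating PSI's published mobile profile.
// rttMs and throughputKbps come from the configuration listed above;
// cpuSlowdownMultiplier: 4 is an assumption matching Lighthouse's default mobile preset.
const psiLikeMobileSettings = {
  formFactor: 'mobile' as const,
  throttlingMethod: 'simulate' as const,
  throttling: {
    rttMs: 150,             // Slow 4G round-trip time
    throughputKbps: 1638.4, // ~1.6 Mbps simulated throughput
    cpuSlowdownMultiplier: 4,
  },
};

// Passed as the config argument when calling lighthouse(url, flags, config):
const psiLikeConfig = {
  extends: 'lighthouse:default',
  settings: psiLikeMobileSettings,
};
```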

The Challenge with PageSpeed Insights

  • The test location cannot be chosen for either Desktop or Mobile, which is a big issue. My website might be hosted in India, so ideally I want to check its performance from an Indian server location; if the test instead runs from a distant server somewhere in Asia, a massive (and misleading) performance drop will show up. Hence PSI should not be considered a source of truth for lab performance data.
  • You cannot control the network speed or the device with PSI, because it uses one standard configuration for all runs. This is fine, but what if I want to check my website performance on a simulated/emulated iPhone over a 5G network? That is simply not possible with PSI.
  • You cannot store audit data from before and after your last optimization, so you must jump between exported reports to understand what changed; there are no data-driven analytics. (A hedged workaround using the PSI API is sketched below.)
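One workaround for the missing history is to call the PSI API yourself and archive the raw JSON for later comparison. This is a sketch only: the runPagespeed v5 endpoint and response fields are my reading of the public PSI API, Node 18+ is assumed for the built-in fetch, and the output file name is arbitrary.

```ts
// Sketch: fetch a PSI report and archive the raw JSON so before/after runs can be diffed later.
import { writeFile } from 'node:fs/promises';

async function archivePsiReport(url: string, strategy: 'mobile' | 'desktop' = 'mobile') {
  const endpoint =
    'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
    `?url=${encodeURIComponent(url)}&strategy=${strategy}&category=performance`;
  const response = await fetch(endpoint);
  const report = await response.json();

  // Lab score (0-1) from the embedded Lighthouse result.
  const labScore = report.lighthouseResult?.categories?.performance?.score;
  console.log(`${strategy} lab performance score:`, labScore);

  // Keep the full JSON for later, data-driven comparison.
  await writeFile(`psi-${strategy}-${Date.now()}.json`, JSON.stringify(report, null, 2));
}

archivePsiReport('https://example.com');
```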

When to Trust PageSpeed Insights

PageSpeed Insights is good for tracking Core Web Vitals. The source of this data is CrUX (the Chrome User Experience Report). CrUX collects the CWV data sets when the user is logged in to Chrome while browsing your website. This means:

You will not have CWV data for users coming from a Chrome incognito window or from any other browser.

But you can still trust this data, because the majority of internet traffic still comes from Chrome. To measure CWV across all other browsers, you must set up your own RUM (Real User Monitoring).

Also, CrUX does provide historical CWV data for any URL over the previous 6 months, which you can fetch via its APIs, but this history is not surfaced in PSI. You can check the Core Web Vitals history of any URL with Auditzy, for free! (A hedged sketch of the API call is below.)
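For the historical data mentioned above, the CrUX History API can be queried directly with a Google API key. The endpoint, request body, and response shape below are assumptions based on my reading of the public CrUX documentation, so double-check them before relying on this.

```ts
// Sketch: fetch ~6 months of Core Web Vitals history for a URL from the CrUX History API.
const CRUX_HISTORY_ENDPOINT =
  'https://chromeuxreport.googleapis.com/v1/records:queryHistoryRecord';

async function fetchCwvHistory(url: string, apiKey: string) {
  const response = await fetch(`${CRUX_HISTORY_ENDPOINT}?key=${apiKey}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      url,
      formFactor: 'PHONE',
      metrics: ['largest_contentful_paint', 'cumulative_layout_shift', 'first_input_delay'],
    }),
  });
  const data = await response.json();
  // Each metric carries a p75 time series, one point per collection period.
  console.log(JSON.stringify(data.record?.metrics, null, 2));
}
```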

Other Factors That Can Vary the Google Lighthouse Performance Score Across Multiple Audit Runs

Some more factors can impact website performance and produce variance across multiple runs:

  • Third-party scripts sometimes play a substantial role; for example, if a script fires only when some condition is met during page load, the Lighthouse performance score will vary.
  • Page nondeterminism, e.g., an A/B test that changes the layout and assets loaded, or a different ad experience based on campaign progress. This is an intentional and irremovable source of variance.
  • Local network variability happens when you are running Lighthouse from the DevTools extension. Local networks have intrinsic variability from packet loss, variable traffic prioritization, and last-mile network congestion. Users with cheap routers and many devices sharing limited bandwidth are usually the most susceptible to this.
  • Client resource contention can also cause variance: if the client (the machine running the LH audit) is running multiple applications at the same time the audit is triggered, the device's computation power gets shared and the measured performance varies.
  • Browser nondeterminism also contributes to variability, because browsers have inherent variability in how they execute tasks, which affects how pages load. This is unavoidable with DevTools throttling, as it simply reports whatever the browser observed.
  • Lighthouse version mismatch can also cause score variance; this happens when you compare reports for the same URL across platforms like Auditzy, GTmetrix, WebPageTest, etc. Always trust the report generated by the latest Lighthouse version, because the scoring algorithm keeps changing, and comparing data from older LH versions can be misleading. (You can check which version produced a report as sketched below.)
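As a quick sanity check before comparing reports, the Lighthouse version is recorded in the report JSON itself. A small sketch, assuming a saved LHR file named report.json and the lighthouseVersion/fetchTime fields as I recall them from the LHR format:

```ts
// Sketch: check which Lighthouse version produced a saved report before comparing scores.
import { readFile } from 'node:fs/promises';

const lhr = JSON.parse(await readFile('report.json', 'utf8'));
console.log('Generated with Lighthouse', lhr.lighthouseVersion, 'on', lhr.fetchTime);
```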

How to Deal With Lighthouse Performance Score Variance

As we have seen, there is no 100% correct solution, because many factors can cause performance variance. But there are ways to minimize it:

  • Run Lighthouse on adequate hardware: a dedicated machine with 8 GB RAM and a 4-core CPU will produce minimal variance compared to under-powered or shared machines.
  • Isolate external factors like third-party code, network variance, and server variance, because you should not blame Lighthouse for performance variance when the culprit is someone else :)
  • Run Lighthouse at least 5 times to build a data sample for one Visitor First Persona (VPF). This will help you understand the performance trend. (A median-of-runs helper is sketched after this list.)
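A minimal sketch of the "at least 5 runs" advice, reusing the hypothetical audit() helper from the first snippet and reporting the median score:

```ts
// Sketch: run N sequential audits for one Visitor First Persona and take the median score.
async function medianScore(url: string, runs = 5): Promise<number | null> {
  const scores: number[] = [];
  for (let i = 0; i < runs; i++) {
    const score = await audit(url); // sequential on purpose; never in parallel
    if (score !== null) scores.push(score * 100);
  }
  if (scores.length === 0) return null;
  scores.sort((a, b) => a - b);
  return scores[Math.floor(scores.length / 2)];
}

medianScore('https://example.com').then((median) =>
  console.log(`Median performance score over 5 runs: ${median ?? 'n/a'}`)
);
```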

The Way Forward

As passionate performance enthusiasts, we have faced similar challenges in the past while optimizing websites and have spent an ample amount of time chasing scores. But once we understood Lighthouse's core algorithm and why performance scores vary, we started building Auditzy and spreading the knowledge about Visitor First Personas (VPF).

Auditzy is built on top of Lighthouse and complements it with a pool of features that are not available in LH. It also uses adequate, dedicated hardware for audits and never runs two LH audits in parallel on the same machine, which keeps variance across multiple runs minimal.

Auditzy provides a spectrum of 40+ devices, 10+ networks, and 13+ global locations from which to measure your website performance. Score variance is minimal, but we still suggest at least 5 runs to understand the performance of any URL.

We have a pool of features you can explore to minimize manual auditing and monitoring processes.

You can start your 14-day free trial; we do not ask for credit card information.

Use Auditzy first-hand to experience the power of professional website performance management.

Ref: Read the official Lighthouse documentation on performance variance.

About the Author
Shubham Saurabh
Founder & CEO at Auditzy