Improving website speed is always a pain. With LLMs like ChatGPT and Claude in the picture, it is much easier to figure out how to improve a website's performance, but I assure you this blog is more convenient, as it contains a full guide to improving your webpage score. The special thing about this blog is that I have organized all the techniques in a specific order, prioritizing the ones where the effort is minimal and the improvement is substantial. I have curated these techniques over 7 years of experience in frontend development, implemented them on my own projects, and achieved fast page load times, even a perfect Lighthouse score in a few cases.
About Website Performance
Before we proceed, it is crucial to understand how the performance score is weighted.
The 100 points of the Lighthouse performance score are distributed across the following metrics:
- TBT (Total Blocking Time) has a weightage of 30 points
- CLS (Cumulative Layout Shift) has 25 points
- LCP (Largest Contentful Paint) has 25 points
- FCP (First Contentful Paint) has 10 points
- SI (Speed Index) has 10 points
If you want to know about each metric in detail, you can read Google's official documentation on Lighthouse scoring. Let's cut to the chase and learn how to improve website speed.
Optimizing Images
Images, being heavy and prominent, are an integral part of most websites. Not handling images well can easily hurt your webpage performance. Below are the things you need to keep in mind:
Image Compression
Large images take more time to load, which impacts page performance. Compressing them can significantly reduce their size and noticeably improve page speed. Image compression is of two types:
- Lossy Compression (reduces file size by discarding some image data, at a slight cost in quality)
- Lossless Compression (reduces file size without any loss of quality)
You should prefer lossy compression, as it significantly reduces the size of an unoptimized image without any noticeable quality loss. You can compress images online for free using the TinyPNG tool. It is a lossy compression tool that can easily bring a 100 KB image down to around 20 KB.
Using CDNs
CDN stands for Content Delivery Network: a network of geographically distributed servers that cache a copy of your images and serve each user from the server nearest to them. This way the images are delivered quickly without putting load on your origin server. You can use Cloudflare to set up a free CDN for your web application without any hassle.
Using WebP/AVIF Images
Image formats like JPG and PNG are larger in size, which takes time to load and slows down your page. Using next-gen image formats such as WebP and AVIF can be beneficial, as they are up to 50% smaller than traditional PNGs and JPEGs. You can convert PNG to WebP and JPG to WebP using this converter.
Lazy Load Images
Images that are not visible in the initial viewport don't need to load immediately. We can lazy load such images so that the browser only fetches them when the user scrolls near them, saving bandwidth. You can use the code below to implement lazy loading:
```html
<img loading="lazy" src="image.jpg" alt="..." />
```
Using Picture Tag
You should prefer the <picture> tag to load images. It provides the flexibility to serve lower-resolution images to smaller screens, and it can also load a fallback image if the primary image format is not supported.
To serve lower-resolution images on smaller devices, you can use the code below:
```html
<picture>
  <source media="(max-width: 799px)" srcset="elva-480w-close-portrait.jpg" />
  <source media="(min-width: 800px)" srcset="elva-800w.jpg" />
  <img src="elva-800w.jpg" alt="Chris standing up holding his daughter Elva" />
</picture>
```
To fall back to another format when WebP/AVIF is not supported by the browser, use the code below:
```html
<picture>
  <source srcset="image.webp" type="image/webp">
  <img src="image.png" alt="Alternative text">
</picture>
```
Loading the Required Images on Priority
This is particularly important if you are using Next.js: Next.js's Image component lazy loads all images by default. If you lazy load images that sit inside the initial viewport, you delay the Largest Contentful Paint (LCP). So make sure you load all such images on priority.
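As a sketch on a plain HTML page (hero.jpg is a placeholder file name), you can tell the browser to fetch the above-the-fold image eagerly and at high priority; in Next.js, the equivalent is passing the priority prop to the Image component:

```html
<!-- Above-the-fold image: fetch eagerly and at high priority -->
<img src="hero.jpg" fetchpriority="high" loading="eager" alt="Hero banner" />

<!-- Optionally preload it from the <head> so the request starts even earlier -->
<link rel="preload" as="image" href="hero.jpg" fetchpriority="high" />
```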
Using Brotli Compression on Images
Brotli compression is usually used to compress text-based files such as HTML, CSS, and JavaScript, and it also works well on text-based image formats like SVG. Already-compressed formats such as JPEG and PNG gain much less from it, so measure before enabling it for those. You can compress assets on the fly while serving them, and the browser decompresses them automatically, reducing bandwidth and increasing page speed. I am sharing an Express server that can be used for Brotli compression.
```javascript
const express = require('express');
const shrinkRay = require('shrink-ray-current');

const app = express();

app.use(shrinkRay({ brotli: { quality: 11 }, gzip: true }));

app.use(express.static('public')); // Serve static files (including images)

app.listen(3000, () => {
  console.log('Server running on port 3000');
});
```
Fonts
I have taken a lot of interviews throughout my career, more than 100 I believe. I usually ask web page optimization-related questions of most frontend candidates, but hardly 5% of them talk about font optimization. Fonts are blocking resources that can significantly impact webpage performance, and they are often neglected. Here are some practices you need to follow to optimize your fonts:
Use WOFF2 Format
You should prefer the WOFF2 font format, as it is more compressed and efficient compared to the EOT and TTF formats. You can use a font converter to convert all unoptimized font formats to WOFF2.
Use Font Subsets
You can use tools like glyphhanger to subset fonts to only the characters you actually use on your website, which can significantly reduce file sizes. Google Fonts also serves language subsets automatically via unicode-range, and its css2 API supports a text parameter to request only specific characters (useful for logos or headings):

```css
/* Loads only the glyphs needed to render the word "Blogologer" */
@import url('https://fonts.googleapis.com/css2?family=Roboto:wght@400;700&display=swap&text=Blogologer');
```
Implement Font Display Swap
Font display swap shows fallback text until the custom font is fully loaded, preventing invisible text and improving the First Contentful Paint (FCP) score. You can refer to the code below for implementation.
```css
@font-face {
  font-family: 'MyFont';
  src: url('myfont.woff2') format('woff2');
  font-display: swap;
}
```
Preload Critical Fonts
Preload the most critical fonts to ensure they are fetched early. This speeds up the first paint of text, improving the Largest Contentful Paint (LCP) and overall performance. Use the code below to preload fonts:
```html
<link rel="preload" href="/fonts/myfont.woff2" as="font" type="font/woff2" crossorigin="anonymous">
```
Host the Fonts Locally
If you host fonts locally instead of loading them from Google or another external domain, the browser doesn't have to establish a connection to an additional domain, which saves bandwidth and improves performance. To host fonts locally, just download the font, save it in your codebase, and load it from there.
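A minimal sketch, assuming the downloaded file is saved at /fonts/myfont.woff2 inside your project:

```css
/* Self-hosted font: served from your own domain, no extra third-party connection */
@font-face {
  font-family: 'MyFont';
  src: url('/fonts/myfont.woff2') format('woff2');
  font-weight: 400;
  font-display: swap;
}

body {
  font-family: 'MyFont', Arial, sans-serif; /* system fallback while the font loads */
}
```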
Use Limited Font Weights
Using limited font weights reduces the size of the font files, improving page speed. Only load the font weights you actually use, and avoid piling up many weights on your website; 2-3 different weights are usually enough.
Use HTTP/3
You should prefer HTTP/3 over HTTP/1.1 and HTTP/2, as it establishes connections faster. HTTP/3 also multiplexes requests over a single connection, which reduces latency and round trips. It performs better on poor networks and offers better security and better streaming for real-time applications. To implement HTTP/3 and learn more about it, you can read about it on Cloudflare.
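Enabling HTTP/3 is a server- or CDN-level change. As a sketch, recent Nginx versions (1.25+) can serve HTTP/3 roughly like this (the certificate paths are placeholders):

```nginx
server {
    # HTTP/3 runs over QUIC (UDP); keep a TCP listener for HTTP/1.1 and HTTP/2 fallback
    listen 443 quic reuseport;
    listen 443 ssl;
    http2 on;

    ssl_certificate     /etc/ssl/example.com.pem;  # placeholder path
    ssl_certificate_key /etc/ssl/example.com.key;  # placeholder path

    # Advertise HTTP/3 so browsers can upgrade on subsequent requests
    add_header Alt-Svc 'h3=":443"; ma=86400';
}
```

If your site sits behind Cloudflare, you can instead simply toggle HTTP/3 on in the dashboard.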
Defer Third Party Scripts
By deferring a script, you delay its execution until the rest of the page has been parsed and loaded. This ensures the main thread won't block the rendering of the page. Prefer to defer scripts that are not required instantly, such as third-party or analytics scripts. You can use the code below to defer a script:
```html
<script src="https://third-party.com/script.js" defer></script>
```
Load Low Priority Third Party Scripts on Interactivity
You can delay loading low-priority scripts until the user interacts with the page. Website speed testing tools don't simulate user interaction, so these scripts never load during the test and you get a better performance score. I do the same for Blogologer and gain 8 additional points on the mobile performance metric. Only delay scripts that are genuinely not needed before the user interacts. Here is the code you can use to load scripts on interactivity:
```javascript
// Function to load the script dynamically
function loadScript(src) {
  const script = document.createElement('script');
  script.src = src;
  script.async = true; // Optional: load asynchronously
  document.body.appendChild(script);
}

// Event listener to detect interactivity (scroll, click, or touch)
function loadOnInteraction() {
  // Load your third-party scripts here
  loadScript('https://third-party.com/script1.js');
  loadScript('https://third-party.com/script2.js');

  // Remove event listeners after the first interaction to avoid reloading scripts
  window.removeEventListener('scroll', loadOnInteraction);
  window.removeEventListener('click', loadOnInteraction);
  window.removeEventListener('touchstart', loadOnInteraction);
}

// Add event listeners for interactivity (scroll, click, touch)
window.addEventListener('scroll', loadOnInteraction);
window.addEventListener('click', loadOnInteraction);
window.addEventListener('touchstart', loadOnInteraction);
```
Minify HTML, CSS, JS
Minifying your files reduces their size, improving load times and overall performance. It also strips unwanted content from the files, such as comments. You can use an HTML minifier to minify HTML files and similar tools to minify CSS and JS, or automate minification in your build with a bundler such as Webpack. Below is a Webpack-based implementation:
- Install Webpack:
```shell
npm install --save-dev webpack webpack-cli html-webpack-plugin css-minimizer-webpack-plugin terser-webpack-plugin
```
- Create a `webpack.config.js`:
```javascript
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const CssMinimizerPlugin = require('css-minimizer-webpack-plugin');
const TerserPlugin = require('terser-webpack-plugin'); // Minifies JS

module.exports = {
  entry: './src/index.js',
  output: {
    filename: 'bundle.js',
    path: path.resolve(__dirname, 'dist'),
  },
  module: {
    rules: [
      {
        test: /\.css$/i,
        use: ['style-loader', 'css-loader'],
      },
    ],
  },
  optimization: {
    minimizer: [
      new TerserPlugin(), // Minifies JS
      new CssMinimizerPlugin(), // Minifies CSS
    ],
  },
  plugins: [
    new HtmlWebpackPlugin({
      template: './src/index.html',
      minify: {
        collapseWhitespace: true,
        removeComments: true,
      },
    }),
  ],
};
```
- Run Webpack:

```shell
npx webpack --mode production
```
Eliminate Render Blocking Resources
Eliminating render-blocking resources helps improve both FCP and LCP. The goal is to remove or defer any script or stylesheet that can block HTML parsing and slow down the page. Below are the key techniques for the various render-blocking resources:
Eliminating Blocking JS
You can use async or defer to download JavaScript without blocking HTML parsing. Async scripts are executed as soon as they are loaded, without caring about the execution order:

```html
<script src="script.js" async></script>
```
Defer makes sure the scripts are executed only after HTML parsing is complete:

```html
<script src="script.js" defer></script>
```
Eliminating Blocking CSS
To eliminate blocking CSS, inline the critical CSS instead of fetching it in a separate file, and load the rest asynchronously. Follow the process below:
- Code to inline the critical CSS
```html
<style>
  /* Inline Critical CSS */
  body { font-family: Arial, sans-serif; margin: 0; }
  /* Other critical styles for above-the-fold content */
</style>
```
- Load the CSS asynchronously
```html
<link rel="stylesheet" href="style.css" media="print" onload="this.media='all'">
```
Reduce Unused CSS, JS
Reducing unused CSS and JS means loading and parsing less data, which results in faster page loads. There are multiple ways to do it.
Using Code Coverage Tool
Code coverage tools help you identify unused CSS and JS so you can eliminate it. Below are the steps to use the code coverage feature in Google Chrome:
- Press Ctrl + Shift + I (Windows/Linux) or Cmd + Option + I (Mac) to open Chrome DevTools.
- Open the Command Menu (Ctrl/Cmd + Shift + P), type "Show Coverage", and press Enter.
- Click the record button and reload the page to record the code coverage.
Purge The Unused CSS
If you use Tailwind, its build already purges unused CSS for you. Otherwise, you can use PurgeCSS to do it manually. Below are the steps to use it:
- Install PurgeCSS:

```shell
npm install purgecss --save-dev
```
- Create a `purgecss.config.js`:
```javascript
module.exports = {
  content: ['./src/**/*.html', './src/**/*.js'], // Specify HTML and JS paths
  css: ['./src/css/style.css'], // Path to your CSS file(s)
  output: './dist/css', // Output folder for the cleaned CSS
}
```
- Run PurgeCSS:

```shell
npx purgecss --config ./purgecss.config.js
```
Use Tree-Shaking
Tree shaking removes unused code from your JavaScript bundle. You can use Webpack to tree-shake unused code automatically. Below is a Webpack config that enables tree shaking:
```javascript
module.exports = {
  mode: 'production', // Ensures tree-shaking works
  entry: './src/index.js',
  output: {
    filename: 'bundle.js',
    path: __dirname + '/dist',
  },
  optimization: {
    usedExports: true, // Enable tree-shaking for used exports
  },
};
```
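Tree shaking only works on ES modules (import/export syntax), and Webpack can prune more aggressively if you declare your package free of import-time side effects in package.json. A sketch, assuming none of your modules rely on such side effects:

```json
{
  "name": "my-app",
  "sideEffects": false
}
```

If some files do have side effects (for example, imported global CSS), list them instead, e.g. "sideEffects": ["*.css"].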
Code Splitting in JavaScript
Code splitting breaks your bundle into smaller chunks that can be loaded on demand instead of shipping all the code in the initial load. This reduces the JavaScript the browser has to download and parse upfront, boosting page speed. You can use Webpack for code splitting.
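In Webpack, any dynamic import() call becomes a split point: the imported module is emitted as a separate chunk and fetched only when the call actually runs. Below is a runnable sketch of the pattern; the chart module and its path are hypothetical, so the "chunk" is simulated with an async factory, and a small memoizing helper ensures it is requested only once:

```javascript
// Memoize a dynamic loader so the split chunk is fetched at most once,
// no matter how many times the feature is requested.
function lazyOnce(loader) {
  let pending;
  return () => (pending ??= loader());
}

// With Webpack this would be: const loadChart = lazyOnce(() => import('./chart.js'));
// Here an async factory stands in for the hypothetical chunk.
const loadChart = lazyOnce(async () => ({
  renderChart: (target) => `chart rendered into ${target}`,
}));

// The chunk loads on first demand, e.g. inside a click handler.
loadChart().then(({ renderChart }) => {
  console.log(renderChart('#chart-root'));
});
```

Because the loader is memoized, repeated calls reuse the same in-flight promise, which is exactly how a code-split chunk behaves: one network fetch, many consumers.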
Conclusion
I have not covered browser caching and CDN caching in this blog, as I have already talked about them in my existing blog on setting up Cloudflare caching. I have covered all the crucial aspects of webpage optimization here. You can also use Webpack Bundle Analyzer to find which chunks/modules take up significant space and can be removed or replaced with alternatives. Always remember that optimization is an ongoing process: the tools out there keep evolving, and you will have to stay up to date to ensure you are following the best practices. By improving your webpage speed, you will not only improve the user experience but also help your search engine optimization.