Proper websites, done properly

Top website performance tips

15-minute read time.

In the grand scheme of things, it really boils down to money. The faster your site, the more chance you'll make money off it. The lighter and smaller it is to download, the less it costs to send that data to people, saving both you and your customers money.

Why is performance so important?

The better the performance of your site, the higher your search engine rankings on Google will be. Which in turn should mean more visitors, and more chance of conversions to make you more money. I've previously written about why website performance is so important if you want to know more.

Let us also not forget that the smaller your site's footprint, the better it is for the environment too.

Priorities

Element order

It might seem such a random thing to get right, but getting your <head> in order can make a big difference in page rendering time. And since this is such a simple thing to follow, you should just do it on every site and not worry about it.
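As a rough sketch of what I mean (the asset paths are placeholders): encoding and viewport first, then the title, then preload hints, then render-blocking CSS, with scripts last and deferred:

```html
<head>
  <!-- Encoding and viewport first, so the parser never has to restart -->
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Page title</title>
  <!-- Preload hints before the assets that use them -->
  <link rel="preload" href="/css/site.css" as="style">
  <!-- Render-blocking CSS next -->
  <link rel="stylesheet" href="/css/site.css">
  <!-- Scripts last, deferred so they don't block parsing -->
  <script src="/js/index.js" defer></script>
</head>
```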

Preload

Assets you know you're going to need as quickly as possible should be preloaded in the <head>. This should include your base CSS, and any important font files you're using, at the least. If you preload an asset that isn't actually used within the first few seconds of the page loading, the browser will log a warning in the dev console telling you that you probably shouldn't be preloading that particular asset.

On this site, I currently preload the CSS, JavaScript (mostly because the JS for the site is under 7KB on pages that don't need syntax highlighting) and the variable font I'm using.

<link rel="preload" href="/css/site.css" as="style">
<link rel="preload" href="/js/index.js" as="script">
<link rel="preload" href="/fonts/rubik-variable.woff2" as="font" type="font/woff2" crossorigin>
The preload elements are additional, so remember that you'll still need to add your usual stylesheet and script tags as well.

Fetchpriority

If you're loading any images high up on the page - such as the first item in a carousel, your main hero image, the first image on a product's detail page, or perhaps an <img> for your logo - then you'll want to add fetchpriority="high" to the most important one.

All Chrome-based browsers and Safari will assign a higher priority to that item and attempt to load it sooner. This is currently the best use case for fetchpriority, but it's also possible to assign a priority to <link> and <script> tags, provided you're sensible about how you use them.
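For example (the image path here is hypothetical), bumping the priority of a hero image looks like this:

```html
<img src="/images/hero.avif" alt="Main product photograph"
     width="1200" height="600" fetchpriority="high">
```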

Lazy loading

This is almost the opposite of using fetchpriority; any image that isn't the most important one on the page, or that's likely to sit below the mythical 'page fold', should probably have a loading="lazy" attribute set on it.

This allows the browser to de-prioritise those elements and concentrate on other, more important assets first.
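A quick sketch of the two attributes working together (paths hypothetical): the hero keeps its high priority, while everything further down the page is left for later:

```html
<!-- The most important image: fetch it early -->
<img src="/images/hero.avif" alt="Hero image"
     width="1200" height="600" fetchpriority="high">

<!-- Below the fold: the browser can defer these until they're nearly in view -->
<img src="/images/gallery-1.avif" alt="Gallery photo"
     width="600" height="400" loading="lazy">
```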

Inline important icons and logos

Speaking of 'above the fold' elements - or as I'm calling them, Initial Viewport Priority Elements (IVPE) - sometimes you need to switch up how you add icons and images to your pages. One of the first images you're likely to see on any website is the logo, and if this can be changed to an inline SVG element without negatively impacting overall page weight, you should probably go ahead and do that.

Using an inline SVG means that it's downloaded as part of the HTML payload, so no additional request to the server is needed to get the image once the HTML has been fetched. This is the most performant way to do it, although not really the prettiest.
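A minimal sketch of the idea, with a stand-in circle where a real logo's paths would go:

```html
<a href="/" aria-label="Home">
  <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32"
       width="32" height="32" role="img" aria-label="Site logo">
    <!-- A real logo's paths would go here -->
    <circle cx="16" cy="16" r="14" fill="currentColor"/>
  </svg>
</a>
```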

However, not all logos will work well as an SVG, and sometimes an optimised PNG image will be a smaller filesize. If this is the case, adding it as an <img> with fetchpriority="high" set might be the better option.

Sometimes, you might want to use a background image via CSS instead, and that's OK too - but it's the least performant of the three options, even if it is the most elegant, and possibly the most accessible, way to do it. I wrote a bit about inline SVGs versus CSS background images if you want to know a bit more.

Page weight

For me, the most important metric for a page is its overall size, or as it's known: page weight. This is the total size of the entire page once rendered, including all CSS, HTML, images, JavaScript, video, audio, or anything else that might be requested during the load of a page.

Over the years, pages have gotten larger and larger, to the point where the homepage of a world-famous brand (not naming any names) was over 15MB! Not only was it huge, it was slow. So very slow.

This is down to many different reasons: webfonts are used more, images are bigger and of better quality, more video content is consumed, more adverts, more third party scripts, more overall JavaScript per page than ever before, and, most commonly: people are loading entire libraries of CSS and JavaScript when they're only using a small fraction of it.

Sometimes these things aren't even needed, but it's so commonplace to see them used now that it's almost a given that you'll use them regardless. This is just wrong!

Keeping your page weight as low as possible is the healthiest thing you can do, and will keep your site at its most performant. Reducing the amount of JavaScript you use is highly recommended - your page is only as fast as your slowest JavaScript. And don't forget, not everybody has JavaScript enabled anyway, so making sure your pages mostly function without it is a great way to be more inclusive as well.

Agreeing and sticking to a performance budget can help you keep your eye on the ball and not fall foul of the heavy page burden.

DOM size

If you run a Lighthouse audit, a PageSpeed Insights report, or gather any sort of Google Core Web Vitals metrics, your pages will be tested for an excessive DOM size.

Google notes that an excessive DOM size is:

  • Approximately 1,500 nodes or higher
  • A tree depth of more than 32 elements
  • More than 60 children per parent element

To uncomplicate this, what they're saying is that you should be writing as little HTML as possible to get the job done. Stop nesting elements unnecessarily.

Having an excessive DOM size results in slower page loading, parsing and rendering times, and increased bandwidth use for both you and your site visitors. A smaller DOM saves money as well as being quicker.
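If you want a quick look at your own numbers, here's a small helper you can paste into the browser console - this is my own sketch, not a built-in API:

```javascript
// A rough DOM-size audit. Reports the three numbers Google's
// thresholds refer to: total nodes, maximum tree depth, and
// the widest parent (most children on a single element).
function domStats(root) {
  let nodes = 0, maxDepth = 0, maxChildren = 0;
  (function walk(el, depth) {
    nodes += 1;
    maxDepth = Math.max(maxDepth, depth);
    const kids = Array.from(el.children || []);
    maxChildren = Math.max(maxChildren, kids.length);
    kids.forEach((kid) => walk(kid, depth + 1));
  })(root, 1);
  return { nodes, maxDepth, maxChildren };
}

// In a browser console: domStats(document.documentElement)
```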

Less JavaScript

This ties into the page weight issue from earlier. Less JavaScript per page is better for many reasons, but the main two are the weight it adds to the page and how much it slows your pages down. As mentioned, your pages are only as fast as your slowest JavaScript, so the less JavaScript you need to get the job done, the faster your site will be.

So often I find developers using JavaScript to provide functionality that could be handled directly in HTML, with newer HTML5 APIs that let the browser do most of the heavy lifting for you, or sometimes just with the right CSS.
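For instance, disclosure widgets and form validation - two things frequently rebuilt in JavaScript - are both built into HTML:

```html
<!-- An accordion-style disclosure widget with zero JavaScript -->
<details>
  <summary>Delivery information</summary>
  <p>Orders are dispatched within two working days.</p>
</details>

<!-- Native form validation, again with no JavaScript -->
<form>
  <input type="email" required placeholder="you@example.com">
  <button>Subscribe</button>
</form>
```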

Optimise images and fonts

Ensure you run every single JPG, PNG, WEBP, AVIF and SVG through some sort of image optimisation/compression service or plugin. Most images exported from graphics apps such as Pixelmator, Figma, Sketch, Photoshop etc. are not as optimised as they could be.

Squashing the filesizes of your images can make a huge difference to the weight of your pages. On a recent project, running all the site's images and icons through optimisation shaved 500KB off the entire site build. That is not inconsequential.

The bigger your site, the more images you're likely to have, and therefore the bigger a deal this will be. It's so easy to do that it's inexcusable not to be doing it!

There are many free services out there you can drag and drop images onto to get the basics in order. For PNG or JPG images I tend to use TinyPNG, for SVGs I use SVGOMG, for WEBP I tend to use either CloudConvert or Pixelied, and for AVIF I generally use Pixelied.

These services all offer different levels of 'free', and they sometimes output different file sizes for the same image in the same format. I have noticed that Pixelied has a better level of 'free' but tends to output larger WEBP files compared to CloudConvert.

If you're a macOS user, there's probably a tonne of free/freemium/paid-for apps that will do batch conversions and the like on the desktop. There's a handy tool I've tried called Advanced Batch Image Convertor (ABIC) which is no longer in active development, but does seem to do a very reasonable job of JPG to WEBP conversion if you need something quick and dirty.

You'll often find that the same file format isn't the smallest for every image at a reasonable quality. I tend to run photography through three conversions and keep the format with the smallest file size that doesn't look pixelated or show too many visual artefacts. Getting a WEBP, JPG and AVIF version of each to compare is a good idea.

Optimising web fonts can also give you fairly good savings too. I've written about web font performance before, and I've found that subsetting fonts is currently the best way to get the smallest file sizes for your WOFF2 fonts. Font generators like Font Squirrel allow you to subset fonts, tweak many settings, and convert to WOFF2 ready for use on your website.

I often find that running any viable font through that generator and subsetting it to just western languages (since for the most part, that's all I need!) can reduce the file sizes by a decent amount.

If you only need a very specific range of glyphs, your file sizes can be slashed. Subsetting is an excellent performance tool.
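As a sketch of what the end result might look like - the subset file name and unicode-range here are hypothetical, and yours will depend on which glyphs you actually keep:

```css
@font-face {
  font-family: "Rubik";
  /* A hypothetical subsetted build of the variable font */
  src: url("/fonts/rubik-subset.woff2") format("woff2");
  font-weight: 300 900;  /* the variable weight range */
  font-display: swap;    /* show fallback text while the font loads */
  /* Basic Latin plus common punctuation only */
  unicode-range: U+0000-00FF, U+2018-201D, U+2013-2014;
}
```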

Minify/compress

Minification and compression are an age-old tradition on the web. It's an obvious way to reduce the size of a CSS, JavaScript or HTML file: remove all the whitespace and new lines. Computers/browsers don't need those; we just format code nicely so other humans can read it more easily. As long as the syntax is valid, the formatting is irrelevant as far as your web browser is concerned.

There are many plugins out there that you can integrate into your build pipeline, and most will do a decent job. It's worth knowing just how well they perform, though, and choosing the right tool for the job. I'm currently using LightningCSS for minifying and transpiling CSS via PostCSS/Gulp.

You can go a step further with your JavaScript too (and sometimes with CSS, depending on how you've implemented your build tools) - uglification. Sometimes called 'mangling', it is a process of running your source code through a tool that will make argument, variable, and possibly function names shorter to reduce file size. This renders it hard to read for humans, but again makes no difference to your browser's JavaScript engine!
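A hand-made illustration of the effect (real tools such as Terser automate this): both functions below behave identically, but the second is a fraction of the size:

```javascript
// Readable source, as a human would write it
function calculateTotalPrice(unitPrice, quantity, taxRate) {
  const subtotal = unitPrice * quantity;
  return subtotal + subtotal * taxRate;
}

// The same function after mangling: identical behaviour, far fewer bytes
function c(u,q,t){const s=u*q;return s+s*t}
```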

Compression can be applied at the web server level too, where gzip or Brotli compression can further reduce the payloads sent from the server to the browser, which decompresses them before use.
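In nginx, for example, that's only a couple of directives - I'll show gzip here since it's built in, whereas Brotli needs the separate ngx_brotli module:

```nginx
# Compress text-based responses before sending them to the browser
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;  # don't bother with tiny payloads
gzip_comp_level 5;     # a sensible speed/size trade-off
```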