How to build a fast blog

A while ago I decided to redesign my blog and make performance a first-class citizen! This is the story of my approach and what I've learned from doing it.

Let's get straight to the point:

There isn't a one-size-fits-all solution out there.

Plus, with all the tools available to us as Front-end developers, achieving performance is a mix of research and implementation - a mix which requires something more than technical skills: curiosity & perseverance!

Countless times I would end up with a pretty good solution and almost call it done. Even though there were still things I hadn't tried - things which might improve the experience even further - I was impatient to get the result out there. In times like this I had to remind myself:

It's a marathon, not a sprint!

I'm not doing this just for my readers but for myself too, because I'm also learning and growing as a developer in the process. Win-win!

So, this is my advice to you - try and look at it as a marathon, not a sprint. I won't bullshit you, it's not easy, especially when you're holding an almost finished solution in your hand and you just wanna press deploy. But it will pay off in the long run.

The final result

Although at the end of the day this is just a blog, it's still an uber performant one. On slow 3G connections:

  • first visits require 3.5 seconds until the content is readable
  • subsequent visits reduce that time to 2 seconds
  • assets are only loaded when needed, thus saving bandwidth


Code splitting

As you know, scripts & style sheets don't arrive in the browser by magic but through the network. This takes time, depending on your user's connection. Once they're there, they need to be parsed and executed - which again takes time, depending on your user's computing power.

So the No. 1 rule - which really is no rocket science, yet a lot of Single Page Apps still ignore it - is to send just the code needed for that page - or maybe even just the above-the-fold part - and nothing more!

This is exactly what I did with this blog. Each page requires just the JS & CSS needed for itself to work. No huge bundle.js or bundle.css files. Plus the code is minified & uglified so that I truly send as few bytes as possible.

Hold on. This is a static website... What about SPA's?

What about them? You can code-split there as well - no problem. The first step would be to split at route level. If I'm on the /home route, do I need the code & styles for the /settings one? Of course not.

After that, you can split even more aggressively: let's say you use SweetAlert to show a register modal when the user clicks the register button. If you bundle SweetAlert - which is 60 KB - together with the rest of the code, you're basically sending 60 KB that are not needed upfront. That's why you should load it dynamically, only when the user clicks the button.
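The pattern boils down to: load the heavy module at most once, and only when it's first needed. Here is a minimal sketch of that idea, stubbed in plain Node - in a real webpack app the loader would be `() => import('sweetalert')`, and all the names here are illustrative, not code from this blog:

```javascript
// Run an expensive loader at most once, and only on first use.
function lazy(load) {
  let modulePromise = null; // cache, so repeat clicks reuse the same chunk
  return () => (modulePromise = modulePromise || load());
}

// Stubbed loader standing in for `() => import('sweetalert')`,
// so we can observe that it runs only on the first "click".
let loads = 0;
const loadSwal = lazy(() => {
  loads += 1;
  return Promise.resolve("sweetalert-module");
});

loadSwal(); // first click: the chunk is fetched
loadSwal(); // second click: the cached promise is reused
console.log(loads); // 1
```

With a real dynamic `import()`, webpack emits SweetAlert as a separate chunk that only crosses the network on that first call.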


Webpack

Even though this is a static website powered by PHP on the Back-end, a Front-end build process - via Webpack - was needed to speed up development & handle all the tools/techniques I'll mention here.

Its most important task is to create, for each page, a .js and a .css file. The home page, for example, will get 2 files: homePage.js and homePage.css, which will be included in the HTML of just that page - the CSS in the <head> and the JS at the end of the <body> so that it doesn't block rendering.

Apart from the minify & uglify processes, I'm also transpiling the JavaScript down to ES5, so that even Opera Mini's 100 million users can interact with this blog. I used babel-loader to achieve this.

// webpack.config.js
module: {
  rules: [{
    test: /\.js$/,
    use: "babel-loader"
  }]
}

// .babelrc
{
  "presets": [
    ["@babel/preset-env", {
      "targets": {
        "browsers": ["last 2 versions", "not dead", "> 1%"]
      }
    }]
  ]
}

Early into the project I realized that the JavaScript could be split into two:

  • some common code needed in every page - like the opening & closing of the menu
  • dedicated code only for some pages - like the scroll-on-click functionality of the services page

So I organized the JavaScript into:

  • one setup.js file, which handles the common, initializing code
  • one file per page, following the `${pageName}Page.js` naming convention, which imports the setup.js file and contains the specific code of that particular page

The setup.js file is included in pages with no custom code - like the About page, which is just text & images - while the dedicated scripts are included in their specific pages. This means I needed webpack to generate multiple smaller bundles, instead of a huge one with all the code. This is the entry point of the webpack config:

// webpack.config.js
entry: {
  setup: './src/javascript/setup.js',
  homePage: './src/javascript/pages/',
  servicesPage: './src/javascript/pages/',
  eventsPage: './src/javascript/pages/',
  articlePage: './src/javascript/pages/',
}

But, should I include the files as external scripts or actually inline them in the HTML?

Well, I tried both methods and saw no clear difference in performance. So I decided to stick with the external script way - thus allowing caching to step in and help on subsequent visits.


CSS

With the same code-splitting technique in mind, the CSS could also be split into two:

  • the common CSS needed for the common elements of every page: header, footer, menu
  • the specific CSS of each page

I used SASS - Syntactically Awesome Style Sheets - to make development easier, but the exact same methods could be applied to plain old CSS.

So, I split the code into:

  • a _commons.scss file which has - you guessed it - the CSS needed for the common elements of every page
  • one `${name}Page.scss` file per page, which has the page-specific CSS while also importing _commons.scss

Now that we've written the SASS files, it's time to put them on the page. We'll import them into the JavaScript code

 import "../../sass/pages/";

and then, by using the right loaders & plugins, we'll create the final .css files.

// webpack.config.js
module: {
  rules: [{
    test: /\.scss$/,
    use: [MiniCssExtractPlugin.loader, "css-loader", "sass-loader"]
  }]
},
plugins: [
  new MiniCssExtractPlugin({
    filename: "[name].[contenthash].css"
  }),
  new OptimizeCSSAssetsPlugin()
]

Why not CSS in JS?

Well, that would need the JavaScript to be executed first so that it applies the styles, which means the rendering will be blocked until the code has been parsed, compiled and executed. Big NO NO!

What about inlining the CSS?

Just like with the JavaScript, even when throttling the connection to slow 3G and slowing the CPU down 4x, I didn't experience any noticeable improvement. So, I decided to leave the CSS as is - in a separate file. This also means that I can leverage the browser cache - which actually negates the effect of the additional network request and provides - in my opinion - the best overall solution.


Fonts

Fonts are a pretty big deal for performance because the browser delays painting the text until the fonts have been downloaded. But the fonts aren't downloaded until they are truly needed - in other words, not until the browser encounters a CSS rule making use of that font.

This means that the download & parsing of the CSS blocks the download of fonts, which blocks painting of the text - ugh! The result, more noticeable on slow connections, is a FOUT - a flash of unstyled text.

FOUT - flash of unstyled text

But things don't have to stay like this, because through preloading we can force the browser to download the fonts in parallel with the CSS, thus painting the text faster & potentially eliminating the FOUT altogether.

 <link rel="preload" as="font" href="/dist/assets/fonts/comfortaa-v12-latin-regular.woff2" type="font/woff2" crossorigin="anonymous" />
without preload - fonts are downloaded after the CSS
with preload - fonts are downloaded in parallel with the CSS

A small caveat here: Google Fonts keeps changing the fonts' URLs, which is why I'm self-hosting the fonts - following Addy Osmani's recommendation in his Web Performance Made Easy video. This tool is a big help for that.

And, just in case the connection is super slow and the custom fonts take multiple seconds to download, I've also set the - currently experimental - font-display property to swap, so that the browser shows the default fonts while the custom ones load.

 font-display: swap;


Images

And finally we get to the last piece of the puzzle: images!

I hope you've realized by now that no technique here is rocket-science material. They are more common sense approaches, which have the user's best interest in mind. Well, the same goes for optimizing images.

The first thing I did was to make sure that the images are properly sized for their intended use case. If I load a huge, high-res image but only display it at 500 x 500 pixels, then I'm wasting my user's bandwidth and time! So, I went through all the images and resized them to 1.25x the size of their current use case. You could argue I should resize them to exactly that size, but I'd like to have some extra "space", in case I change the styling.
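The 1.25x rule above is simple arithmetic; here it is as a tiny helper (the function name is mine, used only for illustration):

```javascript
// Export images at 1.25x their largest displayed size, leaving some
// headroom in case the styling changes later.
function exportSize(displayedPx, factor = 1.25) {
  return Math.round(displayedPx * factor);
}

console.log(exportSize(500)); // 625: a 500px-wide slot gets a 625px image
```

Compared to shipping, say, a 2000px original into that same 500px slot, this is still a massive bandwidth saving.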

Then I converted them to the newer WebP format, which is a real byte-saver. It's only supported by Chrome & Opera at the moment, which is a bummer, but for those users the experience will be even better. There are a couple of webpack plugins which seem to help with this, but I couldn't get them to work, so I wrote one myself - imagemin-webp-webpack-plugin.

And in the HTML, I switched the <img> tag for the newer HTML5 <picture>. This allows browsers that support WebP to use that format, while the others gracefully fall back to JPEG or PNG.

<picture>
  <source srcset="/dist/assets/images/pava_at_jspub.webp" type="image/webp">
  <source srcset="/dist/assets/images/pava_at_jspub.jpg" type="image/jpeg">
  <img src="/dist/assets/images/pava_at_jspub.jpg" alt="Pava at JSPub">
</picture>

In total, the WebP images are8.6 MB smaller than their JPEG/PNG counterparts.

By this point the images are looking pretty good. We've saved loads of useless bytes from being sent over the network. Hooray! But there's one extra thing we can do: lazy loading.

Lazy loading may sound fancy & high-tech, but what it does is delay the loading of below-the-fold images until the user is about to see them. This saves bandwidth, since no hidden images get downloaded.
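At its core, a lazy loader just answers one question per image: is it close enough to the viewport to start downloading? A toy model of that decision (names and the 300px margin are mine, not the lazysizes API):

```javascript
// Start downloading an image once its top edge comes within `margin`
// pixels of the bottom of the viewport.
function shouldLoad(imageTop, scrollY, viewportHeight, margin = 300) {
  return imageTop < scrollY + viewportHeight + margin;
}

console.log(shouldLoad(2000, 0, 800));    // false: still far below the fold
console.log(shouldLoad(2000, 1000, 800)); // true: about to scroll into view
```

Real libraries implement this check efficiently (e.g. via IntersectionObserver) and swap data-src into src once it returns true.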

Currently I don't have a lot of images on this blog, except for the events page, which does have a few. You don't see all of them when you first enter the page, so let's load just the ones you see and lazy-load the others just when the scroll is about to reach them. I used a package called lazysizes for this, which is brilliantly simple to use. Just import the script and change the HTML like below:

 <img class="lazyload" data-src="src/assets/images/talks/jspub.png" alt="jspub logo">

Compression & Caching

Up until now we made sure we use every downloaded byte responsibly by improving the client-side. It's time we move on to the server-side of things, because there are some additional improvements that can be done from there.

The first one is to compress the .css & .js files we send to the browser. I chose gzip, which reduced the file sizes by around 50%. The biggest save, of 16 KB (60%), was for this page's JavaScript.

Lastly, we can even avoid downloading the assets altogether by managing the server's cache policy.

By default, the PHP server I'm using comes with a no-cache policy, which forces the browser to GET things from the network every time. But the browser has a cache, so let's leverage it and change the cache policy to a 30-day period.

<FilesMatch "\.(woff|woff2|ico|pdf|flv|jpg|jpeg|webp|png|gif|css|js)$">
  Header set Cache-Control "max-age=2592000, public"
</FilesMatch>
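In case the max-age number looks arbitrary: Cache-Control expresses it in seconds, so the value in the config is exactly 30 days.

```javascript
// days × hours × minutes × seconds
const maxAge = 30 * 24 * 60 * 60;
console.log(maxAge); // 2592000
```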

Now, on subsequent visits, the site will load (almost) instantly! ❤

Closing thoughts

Putting performance first and actually caring about it was a tough journey. Sure, it might seem easy when things work seamlessly and you see the site's metrics improving! That's what I thought too, several days into this project, until...

  • I realized the lazyloading library I'm using doesn't support videos!
  • Or that I have 500 KB of JavaScript on this page, only to realize that I've been importing all 176 languages in the highlight.js pack!
  • Or that there's another technique which I DON'T KNOW ANYTHING ABOUT, so now I have to READ about it before coding!
  • Or when I just wanna put the site out there, but there's this voice in my head reminding me that I haven't tried everything!

But that's what takes you to the next level!

Cheers! 🥂

Portrait of Pava
hey there!

I am Pava, a front end developer, speaker and trainer located in Iasi, Romania. If you enjoyed this, maybe we can work together?