New website

This website was previously powered by DokuWiki, which worked well, but I was not really happy with the overall system and style of the whole site.

A new design

First and foremost, I wanted my own website, with my own style. Starting from a blank slate, I built it stone by stone to get this wonderful (or questionable? 🤔) greenish dominant color. I give as much space to the content as I can. I give it contrast, too, so the text stays sharp and readable, joining the Contrast Rebellion!

Not everything is perfect and definitive, so do not hesitate to tell me if you can find measurable improvements that could be made, especially regarding accessibility, as it's something I have never worked on.

A Web 1.5 website

I want something that is fast, efficient, and works well. Far from huge front-end or back-end frameworks, I settled on a website without JavaScript. This is why it's not a brand-new webapp with shiny new technologies. The table of contents at the beginning, which unfolds to reveal its content, uses only the CSS :hover pseudo-class.
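
For illustration, the reveal-on-hover trick boils down to a few lines of CSS. The following is only a minimal sketch: the .toc class and the nav/ol markup are assumed for the example, not necessarily the exact structure used here.

/* Assumed markup: <nav class="toc"><ol> ... </ol></nav> */

/* Collapse the list of entries by default */
.toc ol {
  max-height: 0;
  overflow: hidden;
  transition: max-height 0.3s ease;
}

/* Unfold it while the table of contents is hovered */
.toc:hover ol {
  max-height: 100vh;
}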

On the other hand, I like the idea of adapting the website to the user's preferences. For a few years now, it has been possible to know whether the user's system theme is dark or light. My website therefore follows it, with a theme-aware stylesheet built around the @media (prefers-color-scheme: dark) media query. I might add a button to switch between the two manually, but that is not certain yet.
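
Concretely, this only takes a few lines of CSS. Here is a minimal sketch with made-up greenish colors and custom properties; the real stylesheet is not necessarily organized this way.

/* Light palette by default */
:root {
  --background: #f4faf4;
  --text: #122812;
}

/* Swap the palette when the system theme is dark */
@media (prefers-color-scheme: dark) {
  :root {
    --background: #122812;
    --text: #e2f0e2;
  }
}

body {
  background-color: var(--background);
  color: var(--text);
}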

A new backend

Under the hood, there are just plain static HTML files. I don't need any dynamic features, a comments system, a complex relational database, or who knows what. Before the migration to this new fully static system, DokuWiki was generating the same pages again, and again, and again.

Based on this, and on the pain of creating full HTML pages by hand one after another, I looked for static site generators. They read the content of the pages to create, load the template files that provide a canvas for that content, and render all the pages at once. Done.

I elected Zola for the task, mostly because it's written in Rust. This is the only reason I went for that one: I had never used this kind of software before, so I do not know the merits of its competitors (besides the fact that they are probably more mature). Anyway, it works for me.
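
To make the "templates provide a canvas" idea concrete, here is roughly what a minimal Zola page template can look like. It is a simplified sketch following Zola's default conventions (a page.html template extending a base.html), not necessarily the exact templates of this site.

{# templates/page.html: rendered for every content page, with the `page` variable filled in #}
{% extends "base.html" %}

{% block content %}
<article>
  <h1>{{ page.title }}</h1>
  {{ page.content | safe }}
</article>
{% endblock content %}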

For the performance part, I have a two-step process:

  1. When adding an image to an article, I ask Zola to resize it and recompress it dramatically, saving a lot of bandwidth (a template sketch of this is given just after this list).
  2. Before uploading everything, I optimize all JPG and PNG images:
    • jpegtran to remove metadata and save the image as a progressive (interlaced) JPEG, which can already be displayed at a lower quality before it has been fully received,
    • pngnq to reduce the color palette of PNG files, so they have a better chance of being compressed efficiently,
    • optipng, which saves a PNG file using the best encoding parameters (by testing all possible combinations).
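
Regarding the first step, Zola exposes a resize_image function that can be called from a template or a shortcode. Below is a minimal sketch of such a shortcode, not the exact one used here: the file name, width and quality are illustrative, and a Zola version where resize_image returns an object with a url field is assumed (older versions returned the URL directly).

{# templates/shortcodes/img.html, called from an article as {{ img(path="images/photo.jpg", alt="A photo") }} #}
{% set resized = resize_image(path=path, width=800, op="fit_width", quality=75) %}
<img src="{{ resized.url }}" alt="{{ alt }}">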

Here is the script used for the second step, adapted from the tips given by SebSauvage.

#!/bin/bash

# JPG files: strip metadata and save them as progressive (interlaced) JPEGs
find public/processed_images/ -iname "*.jpg" -print0 | while IFS= read -r -d '' i; do
  tempname="$i.opti"
  jpegtran -copy none -progressive -optimize -outfile "$tempname" "$i"
  cp "$tempname" "$i"
  rm "$tempname"
done

# PNG files
pngnq -vf -e .png -s1 public/processed_images/*.png
optipng -o7 public/processed_images/*.png

Not breaking everything

I tried to be careful and not to break links to the previous version of my website: redirections should be set up from all (relevant) pages and the RSS feed to the new addresses. It should be fully transparent!

Final words

I do not yet know whether I will add content regularly to this website, or whether I will really resurrect my link sharing page. Time will tell.