[2021-04-12]
Time to celebrate another fix for a regression I introduced!

Some time ago I introduced a dynamic content-loading system that lets posts be written without transforming them into bulky HTML files (see the 'RawRead' post for more information). This approach works and saves a couple of kilobytes here and there (considering I only have 1 GB here, I'd say that's a good change!), but it comes with an issue:

What about those that can't use (or choose not to use) JavaScript?

I don't want to rework the site or do any complex engineering to fix this, but I also like reading this site with Lynx and eLinks -- and besides, massively complex solutions like introducing frameworks are a waste of time for you and me.

Some thought went into this; I don't mean to wall off those that can't use JS -- rather, I want to use that fact to show static content. The solutions I had in mind were:

* noscript tag
* Multiple versions of posts?
* Using JS for.. something?

Both the noscript tag and multiple versions of posts proved cumbersome, and they go against the philosophy of this site: save time and effort and just write.

So, I set out to develop a stupid solution in JS. We know that browsers without JavaScript support won't execute the script tag, so we can.. maybe use that.

The solution I developed was, ultimately, dumb:

First, I made it so that, by default, the index links to the files in the raws folder. Instead of passing these files through the loader script (loader.html), the browser can load them directly.

Then, all files in the raws folder were renamed to .html. They were already partial HTML files anyway, so nothing was lost in the process.

One change I had to make for this to work was to inject a pre tag around the post content. Sacrifices have to be made.
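For illustration, that injection can amount to a simple string wrap. This is a hypothetical sketch -- `wrapInPre` is a made-up name, and the actual loader code isn't shown in this post:

```javascript
// Hypothetical sketch: wrap a raw post's content in a pre tag so it
// renders as preformatted text. The real loader may do this through
// the DOM instead of string concatenation.
function wrapInPre(rawContent) {
    return "<pre>" + rawContent + "</pre>";
}
```

Since the raws are already partial HTML, wrapping them this way keeps them displaying roughly as written.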

Finally, I hacked together a very quick script that rewrites links pointing to the raws folder so they point to the loader instead. Browsers without JavaScript support never run it, so they simply keep the default raws links.

Here's the script, for those that are interested:


    window.onload = function() {
        // Grab every anchor on the page.
        var anchors = document.getElementsByTagName("a");
        for (var i = 0; i < anchors.length; ++i) {
            var href = anchors[i].getAttribute("href");
            // Skip anchors without an href (e.g. named anchors).
            if (href && href.includes("raws/")) {
                // Point the link at the loader instead of the raw file.
                anchors[i].setAttribute("href", href.replace("raws/", "loader.html?page="));
            }
        }
    };
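For context, the loader on the other end presumably reads that page query parameter back out. A minimal sketch, assuming the standard URLSearchParams API -- `pageFromQuery` is a made-up name, and the actual loader.html code isn't shown in this post:

```javascript
// Hypothetical sketch of the loader side: pull the target file name
// out of a query string like "?page=rawread.html".
function pageFromQuery(queryString) {
    var params = new URLSearchParams(queryString);
    // Returns null when no page parameter is present.
    return params.get("page");
}
```

The loader would then fetch raws/ plus that name and drop the result into the page.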


With this, we achieve our goal: the site is supported and readable by browsers without JavaScript. Best of all, we don't lose much fidelity in the process!

One lingering issue remains: the lack of CSS styling. The posts don't need their own styling when loaded through the loader -- but the same can't be said when you load them directly.

I suppose that can be fixed for later. In the backlog it goes.