Started writing a decision log for our #DesignSystem. Documenting why we chose plain ol' #HTML and #CSS where we can and #WebComponents where client-side #JS is needed is turning into a bit of a manifesto. Essentially we're using (and encouraging others to use) #ProgressiveEnhancement 😉
To celebrate, https://rss-is-dead.lol (a site that usually leans on JavaScript for async data fetching) is today demonstrating how a reduced experience is better than a broken one. 😊
Will probably be hated for this… why do #accessibility advocates so often come off as preachy when I hear them talk? To the point that they'll say (at times indirectly) that they care about people and others don't… I get that it can be a frustrating field, but sometimes devs just need educating – admittedly including me 🙋‍♂️
"Out-of-touch, influential old guy working in web accessibility for many years claims that all hope is gone and only #AI can save us. People who he thinks are disabled will receive a more “concise” UI, because he thinks they are not able to use a computer and assistive technology."
Let me show you how easy it is to create a simple counter web app using the new Streaming HTML workflow in Kitten before peeling away the magic layer by layer so you learn how to make the same app using:
• HTMX & WebSockets
• Plain old JavaScript, and, finally,
• Pure Node.js, without Kitten.
This year marks 12 years since I built what I still think is the purest implementation of how I personally envision a social feed working (except maybe the masonry effect). The social feed at Flattr: https://www.dailymotion.com/video/xqfed0
The one thing I would do differently today is not dropping the #progressiveEnhancement or the #noJs approach – it's that I would drop jQuery and use some minimalistic modern helpers instead.
In fact, that's what I did when I worked for the HD-Sydsvenskan newspapers. I based those helpers on my personal collection, and in the end we open-sourced them (as we open-sourced a module that needed them): https://github.com/Sydsvenskan/js-dom-utils
I still think that it’s easiest to create #noJs HTML sites with <form> tags initially and then use #progressiveEnhancement to turn them into a smoother experience.
That’s still how I would build such a social feed today.
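That form-first approach might be sketched like this – the endpoint, field names, and `enhance` helper are my own illustration, not a real library:

```javascript
// #progressiveEnhancement sketch: the markup ships as a plain <form>
// that works with no JS at all...
const baselineMarkup = `
  <form method="post" action="/comments">
    <label>Comment <input name="text" required></label>
    <button>Post</button>
  </form>`;

// ...and this script upgrades it in place. If this code never runs
// (blocked, failed to load, old browser), the form still submits
// the old-fashioned way.
function enhance(form, send = (url, opts) => fetch(url, opts)) {
  form.addEventListener('submit', async (event) => {
    event.preventDefault(); // only reached once JS has loaded fine
    await send(form.action, { method: 'POST', body: new FormData(form) });
    form.reset();
  });
}

// In the page: document.querySelectorAll('form').forEach(enhance);
```

The key design choice is that the enhancement is additive: removing the script leaves a fully working site, not a broken one.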
For a simple blog, does it make sense to provide a fully static editing interface, with nothing but HTML forms, as the baseline, and only build a REST API + some sort of js based flow later as an alternative?
A custom element that wraps a native form control and augments it (i.e. adds/modifies attributes for the slotted light DOM element and provides a shadow DOM stylesheet to improve the affordance of the new behaviour).
A form-associated custom element that wraps a native form control and replaces it (similar to fallback content for <audio> or <video>).
A custom element that has a few options and can choose the right one for the circumstances (similar to <picture>, <audio>, and <video>).
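The first pattern (wrap and augment) might look something like this – the element name, attribute, and styles are made up, and the browser-only parts are guarded so the sketch is inert elsewhere:

```javascript
// Shadow styles apply around the slotted control, not instead of it.
const shadowTemplate = `
  <style>
    :host { display: inline-block }
    ::slotted(input) { padding: 0.5em }
  </style>
  <slot></slot>`;

// Guarded so the sketch only registers in a browser environment.
if (typeof HTMLElement !== 'undefined') {
  class FancyInput extends HTMLElement {
    connectedCallback() {
      const input = this.querySelector('input'); // slotted light-DOM control
      if (!input) return;
      input.setAttribute('autocomplete', 'off'); // the "augment" step
      this.attachShadow({ mode: 'open' }).innerHTML = shadowTemplate;
    }
  }
  customElements.define('fancy-input', FancyInput);
}

// Markup: <fancy-input><input name="q"></fancy-input>
// Before upgrade – or if JS never arrives – the bare <input> still works.
```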
The problem with our approach to #WebComponents is that we’ve forgotten the “Web” part. It’s really important and it informs what to expect from the “components:” how they work, how we build them, and how they should work with the rest of the Web. Unfortunately, we’ve allowed frameworks built on premises like “#ProgressiveEnhancement is dead” to inform our expectations.
Fun fact: My #dotfiles are running on a variety of machines. Some with X, some with Wayland, some with Windows, some with just a framebuffer text console.
And these machines run vastly different versions of common tools. Some are on #Vim 8.0, some on Vim 9.0, some on #Neovim 0.7, some on Neovim 0.9.4.
What I'm missing in the community that promotes #ProgressiveEnhancement are empirical studies on why, how and when #JavaScript fails. The broad assumption is “more JS = more points of failure, slower” and the broad conclusion is “less JS = better, faster”. Which is true – in a broad sense. But there are important nuances. It's necessary to know which parts fail frequently and why. To take action we need to relate this to how crucial the parts are and what the fallbacks look like.
I guess their logic was “We need JavaScript for the ‘My time’ option anyway, so why bother including the official race time (and date) in the HTML document? Let’s just generate it all with JavaScript.”
That logic is wrong.
edit: The correct approach would have been to include the track time and timezone in the HTML document, and then in JavaScript add the My time/track time toggle functionality, and set it to My time (the user’s local time). #ProgressiveEnhancement
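A rough sketch of that approach – the markup, the `datetime` value, and the helper name are all illustrative:

```javascript
// Baseline: the official track time ships in the HTML document itself,
// machine-readable via the <time> element's datetime attribute.
const markup = `
  <p>Race start:
    <time id="race-start" datetime="2024-03-02T15:00:00+09:00">
      15:00 local track time, March 2
    </time>
  </p>`;

// Enhancement: derive "My time" from the server-rendered value.
// The embedded UTC offset makes the parse unambiguous everywhere.
function toMyTime(isoWithOffset, locale = undefined) {
  const date = new Date(isoWithOffset);
  return date.toLocaleString(locale, { hour: '2-digit', minute: '2-digit' });
}

// In the page, the toggle swaps the <time> element's text between its
// server-rendered content and toMyTime(el.getAttribute('datetime')),
// defaulting to "My time" once JS has run. Without JS, readers still
// get the official race time and date.
```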
#shadowDOM partners with the likes of :not(:defined) to make progressive enhancement a more natural part of component development than ever.
#declarativeShadowDOM builds on top of that, and once Firefox ships this powerful API, the distance between "page load" and "page needs JS" can grow even further.
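A tiny sketch of how the two combine – the element name is made up; `shadowrootmode` and `:not(:defined)` are the standard declarative shadow DOM attribute and CSS pseudo-class:

```javascript
// Declarative #shadowDOM: the shadow tree is parsed from HTML, so the
// component renders on first paint with no JS at all, and
// :not(:defined) styles cover the gap before any upgrade script runs.
const page = `
  <style>
    /* Style the element before (or without) a script defining it. */
    fancy-card:not(:defined) { display: block; border: 1px dashed gray; }
  </style>
  <fancy-card>
    <template shadowrootmode="open">
      <style>:host { display: block; border: 1px solid }</style>
      <slot></slot>
    </template>
    Card contents, visible on page load – before "page needs JS".
  </fancy-card>`;
```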
It's powerful to lean into the existing web APIs to learn new and exciting ways to support our clients while continuing to push the boundaries of those APIs at the spec level.
Sure, not all content requires this level of interoperability; some would benefit from an alternate path of development. #shadowDOM and #customElements are also opt-in, so if/when you need them later, they'll be there waiting.
These APIs progressively enhance any page you deliver, and you can bring #progressiveEnhancement into your development by adopting them only when you need them – whether that's for scale, complexity, reusability, or a feature you discover is valuable in your own investigations.