256: Unbreaking the Web with Chris Ferdinandi

October 27th, 2021 · 1 hr 44 mins

About this Episode

Chris previously appeared on Greater Than Code Episode #170: The Case for Vanilla JavaScript.

02:50 - Project Gemini and Text Protocols

  • Always Bet on JavaScript

07:05 - Overusing Analytics & Tracking Scripts

  • Be An Advocate For Your Users / Ethical Obligations

12:18 - Innovations: Making Accessibility The Default

14:48 - Ad-Tech and Tooling

32:08 - HTMX

46:30 - Frontend Development is Hard


Rein: Vanilla JavaScript + Privacy.

Jacob: The web peaked at LiveJournal. Also, encouraging devs to think about what tool would be best for different jobs.
Chris: Maintaining privacy on the web.

Sign up for Chris’s newsletter at gomakethings.com!

This episode was brought to you by @therubyrep of DevReps, LLC. To pledge your support and to join our awesome Slack community, visit patreon.com/greaterthancode

To make a one-time donation so that we can continue to bring you more content and transcripts like this, please do so at paypal.me/devreps. You will also get an invitation to our Slack community this way as well.


PRE-ROLL: Software is broken, but it can be fixed. Test Double’s superpower is improving how the world builds software by building both great software and great teams. And you can help! Test Double is hiring empathetic senior software engineers and DevOps engineers. We work in Ruby, JavaScript, Elixir and a lot more. Test Double trusts developers with autonomy and flexibility at a remote, 100% employee-owned software consulting agency. Looking for more challenges? Enjoy lots of variety while working with the best teams in tech as a developer consultant at Test Double. Find out more and check out remote openings at link.testdouble.com/greater.

REIN: Hello and welcome to Episode 256 of Greater Than Code, a nice round number. I’m your co-host, Rein Henrichs, and I’m here with my friend, Jacob Stoebel.

JACOB: Thank you so much! I'm joined by this week's guest, Chris Ferdinandi.

Chris helps people learn vanilla JavaScript. He believes that there is a simpler, more resilient way to make things for the web. His Developer Tips newsletter is read by thousands of developers each day. Learn more at gomakethings.com.

Welcome to the show. Welcome back to the show, I should say. We had you on just before COVID, we were saying before the show started, so it's been quite a while.

CHRIS: Yeah, yeah. Thanks for having me back. It's been kind of a wild 18 months.

Last time I was on the show, I think we spent a lot of time talking about how modern development best practices might be ruining the web and this time, I was hoping we might have a little bit of a chat about how that's still kind of the case, but there's also a whole ton of new things that have happened in the last 18 months that may be swinging the pendulum back in the other direction, creating a web that's faster, a little bit more resilient, and works better for everybody.

REIN: That sounds great. But first, has your superpower changed? Do you still have the same superpower?

CHRIS: I don't remember exactly what I said last time, but assuming it's derailing conversations, then the answer is absolutely yes. [laughs] That has always been and always will be my superpower. I am great at tangents.

REIN: Well, this podcast is just a series of tangents stitched together, so.

CHRIS: Excellent. Always makes for a fun conversation.

JACOB: Yeah. Very true.

REIN: Have you heard about this new internet protocol, Gemini?

CHRIS: No, I have not.

REIN: So it's like somewhere between Gopher and HTTP and so, it's a plain text protocol with no mark, no markup, no XML, no HTML and you can have links, but they have to be on a separate line. So you basically are sharing these plain text documents and there's no JavaScript, there's no CSS and people are seeing it as a revitalization of what the web used to be about. No click tracking. No injected advertisements.

CHRIS: Yeah. This is weird. So there's a few years ago where I've been like, “Yes, that's what the web needs.” I feel like I'm a little bit more pragmatic now as I have less hair and more white in the beard. Like that seems really cool in some ways and a huge step back in others. I have very mixed feelings about that as a gut reaction; knowing nothing else about it other than what you just told me.

REIN: What do you think is the happy medium between where we are now and 1990’s text protocols?

CHRIS: Yeah. So in some ways, I feel like the web maybe peaked with LiveJournal, or maybe Myspace. Myspace made it really easy to hack on the web and that was really cool. But the text-only web, I don't necessarily think I'd like to go back to.

I think I'm actually not even really opposed to commercialization on the web, in large part because I'm only able to do what I can do professionally because of it. But I would love something that really curtails all the spyware-for-profit, over-tracking stuff. I have none of that on any of my websites. I removed all of my analytics, all of my – I don't even track opens on my newsletter. I really like the interactive and immersive nature of the web. I don't mind the commerce side of the web. I really hate the whole Big Brother-esque “we’re always watching you” nature of the web. I think that's really awkward and creepy.

Also, I feel like sometimes we try to run before we can walk on the web and so, we end up throwing a boatload of JavaScript at the frontend to make up for limitations in the platform and we have a tendency to create experiences that are really slow, brittle, and super prone to breaking. I think about how the web has gotten, or the internet as a whole has gotten four to five times faster in the last decade, but the average webpage still loads at about the same speed as it did a decade ago.

[chuckles] The original Space Jam website loaded in about the same amount of time as the new Space Jam website, even though the internet [chuckles] has gotten so much faster in that time and a large part of that has to do with the way we build and the tooling that we use.

JACOB: Just consuming that extra capacity that we get through faster connections, et cetera. More people.

CHRIS: Yeah, for every extra megabit of internet speed that we get, we throw a bunch more JavaScript on the frontend and we have a tendency—it's really weird for me to, as someone who teaches JavaScript for a living, tell people to use less JavaScript. But the web keeps moving to a more JavaScript driven future and JavaScript is the most fragile and bad for performance part of the frontend stack. I feel like in the last maybe 5 years, or so, we saw the pendulum swing really far in the JavaScript end of things.

I still think the phrase, “Always bet on JavaScript” is a good one. I don't think JavaScript is going anywhere. I don't, in the abstract, hate it. The interactivity that it brings is good. I think the challenge is around how heavily we lean on it for things that it's not necessarily the best tool for. I'm starting to see a new slate of tools that take advantage of some of the things it's great at in a way that doesn't punish the users for those decisions, so that's pretty cool.

We can dig into that if you both want to. There's a lot of new stuff in the works that I think has the ability to maybe fix some of the challenges that we've been facing up to this point.

REIN: What would you say are some of the low-hanging fruit in terms of implementations, design that people could take that would just make their app a little bit better, play a little bit more nicely, be a little less extractive in terms of tracking everything the user does?

CHRIS: Yeah. So one of the weirdest things that I've just encountered on the web, there's a certain subset—a lot of times it's e-commerce vendors, sometimes it's SAAS—but they're loading eight different tracking and analytics scripts on a single page through all sorts of different vendors. So they'll load Google Analytics, Salesforce, and three, or four other vendors that I'll do some version of the same thing—racking what you click on and where you go next. The impression I've gotten from talking to folks is that this is a byproduct of having a bunch of different internal departments that all want access to data and no one wanting to be like, “This is the tool we're using,” [chuckles] and so, they all just chuck it in there.

So that's probably a really big offender there and there's a couple things that you can do about that. First is I feel like a lot of developers and a lot of designers have this – I don't know how to describe it; I'm trying to think of the right phrase here, but it's almost like your job is to be an advocate for the user. So just because your manager, or executive is saying, “We need this” doesn't necessarily mean your job is to just be like, “Okay, let me throw that in there.” I liken it to if you were building a house and a customer told you they wanted you to install a hair dryer in the bathtub. You could do that for them because they asked, but it's a really bad idea and maybe it's your professional responsibility to tell them that.

REIN: Yeah. That is the thing that distinguishes a profession is that you have ethical obligations to uphold,

CHRIS: And this is where I think you get into the pro and con of the web being an industry that you can get into without professional certifications and trainings. It's like anybody can do it, but there's also not necessarily that same level of there's no certification board that's like, “You're going to lose your certification if you do these things that are harmful.”

For most of us, for a lot of what we do, the side effect of having eight tracking scripts on a website is not life-threatening, but depending on the type of site that you offer and the services that you provide, it can be. I think a lot of people are taken a little aback by that, but I've heard multiple stories in the last few years about utility companies—electricity, gas, whatever—where utilities get knocked out in a really bad storm. People are relying on their smartphones on 3G, they live in areas where connectivity is not particularly great, and their Wi-Fi is down because their power is out. Trying to connect to the electric company website to file a claim like, “My electricity's out,” or even just find the contact number to call them, the site just keeps crashing and reloading and they can't open it.

Not having access to your utility company because somebody was irresponsible with how they built the website really sucks. And depending on what the weather is like – I live in a very cold climate up in the US Northeast and if the temperatures drop below freezing and there's a downed tree blocking your ability to get in and out of the area and you don't have electricity, or heat, that can be a really serious thing.

I think a lot of times we just think about like, “I'm just building websites.” But it can be so much more than that depending on the industry you work in and the type of work that you do.

So the whole “you're going to get disbarred if you do these bad things” thing, there just isn't an equivalent of that for our profession, which can be a good thing, but is also – [crosstalk]

REIN: There's no equivalent of the city code that's 2,000 pages long, but also means that you can't build a bathroom that electrocutes people and you can't put asbestos in the walls. The counterargument is that regulations are onerous and they stifle innovation, but – [crosstalk]

CHRIS: And they can.

REIN: Do you want innovation in less safe ways to build houses? Like, is that what we're looking for?

CHRIS: Right. Yeah. I had a similar argument once with someone about accessibility on the web and how it shouldn't be legally required for sites to build themselves accessibly because it stifles innovation for one person shops who are just trying to throw something up quickly. I don't know, at some point it just boils down to a moral argument and it's really hard to have an objective conversation about should you care about other people and doing the right thing? Like, I don't really know how to have that kind of conversation in a logical kind of way.

JACOB: And I'm thinking about where the innovation can happen is the big platforms—your WordPresses, your Wixes, Squarespaces. What innovations can they think of that can make accessibility the default? Like help people fall into the pit of success? What are the new innovations that they haven't come up with yet that just make accessibility just happen magically for someone who is –?

CHRIS: Yeah. I also sometimes feel like it's a little bit of a – well, so there's two aspects here. I feel like the you can have innovation, or regulation, but not both thing is a bit of a false dichotomy. I think one of the things we've seen in—my liberalness is going to show a little bit here—but one of the things we've seen in unregulated capitalism is that it doesn't necessarily drive innovation. It just drives more ways to squeeze profit out of people. I think you see that on the web with the current state of internet surveillance and ad tech. Do you, as a consumer, feel like you’ve got a lot of innovation out of all the new ways that large companies have figured out how to track what you do on the web so they can sell more toothpaste to you? Because I certainly don't.

So that's one aspect of it and the other is, I feel like sometimes we're overly obsessed with innovation for innovation's sake and there's something to be said for the boring, predictable web. Websites that try to be different just for the sake of being different often just end up being confusing and unusable and I don't necessarily want that in my web experience. I find that particularly frustrating. I'm a really big advocate of “the boring web.” I like when I show up on a website and I know exactly how to use it, I know how to move around, and I don't have to follow a whole bunch of popup tutorials just to figure out how to achieve the task I'm there to achieve.

REIN: And if you look at the context that's driving, some of this behavior from startups, from UX engineers at startups, it's often that their business model depends on being able to sell customer data. There are a lot of mobile apps that if they lost the ability to sell their customer's data would cease to exist and so I guess, the question is, is it justified? Should they exist if that's the only way they can exist?

CHRIS: I struggle with this a fair bit because I use and have benefited from free sell customer data services in the past. I'm of the mind that I personally would pay for –like, let's just use Twitter as an example. If I had the ability to pay to use Twitter and they would stop recommending all these completely irrelevant ads to me in my timeline all the time, I would probably pay a not insignificant amount of money for that. But I know there's a lot of people who either wouldn't, or couldn't afford to and the value of Twitter to me would go down substantially if a bunch of people dropped off the service.

So that's a really good question that I don't have a great answer for. I feel like there's a balance somewhere between where we are today and this ideal, you're never tracked ever state. I don't know what it is, but I know there is one.

One related thing, since we're talking about ad tech, is that a lot of these third-party scripts are some of the biggest offenders when it comes to slowing down performance on the web. They add a lot of latency to sites. I just saw this interesting project this morning from a guy by the name of Adam Bradley called Partytown. I don't know if either of you have heard of it yet, but it's essentially a lightweight interface that allows you to load and run your third-party scripts from a web worker instead of on the main thread.

One of the biggest challenges with a lot of these scripts is that JavaScript is single-threaded and so, all of these things block other stuff from happening because they're on the one main thread within the browser. Service workers and other web workers run on a separate thread in the background, but don't have access to the DOM.

So Adam Bradley created this really interesting library (which I haven't had time to properly play around with yet) that allows you to bridge that gap. You can run these scripts off that main thread, but still give them those hooks into the DOM where they're needed, with ideally the potential of reducing the overall load on the main thread and the latency and performance issues that come from that.

The other thing that I think a lot of – I guess, another angle you could pull out here is the fact that tracking and analytics don't have to be as privacy invasive as they are. I think you see this in things like Paul Jarvis's analytics platform, Fathom, which is so privacy minded that it doesn't require a GDPR notification on your website to use. So it's not doing this really invasive follow you all over the internet kind of tracking.

Naturally, doing that dramatically reduces that data's value for advertisers. But if you're looking to use that data for you as a business, it still gets you the information you need without sacrificing your users' privacy. So it'll tell you things like what pages people are looking at and how frequently certain things on your site convert without you needing to know that after leaving your site, John Smith went to Colgate and bought a tube of toothpaste and then went to Amazon and bought a new kayak and all that kind of stuff.
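The aggregate-only approach Chris describes can be boiled down to counters with no user identity attached. This is just a conceptual sketch (all names invented), not how Fathom is actually implemented:

```javascript
// Hypothetical sketch of privacy-friendly analytics: record pageviews and
// conversions as aggregate counts per path/goal, never storing anything
// about an individual visitor (no IDs, no cookies, no cross-site trail).
class AggregateAnalytics {
  constructor() {
    this.pageviews = new Map(); // path -> count
    this.conversions = new Map(); // goal -> count
  }

  recordPageview(path) {
    this.pageviews.set(path, (this.pageviews.get(path) || 0) + 1);
  }

  recordConversion(goal) {
    this.conversions.set(goal, (this.conversions.get(goal) || 0) + 1);
  }

  // How often a goal converts relative to views of a given page
  conversionRate(goal, path) {
    const views = this.pageviews.get(path) || 0;
    return views === 0 ? 0 : (this.conversions.get(goal) || 0) / views;
  }
}
```

That is enough to answer “what pages do people look at, and how often do they convert” without any of the follow-you-around-the-internet machinery.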

There's a balance somewhere. I'm not 100% sure where it is, but I'm seeing a lot of interesting ways of coming at this problem.

JACOB: Yeah. I'm going to be very speculative here because I can't claim to know about all this stuff, but I would guess that a lot of users that are just plugging Google Analytics just drop it into their site. They have no personal interest in all that advanced stuff. Like, they're not going to use it. They do want to know about convergence and that simple stuff anyway and really, they're just [chuckles] funneling more data to Google [chuckles] in the first place.

CHRIS: Yeah. Honestly, a big part of the reason why I pulled Analytics from all of my stuff is it just wasn't giving me that much value. I was tracking all this data that I wasn't actually using. Well, not even track, I was basically giving Google all this data about my users for free that I wasn't really taking meaningful action on anyways. I'd imagine for a lot of, like you've said, folks who are using these scripts, they're not really doing much with them and probably don't need nearly as much information as they're sucking up.

So ad tech is a big part of the challenge with the modern web. But actually, I think one of the other related problems is the fact that we're using JavaScript for all the things. The entire frontend is being powered and generated with JavaScript and that creates not just performance issues, but extreme fragility in the things that we build, just because as a scripting language, JavaScript is so unforgiving when it runs into errors, or when things go wrong. It's never fun when you click a navigation element, or click a button, or try to load a page and nothing happens and that happens so often because of JavaScript.

So for the last 3 years, I've been on this tirade about how JavaScript is ruining the frontend. We're starting to see a bunch of new tools now that take some of the best parts of all of the JavaScript we've been shipping to the frontend and get rid of all the stuff that makes it so terrible, or at least minimize it as much as possible. That's been really interesting to see. The two bigger trends I've seen here are around micro libraries and precompilers.

If you're both interested, I'd love to dig into that a little bit. If you have another way, you'd like to take this conversation, that's totally fine, too.

JACOB: It sounds good to me.

REIN: Sounds interesting.

CHRIS: Yeah. So just to set the scene here. I have lost track of the number of times in the last few years that I've heard people say, “You need to use a JavaScript framework in your app because it's better for performance.” “The real DOM is slow; React uses a virtual DOM so it's faster.” Or “If you write vanilla JavaScript, you're just building your own framework.” I hear stuff like this all the time and it drives me nuts because it's not true.

But the thing I think people don't always realize is that can potentially be true depending on how your UI is structured. So if you ever view source on Twitter, their Like button is nested within 13 other divs and is itself a div and so, doing the diff thing whenever you update the UI with an absurdly nested structure like that is going to be costly. But I think you could also argue that that's just bad HTML and you could probably structure that differently and better.

React itself is 30 kilobytes of JavaScript minified and gzipped that unpacks in the browser into, I think it's like a megabyte, or two of JavaScript when it's all done. It's huge and all that abstraction is really, really costly.

So on one end of the spectrum, I've seen the rise of micro libraries, which take some of the best concepts of libraries like React and Vue—state-based UI, DOM diffing where when you make an update, you only change the stuff that needs changing—and then they provide it in a much smaller package that gets you closer to the metal is maybe the best phrase here. They remove as many abstractions as possible and in doing so, they mean you have to load less JavaScript, which is an instant win on initial page load time, and then by removing abstractions, the actual interactions are faster themselves as well.

So for example, a state change in Preact, which is a 3-kilobyte alternative to React that uses the same API, is four times faster than that same state change in React. Even though you're using the same patterns, you're just loading a much smaller footprint. You've shed some features, but not all, and you end up with that same kind of user experience, or developer experience if you like the React developer experience, but with a much smaller footprint and a much friendlier experience for the people who ultimately use the thing you build.
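The state-based UI pattern all of these libraries share boils down to a pure render function plus a diff step that skips work when nothing changed. Here is a minimal, DOM-free sketch of the idea (string output stands in for a real DOM tree; real libraries diff node-by-node, not whole strings):

```javascript
// The view is a pure function of state.
function render(state) {
  return `<button>Likes: ${state.likes}</button>`;
}

// Naive diff: return null when the output is unchanged, so no update happens.
function diff(prevHtml, nextHtml) {
  return prevHtml === nextHtml ? null : nextHtml;
}

let state = { likes: 0 };
let html = render(state);

// Update state, re-render, and apply only the change (if any).
function setState(patch) {
  state = { ...state, ...patch };
  const next = render(state);
  const change = diff(html, next);
  if (change !== null) html = change;
  return change;
}
```

Calling `setState({ likes: 1 })` returns the new markup; calling it again with the same value returns `null`, because the rendered output did not change.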

Similarly, for a while, Alpine.js was gaining some traction. It was another small library, built based on the way Vue works. Evan You, who built Vue, was so inspired by it that he just recently released petite-vue, which is a small subset of Vue built for progressive enhancement. It's a fraction of the size.

So I find those really, really intriguing because they take some of the best parts and then they get rid of all the cruft.

On the other end of the spectrum though, are folks who have started to realize that you can get some of those same developer benefits without passing any of that cost on to the user and to be honest, I'm finding that aspect of things a lot more intriguing. This takes the form of frameworks, or compilers where, rather than authoring your JavaScript, shipping it to the browser, and then having the browser generate the HTML from it at runtime in the client, you still author your content in JavaScript, but then a compiler builds that into HTML, converts your library-based code into plain old vanilla JavaScript without the abstractions, and that's what gets shipped to the browser.

So Rich Harris, a couple years ago, built Svelte and it was, as far as I know, the first of these tools. I'm sure there have probably been others before it, but Rich's is the one that got most popular. It's just really, really interesting because you write with a similar pattern to what you might in React, but then it spits out just HTML files and old-school DOM manipulation for the interactions. It's doing all of the heavy lifting before the code gets shipped to the browser and the user gets a really nice, lightweight experience. He is in the process of building out this new tool called SvelteKit that gives you really, really awesome stuff like routing and built-in progressive enhancement.
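The compile-ahead-of-time idea can be illustrated with a toy: take a declarative template at build time and emit a plain function, so no template engine ever ships to the browser. This is only a conceptual sketch with an invented `{name}` placeholder syntax, nothing like Svelte's actual output:

```javascript
// Toy "build step": turn a template like "Hello {name}!" into a plain
// vanilla function ahead of time, so zero parsing happens at runtime.
function compile(template) {
  // Split on {word} placeholders; even indices are literal text,
  // odd indices are the captured placeholder names.
  const parts = template.split(/\{(\w+)\}/g);
  const body = parts
    .map((part, i) =>
      i % 2 === 0 ? JSON.stringify(part) : `(state[${JSON.stringify(part)}])`
    )
    .join(" + ");
  // The emitted function is the only thing that would ship to the browser.
  return new Function("state", `return ${body};`);
}

const greet = compile("Hello {name}, you have {count} messages");
```

The point is the division of labor: the expensive, abstraction-heavy work happens once on the developer's machine, and the user downloads only the small, direct result.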

Actually, he just recently gave a talk and a demo on this at Jamstack Conf last week at time of recording. I'll make sure I get you both a link to that if you want to drop it in the show notes for this one.

But in it, he gave this demo about how you can author this page with an interactive form and if JavaScript is supported and loads in the browser, it does Ajax form handling. And if for some reason that JavaScript fails, it does an old-school HTTP form submit and then manually reloads the page and gives you the same exact experience. But you, as an author, don't have to write two different applications, like your client-side code and then your server fallback. SvelteKit just takes care of all that for you.

I think this is one of the biggest reasons why people like JavaScript libraries is they have a single codebase to manage and these compilers are allowing you to get those same benefits without punishing the user for that developer experience.
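That dual-path form handling can be sketched on the server side as a single handler that answers both kinds of submit: data for a fetch()-based submit, a redirect for a plain HTML form post. This is a hypothetical illustration with invented names and shapes, not SvelteKit's actual API:

```javascript
// One handler, two behaviors: if the client-side JavaScript submitted the
// form via fetch() (asking for JSON), return data so the page can update in
// place. If it was a plain, no-JS HTTP form submit, redirect back so the
// browser re-renders the page the old-school way.
function handleFormPost(acceptHeader, formData) {
  const wantsJson = /\bapplication\/json\b/.test(acceptHeader || "");
  const result = { saved: true, name: formData.name }; // pretend we persisted it

  if (wantsJson) {
    return { status: 200, body: JSON.stringify(result) };
  }
  // 303 See Other: the classic post/redirect/get fallback
  return { status: 303, headers: { Location: "/form?saved=1" } };
}
```

The author writes the form once; which path runs depends on whether the JavaScript loaded, and the user gets a working experience either way.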

There's another tool that came out. I forget if it's called Atomic, or Astro. Astro, yeah. Similar kind of thing, slightly different angle. This one allows you to take all of your favorite client-side library components, mash them together, and then it spits out prerendered HTML and removes as much of the JavaScript as possible. So you could use a dropdown menu component from React, a card component from Vue, and some Svelte files that you started working on, and it will mash them all up together for you and spit out a really small amount of code.

Jason Lengstorf—whose name I almost certainly butchered and Jason, I'm very sorry—over at Netlify recently tried this on a Next.js project of his and the resulting build actually had 90% less client-side JavaScript in it and decreased the page load time by 30%, even though it used almost all of the same project code. It just produced a much smaller, faster frontend with the same developer experience.

So these are the kinds of things that I get really excited about because I'm seeing us take everything that we've learned from the last 5, or 10 years and finally start to swing in the other direction with tooling that doesn't harm the users and will hopefully start to unbreak the web a little bit.

REIN: So the analogy I use to try to understand this is basically frameworks like React install a runtime into your browser. Just like Ruby installs a runtime, you're not just compiling down to C calls. You're compiling to C calls, but those C calls are a framework of runtime that is quite large and quite future rich. Maybe the most direct example is in Rust, if you compile with no standard and you don't have a runtime, you're somewhat limited in what you can do, but you're getting as close to the metal as possible.

CHRIS: Yeah. That's a good analogy. I like that. That's a good way to describe it.

JACOB: Safely.

CHRIS: You really are. Yeah, I like that.

JACOB: I think the analogy goes further and correct me if I'm wrong, but Rust gives you all of that memory safety you wouldn't get with C. Svelte is doing the same thing with, can we call it DOM safety? [chuckles] That it's going to help you not make the common errors that you would often get with state manipulation.

CHRIS: Yeah, for sure and really, it has the potential to just save you from this situation that happens where the JavaScript breaks and then the whole app falls apart. The less you can rely on that, the better. It's not that you can't still ship that nice, enhanced experience to your users if they can tolerate it, but you end up with something that's a lot more resilient, which is not just better for them, but it's better for you.

I've just lost track of how many things I haven't purchased because I couldn't get the site to work in these JavaScript-heavy apps. There have even been one, or two occasions where my wife has run into an issue on a web app she's been trying to use and I've opened up dev tools, found the error, gone into the JavaScript code, fixed the error live, and then she's been able to continue. Like, should I file that with their dev team and send them a bill for fixing it? JavaScript is so unforgiving in the browser and having tools that provide more fallbacks and safety nets around that is definitely a good thing.

REIN: Maybe the other thing is that the runtime starts take on a whole bunch of responsibilities like you just start to pack it full of features. So in Rust, the runtime does everything from a stack overflow of protection to processing command line arguments.

CHRIS: I don't know Rust that well so I don't have a really good comment on that, but, [laughs] or I can't necessarily make an analogy between that and JavaScript, but that sounds like a good thing.

I guess, the related thing here is we also have a bad habit, as developers—just not necessarily you guys personally, but just as a community—we have a bad habit of doing our work on really high-end machines and testing our work on really high-end machines and good internet connections, and then assuming that the majority of our user base is like that.

I think React works perfectly fine on a modern smartphone, or a modern computer with a really good internet connection. But so many of the people who use the things we build don't have either of those things, or have one but not the other, and the house of cards really starts to fall apart in those situations. Things become really slow, really buggy, really fast and this is again where we get into the whole there's no professional standards board that says your site has to load this fast on this type of internet connection. There's no mandated threshold, or fault tolerance testing, or anything like that, like you might have with the electrical in your house, and maybe there should be, I don't know.

MID-ROLL: And now we want to take a quick time out to recognize one of our sponsors, Kaspersky Labs.

Rarely does a day pass where a ransomware attack, data breach or state sponsored espionage hits the news. It's hard to keep up or know if you're protected. Don't worry, Kaspersky’s got you covered. Each week their team discusses the latest news and trends that you may have missed during the week on the Transatlantic Cable Podcast mixing in humor, facts and experts from around the world. The Transatlantic Cable Podcast can be found on Apple Podcasts and Spotify, go check it out!

JACOB: I was reading something interesting. I haven't tried it, but I guess, it's called HTMX. Are you familiar with it?

CHRIS: You are the second person to mention that to me. I have not played around with it myself, but I heard just a little bit about it. So I'd love to hear your take on it.

JACOB: We might know about the same, but what it looks like is it's coming from the other side, which is saying you as a developer should really, you're just going to author markup and from your perspective, you don't know if the browser's native capability is going to handle it, or if there's going to be JavaScript that's going to look at a certain attribute and handle it for you. You just want to handle markup and I just think that's a really interesting take because it gets developers back into the mindset of markup first.

CHRIS: You brought up another good point that I totally forgot to mention. So one of the things that I think we're starting to see is—and we saw this with jQuery, too—I call it paving of the cow path and I, by no means coined that term. I'm sure you're both heard it before, but.

So when jQuery came about, there was no good way to get elements by class, looping through things was really hard. Everything about JavaScript kind of sucked and jQuery really showed the developer community what a good API for working with the DOM could look like. It took a long time, but eventually, the browser standards bodies incorporated a lot of that into what we get out of the platform. So the reason querySelector and querySelectorAll exist today, and the forEach method, and the classList API, and all of these awesome ways of interacting with the DOM—the only reason any of that stuff exists is because John Resig and the jQuery team showed us a better way and paved those cow paths.

As much as the massive popularity of state-based UI libraries bugs me because I think they're overused, just like jQuery was probably overused in its day, they have really in many ways, paved the cow path for what a better browser native system could be.

One of the trends I'd like to see more of is, to what you were just talking about, Jacob, HTML doing more of the work and JavaScript doing less of it. I think a really good model for this is the details and summary elements, which allow you to create a browser-native show-and-hide disclosure component without any JavaScript at all. It's just entirely HTML. You click it, it shows the thing; you click it again, it hides the thing. It's accessible out of the box. If the browser doesn't support it, it's progressively enhanced; you get the full text. Beautiful.

I want that for everything. I want that for tabs. I want that for carousels. I want that for image galleries and just any sort of interactive component. And the really nice thing about details and summary, where I feel they really nailed it, is it's styleable. So if you want it to look different, if you want the expand-and-collapse icon to be styled differently, you can do that. If you want to animate it in, you can do that. Like, you can add CSS to make it look the way you want. And if you want to enhance it with some JavaScript, it also exposes a custom JavaScript event that you can hook into and build on top of, but you don't need to.
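A minimal sketch of the pattern Chris describes — a working disclosure in pure HTML, with optional CSS styling and the toggle event as the JavaScript hook:

```html
<!-- A browser-native disclosure component: no JavaScript required -->
<details>
  <summary>Show the details</summary>
  <p>This content is hidden until the summary is clicked.</p>
</details>

<style>
  /* The expand/collapse marker is styleable, as Chris notes */
  summary::marker { content: '+ '; }
  details[open] summary::marker { content: '− '; }
</style>

<script>
  // Optional enhancement via the element's toggle event
  document.querySelector('details').addEventListener('toggle', function (event) {
    console.log('Open?', event.target.open);
  });
</script>
```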

One of the biggest boons of JavaScript libraries, I think, is the ability to add interactive components, complex interactive components, with ease. I feel like for a lot of developer teams, that's a real draw. They don't have to figure out how to design an accordion, because there's a component for that, and that has the real benefit of adding more accessibility to the web, too. But it would be really cool if the platform just did that for you and we didn't have to reinvent the wheel. This is where I feel like a lot of these libraries are paving the cow paths and hopefully, at some point, the platform will catch up and we'll have some of this stuff just baked right in.

I think the HTMX thing you just referenced is another example of what that could look like. From what I've gathered, it's still a runs-in-the-browser type tool, but it allows you to just focus on writing HTML. I could be wrong. It could be a compiler, but I just really want that stuff out of the box in the browser, without me having to think about it. I'm also a lazy developer, though, so that's [laughs] part of it.

JACOB: [chuckles] Me too.

REIN: I feel like the $2 trillion elephant in the room here is that the browser everyone's using is made by Google.

CHRIS: Yeah, and that used to not be the case, right? We've lost a lot of rendering engines in the last 3 or 4 years. You can do your part by not using Chrome. I'm not saying you should use Firefox. I'm on Edge. A lot of people like Brave. I have very mixed feelings about that one for a variety of reasons. But yeah, no, that is true. Chrome is, what, 70 or 80% of the market at this point? So just from a tracking and data absorption perspective, that's not great.

One interesting argument I've heard in the past is that it's not necessarily bad if there's only one rendering engine on the web and browsers are competing on different features. Like, imagine a world where you didn't have to worry about which APIs are supported by which browser; we're pretty close at this point. [chuckles] But I'm thinking back to when Firefox had more popularity and Edge was still running on its own rendering engine and they were always just a little bit out of sync.

It would be awesome if the entire web ran on a single rendering engine and features were layered on top of that. Like, I think there is potentially an argument for that being a good thing. I think the real problem is that that rendering engine is controlled by Google and so, even if you're using a Chromium-based browser that's not Chrome, it's still very much subject to the whims of what Google wants from the web. And you see that in a lot of the ways things get prioritized, and what makes it into the platform and what doesn't. They have a nasty habit of, if they can't get the rest of the folks on the standards board on board, they just plow ahead with it anyway, and then users start using it, and then everybody either follows suit, or riots happen. So that is an elephant in the room and I don't really have a good way to reconcile it. That kind of sucks.

REIN: Have you heard about the new idle tracking API fiasco with Google Chrome?

CHRIS: No, I haven't, but I'd love to learn more.

REIN: This was in the news a couple weeks ago, so this is pretty fresh, but Google is basically introducing a new API to Chrome that detects when the users are idle and – [crosstalk]

CHRIS: That's gross.

REIN: Every other browser manufacturer is like, “This is an invasion of privacy and you should stop doing it.” Meanwhile, Google is also saying, “Web tracking is out of control and has resulted in an erosion of trust.” They say that out of one side of their mouth, and then out of the other side, they introduce this tracking API that, for example, malicious sites could use to determine when it's okay to use your CPU to mine Bitcoin.

CHRIS: I'm thinking about how they recently insisted that alert had to be deprecated because it's bad for user security, and now I'm hearing about this, and I have a tough time reconciling that say-one-thing, do-another aspect. Yeah, that's gross and that really sucks. I wish Firefox had maintained more of its market dominance; that would've been nice. Or if the W3C managed the rendering engine so that browser vendors weren't controlling it. This is all a little bit disheartening.

I don't have a really great solution for this kind of stuff. I'm by no means smart enough for that. But for some reason, it seems really, really hard to get a new browser engine into the market, as evidenced by the fact that even big corporations who have tried it eventually just give up, fold, and switch over to Chromium. I'm not enough of a computer science expert to really understand why that is, but I can imagine it's very hard, especially as the platform gets more complicated.

REIN: Yeah. I mean, and there's also vendor lock-in. So on iOS, every browser is secretly WebKit under the hood, because they literally aren't allowed to ship their own browser implementations.

CHRIS: Right, yeah. That one's always really fun. That one catches people by surprise; you're running Chrome, but you're actually running Safari under the hood.

JACOB: I think for a while, Mozilla wouldn't make an iOS app because they didn't want people to think that they were getting everything you associate with Mozilla's values when you download it. I think they have one now and it's because they are able to do certain privacy features, even if they can't do all of them but yeah, that's an interesting debate.

CHRIS: Yeah.

REIN: So you could imagine a version of HTML that removes a whole bunch of features, that makes it harder to track people, that makes it harder to implement extractive, hidden stuff. The problem is there's no way to enforce that a certain site is using that subset. That would have to be done at the browser level and Google has no incentive to ever make that possible.

CHRIS: [laughs] Oh man, I'm thinking now about the web we lost.

REIN: Yeah. That was actually one of the motivations for Gemini to not try to mess with HTML: oh look, we could specify this restrictive subset of HTML that meets our needs, but there's no way to guarantee that any particular site you access is actually well-behaved. So they came up with an entirely new protocol so that they could enforce very strict rules about what a site can do.

JACOB: Would that mean end users have to type in Gemini:// explicitly?

REIN: Yeah. So there is a – is that called the protocol part of the URL?

JACOB: I think so.

REIN: So there is a gemini:// protocol. Is it scheme? Anyway, there is that and it has its own protocol definition. One of the goals of the protocol is to be simple enough that you could implement it in about a hundred lines and keep it all in your head.

CHRIS: [chuckles] That's pretty wild. I'm on the Project Gemini website right now and this is very old-school. Ooh, and it uses the details and summary element.

REIN: That's basically an HTML proxy for an actual Gemini page.


REIN: But there's no CSS. There's no JavaScript. There are no headers. Aside from the one header that you use to make the request, there are no headers, so you can't insert anything in headers. There's no user agent.

CHRIS: See if this loads. No, this doesn't load. I wonder if there's any browsers that have actually incorporated this, or that allow you to – [crosstalk]

REIN: No, but there are like a hundred different clients that have been implemented in every language imaginable.

CHRIS: Ooh. I am noticing that it uses a markdown-esque syntax. I'm looking at the advanced line types here, where you use hashes for headings and asterisks for bulleted lists.

REIN: Yeah, but it doesn't allow inline links for example.
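Gemini's text format ("gemtext") looks roughly like this — a sketch based on the line types mentioned above, with a made-up URL. Note how links get a line of their own instead of being inline:

```text
# A heading, using a hash like markdown

Plain lines are paragraphs. There is no inline markup at all.

* Bulleted list items use an asterisk
* One item per line

=> gemini://example.com/ A link line: the target URL always comes first, so it is always visible
```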

CHRIS: So you can always see what the actual URL that you're going to follow is? That's cool. Yeah, I – [crosstalk]

REIN: So there is this movement – people are interested in moving away from the huge mess that is HTML, CSS, and JavaScript. Some people, I think, are interested in this for privacy reasons. Some people, I think, are interested in this for the same motivation that brought you towards vanilla JavaScript, which is: can't we just build sites that work better by not doing all this extra stuff?

CHRIS: Yeah, and they're separate, but they're also very tightly linked, I think where a lot of the privacy stuff is what causes a lot of the issues that bother me about the way that the web works today.

This is an interesting project. Candidly, I'm not entirely sure this will ever really catch on in a mainstream fashion. I think the genie is just way too far out of the bottle, but it is interesting to think about a way the web could be different.

REIN: Yeah. It's interesting because this is definitely fringe, but fringes are where the interesting stuff happens.

CHRIS: Yeah. I could see parts of this informing what happens on the platform itself. The flipside here is I also do like some of the interactivity. I hate parallax and animation effects and all that, but I like being able to watch a video in a browser. I think that's pretty cool.

REIN: There are advantages to single-page applications – that user experience has some, I think, real advantages over traditional hypermedia making a bunch of stateless requests to new pages. Basically, we've figured out how to reimplement a stateful thick client right on top of HTTP.

CHRIS: Yeah, like being able to keep media playing as you navigate around, those near-instant page loads, that's pretty sweet. Man, you're making me really sad about [laughs] just where the web is today. I hadn't really sat with just how pervasive ad tech and web surveillance are until this conversation.

REIN: Yeah, and it's also, React is almost a declaration that the REST manifesto was wrong.

CHRIS: [laughs] It's a bold claim.

REIN: I mean, React is – the original REST dissertation basically would make React-style SPAs impossible.

CHRIS: Yeah. One of the things Rich Harris from Svelte talked about in that talk he gave at Jamstack Conf is how there's this battle on the internet between the single-page-apps-are-awesome people and the no, multi-page-apps-are-better people – they're way less complicated, better for accessibility, et cetera – and admittedly, I tend to fall into that camp more often than not. He likened it to almost a bit of a false dichotomy where they both have really good points and they both serve important functions. Sometimes one is the right tool for the job over the other. So I absolutely have historically maybe come down a little bit too hard on the SPAs-are-always-terrible, never-use-them [chuckles] camp when they do sometimes have good uses.

But so, his whole talk was about this new term that he was trying to get going called transitional apps, #transitionalapps, that [chuckles] took the best of both worlds and allowed you to seamlessly move from one to the other, when appropriate, without having to just choose out of the box like, I'm going to build this, or I'm going to build that. I thought that was a really interesting approach that I hope we see mature a little bit more over the next year, or two because I think it has a lot of teeth and could do a lot of good for the web.

REIN: Yeah. Once again, it's the boundary zone between these two things where the interesting stuff happens, right? So HTMX, I think of it as federated multi-page apps: you might make multiple requests, but this one's just for this part of the page and that one's just for that part of the page.

JACOB: “Micro frontends” is a term I've heard for that.

REIN: Ooh. So the main difference is that micro frontends are an SPA thing and so, you have different subsites rendering different parts of the page, but they each render their own SPA-type thing. But what you have with HTMX—and this for a long time was GitHub's style—is: I want to click this button and when I click this button, it's going to make a request for a new HTML fragment, and then it's going to put that HTML fragment on the page.
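The pattern Rein describes can be sketched with HTMX's declarative attributes. The /messages endpoint here is hypothetical; the server is expected to respond with an HTML fragment rather than JSON:

```html
<!-- Clicking the button issues a GET request and swaps the
     returned HTML fragment into the #inbox element -->
<button hx-get="/messages" hx-target="#inbox" hx-swap="innerHTML">
  Load messages
</button>

<div id="inbox">
  <!-- The server's HTML fragment lands here -->
</div>
```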

JACOB: Tight coupling has its value sometimes.

REIN: And then you also have things like Phoenix LiveView, the Elixir framework, where it looks a lot like a single-page app, but is actually making tons of server push updates.

CHRIS: This might be a little bit of – I know they call it HTML Over the Wire, that Hotwire thing that Basecamp came out with a year or two ago, and it's also interesting, Basecamp politics aside, where you build your old-school monolithic multi-page app, and then you layer a light JavaScript client on top of it that simulates a single-page app, or progressively enhances it in some ways into one. I was really, really intrigued by the idea, but the more I played around with it, the more it pulled in some of the best aspects of both, but also some of the worst aspects of both, and it ended up being, in my opinion, this weird Frankenstein project that did neither one particularly well. It just didn't work for me. I'm sure for certain types of projects, it can be really useful.

But I think the takeaway for the show is that frontend engineering is hard and there are a lot of trade-offs you have to make no matter what. I love to sit in my ivory tower and postulate about this stuff, but I'm building really simple and really narrow apps that get by just fine as a multi-page vanilla JavaScript thing because they're not doing that much.

REIN: So, speaking of frontend development being hard: there is a particular way in which frontend development is becoming incredibly complex, and it is this movement away from client-server models, away from Shannon-style communication – I make a request, you give me a response; I make a request, you give me a response. This sort of serial communication is giving way to a form of communication that's called joint activity, which is where everything's happening all at once. I'm not making a request and waiting for a whole new page back. This part of the page is updating. That part of the page is updating. I'm typing over here. Just a whole bunch of stuff happening at the same time, and this is a paradigmatically different form of communication than request and response.

CHRIS: Do you have an example of that? I'm having a really tough time picturing what that looks like in my head.

REIN: So there's a book called Joint Cognitive Systems that introduces this stuff, if folks are interested, but think about incident response. During an incident, you're not synchronously causing things to happen and then getting the response. You have this person looking at this dashboard and you have that person on this machine doing this. It's all just happening at once and you're not blocking, waiting on every next piece of information. The information arises in the environment whenever it does and you have to react to it in real time. There's no guarantee that only one thing will be happening at a time; any number of things can happen at the same time.

JACOB: Yeah.

REIN: It's basically non-blocking – [crosstalk]

JACOB: Several people are typing.

REIN: Yeah, several people are typing is actually a really good example. You no longer have an expectation that you're in this synchronous, serial mode of communication with a single other entity. The entire environment is changing in whatever ways it needs to and you have to respond to all of it. So React apps are starting to become more like this, where the dashboards that you build today, that you use to respond to incidents, are like this. You've got 16 little widgets and they're all updating at the same time. Well, which one am I supposed to look at? Is that the one that shows me where the problem is, or is it this one?

CHRIS: Ah, this also kind of makes me wonder, not wonder, just think out loud. It sometimes feels like the things we build—and I'm admitting right up front, this is dumb. But it sometimes feels like the things we build are potentially more complicated than they need to be and I don't mean from the engineering under the hood, but there's a tendency to kitchen sink all the things like, if one is good, five is better and that's not always the case.

I think about, for example, Facebook, which has eight different things built into it, and would each of those things be better if it was its own standalone application that had a very narrow focus, potentially? That's just a really high-level throwaway comment that I think someone could very easily pick apart and point out all these examples of why it's stupid and wrong. But it also feels like if we didn't try to do this with everything we built, it would potentially alleviate a lot of the problems and challenges we have with all these moving parts and complexity. Admittedly, just a random thought that popped into my head, so not very well-developed in the slightest.

REIN: So it seems like we've organically moved into reflections, which is right on time.

CHRIS: Yes, indeed.

REIN: I think my reflection is, I don't think it's a coincidence, Chris, that you're, like you said, interested in both vanilla JavaScript and in privacy. We talked about it a little bit, but there are some deep connections between these two things, I think.

CHRIS: Yeah, absolutely.

REIN: And I think that the solution to one might be found in the other one and potentially vice versa. I think if we design – moving in the direction of vanilla JavaScript, I also think naturally moves us in the direction of increased privacy and maybe – [crosstalk]

CHRIS: Yes, potentially.

REIN: So the thing that I struggle with is how to motivate people to move in this direction, because a lot of people have a lot of different conflicting goals. They may be in contexts that make it difficult for them to move in that direction. If they work at a startup where selling user data is part of the business model, you're not going to get a product person on the same page with you about removing this tracker. It's not going to happen. Where are the actual levers that allow us to make progress in these directions?

CHRIS: Yeah. For me, because I've been thinking about this a lot, I think this is one of the reasons why I'm particularly excited about this new bit of tooling that I'm seeing come out. Because I am not personally big on lots of tooling for the things that I build, but I notice that a large chunk of the community is, and I think tools that make it easier to build things but also keep the cost to the user down, whether it's privacy or just the amount of shipped code, are a very good thing. So when I look at compilers like Astro and Svelte, or I look at that tool we were talking about earlier, Partytown, that keeps those third-party tracking scripts off the main thread, that's great.

I think the other lever here is browsers themselves and the platform itself and what gets baked in. I think we already talked a little bit about how I think as long as Google is the dominant player in the browser market, there's only so much we can really do there because it is very much against their corporate interest to do that. But having platform native ways to do the things we want to do in a way that's easy and painless, like that path of least friction is in my opinion, probably one of the more powerful paths forward.

JACOB: I have two reflections. The first is, I think the web did peak at LiveJournal [laughs] for lots of reasons.

The second is I'm thinking a lot about the software education industry and the whole space of new developers generally, and there is a lot of pressure in that spot that [inaudible]. It's all about single-page apps and showing that you can use “modern tooling,” which means React and probably lots of other complicated things that change every six months. I can't help but think about how that's actively shaping the web, and it's making me wonder what would be different if we were encouraging developers to think about what tool would be best for what job and how React isn't the right tool for many jobs. So, yeah.

CHRIS: Yeah. I strongly agree. Strongly agree. There's definitely this kind of perception that if you're not using React, you're not serious about what you're building and I think the education market plays a big role in that.

On my end, I think one of the big things that came out of this talk, that I was not actually expecting to go in that direction so it was really interesting, was just around the whole privacy angle and how difficult it really is to maintain that privacy on the web. Even with tools like VPNs and adblockers and stuff, like the platform itself keeps making it harder and harder and I just really wish that weren't the case.

REIN: I really enjoyed this episode.

CHRIS: Yeah, no, I guess the only other thing I would add is if people enjoy having these conversations, or just want to tell me how wrong I was about something, I have a Daily Newsletter over at gomakethings.com that may, or may not be of interest to you.

REIN: Awesome. Well, thank you so much for joining us again.

CHRIS: Thanks for having me. This was a lot of fun. I appreciate it.

Support Greater Than Code