Modern JS has made the web worse

Let's talk about that

Posted by Chase Q. Aucoin on July 3, 2020

That’s a bold statement

I know, right?! We’ve been told by gurus for a few years now that if you’re not doing it the NPM way, you and everything you stand for are complete and utter garbage. I work with hundreds of websites a week as part of my job, and I am going to let you in on a secret: most run like shit. This is bad for customer acquisition, bad for retention, bad for experience, bad for the brand, and ultimately bad for your dollars as an organization.

What is strange is that even five years ago, websites were loading and operating faster. So what happened? I’m going to explore one area of change in how we create web-based applications that I think is one of the largest culprits. There are others, but I’m going to focus on just one thing.

My, that’s a big library

To do any meaningful work in an application there is typically a lot of up front work that has to happen. To help make development easier we use frameworks and libraries to make much of that repetitive baseline effort easier and more reproducible.

Enter jQuery.

While not the first framework, certainly few frameworks since have had as broad an impact on the internet as jQuery. At its peak, 90+ percent of the internet had jQuery as a dependency, which is incredible! For the time period, jQuery was a pretty big overhead to put on top of a site, and delivering it was slow and costly to the site’s performance.

Enter the CDN

With websites having to serve traffic to locations all over the globe, and repetitive files like jQuery being served to the same people many times a day, the internet was wasting billions upon billions of kilobytes a day serving up the same content to the same person over and over. This is where CDNs, Content Delivery Networks, came in. The goal: centralize and geographically distribute static content across the globe so that reusability and speed could be high. It was, and it was glorious. 90% of the web delivered over half of its page load from offsite, and it was cached locally by the browser, so pages were snappy and load times were low. Think about it: your website with maybe a few hundred or thousand visitors a week got to leverage files that were in use by large companies with millions of requests a day, so there was a good chance your large JavaScript payloads were already cached.
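In practice that was nothing more than a shared script tag (URL shown for illustration; any widely used CDN URL works the same way):

```html
<!-- Served from a shared CDN: if the visitor has already fetched
     this exact URL on any other site, the browser serves it from
     its local cache instead of downloading it again. -->
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
```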

How we got modern JS build systems

So with pages loading faster than ever, interactivity started to increase, and the thought came: “Hey! What if, and I’m just thinking out loud here, we take this application and cram the whole thing into one giant JavaScript file?”

And so it was.

The world slowly moved away from individual pages being loaded from the server and started caching entire applications in the form of a JavaScript file on these CDNs. The frameworks, however, were still centralized, so some of the speed gains were still there, but the additional load began to slow down the time to first meaningful interaction.

This created other obvious problems, chief among them composability and bloat. It quickly became unwieldy to work on these files as they got larger and larger. This is when we started to see projects like CommonJS and RequireJS in 2009, and with them the notion of modules. Files began to be split apart so they could load asynchronously.
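The module idea those projects introduced can be sketched roughly like this. The tiny `defineModule`/`requireModule` registry below is an illustrative stand-in I made up for the real CommonJS loader; in real Node.js each module lives in its own file:

```javascript
// A minimal sketch of the CommonJS idea: each module fills in its
// own `module.exports` object instead of dumping globals on `window`.
// The in-memory registry is a stand-in for the real loader, which
// reads modules from separate files on disk.
const registry = {};

function defineModule(name, factory) {
  const module = { exports: {} };
  factory(module); // the module populates its own exports
  registry[name] = module.exports;
}

function requireModule(name) {
  return registry[name];
}

// "math.js" — exposes exactly one concern.
defineModule('math', (module) => {
  module.exports.add = (a, b) => a + b;
});

// "app.js" — pulls in only what it needs.
const math = requireModule('math');
console.log(math.add(2, 3)); // prints 5
```

The win was isolation: each file declared what it exported and what it required, so a build tool could figure out load order instead of the developer hand-sorting script tags.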

This too caused problems.

So now that files could be separated into sets of concerns, projects began to shift from one extreme to the other. Massive sets of small JavaScript modules were littered throughout projects, and developers cried in agony to the heavens, screaming “WHY, GOD, WHY MUST WE SUFFER SO!” And so new solutions were devised. On the server side there had been effective tools for years for building and testing large systems with thousands of moving parts; surely the same could be done for the web client. So build systems began to emerge, and with their emergence developers thought, “Well, this isn’t too bad. In fact, I bet we can build even more complex front ends now.”

And so it was.

Complicated complications

So with this new set of tooling, web applications became more and more complicated, pushing more and more of the burden of business logic to the front-end and tightly coupling it to web-based UI.

At this point new problems started to emerge. “We’re doing a lot of the same work over and over; there should be some kind of framework for this,” developers said, and there were. Many, in fact, and it became difficult to find and manage them, with some applications using 5, 10, 20+ different dependencies for their front end. Something had to be done, and something was done.

Enter Package Management

NPM (Node Package Manager) was not the first, but it is now the de facto standard for JavaScript dependency management, so I will be using NPM as a broad term for package management in this ecosystem going forward.

So with all these dependencies, an opportunity arose to begin to manage them. Around the same time, a runtime called Node.js was introduced to take advantage of how many JavaScript developers were out in the wild. Node.js was a keystone in the changing landscape of JavaScript practices and patterns. Engineers started using JavaScript for classical server-side work and creating new applications in the language, and developers thought, “Well, if it makes sense for the back-end, it MUST make sense for the front-end.”

It did not.

About a year after Node.js, NPM was introduced to help manage the wealth of packages being developed and distributed. These two technologies opened wide the door for building and packaging JavaScript.

Why does that matter?

Okay, now that we know how we got here, on to the crux of the conversation. With these build systems, developers started bundling these large frameworks into the deployables for their websites, and while, yes, they get minified and compressed, it is a lot of data for consumers to have to download for your website. Now that many websites no longer pull their core frameworks from central locations, consumers are forced to store hundreds of copies of effectively the same code over and over again. This also means your site is serving wasted bytes over and over again. This is bad for consumers, and this is bad for you. If the major consumers of the most-used frameworks all used the same CDN for delivery, and smaller websites that wanted those same frameworks followed suit, it would greatly reduce wasted traffic across the web and make everyone faster.

The organic evolution of the web has made it worse, not better, and has caused incredible amounts of waste. We need more holistic frameworks that incorporate more facets of the web experience; even if they are large, once a few key players serve them from shared locations, they would drastically speed up the web, letting less busy websites serve only what makes their structure and style unique, plus a bit of code for actions. This would further simplify development and improve deliverability of content.

I don’t know if we’ll ever get to this point, but it’s certainly something I’d like to see.

Thanks for reading! If you enjoyed this, please share it on your social media platform of choice.

If you ever need help making your web properties more performant or your development cycles faster, or you have an urgent problem that needed fixing yesterday, hit me up on LinkedIn; I’m always available for consulting work.


If you enjoyed this article, please consider subscribing to my blog for future updates. Your email address is only used to send you notifications of new articles.

Unsubscribe anytime using the link included in every email.