I have been really impressed with esbuild. It is lightning fast and trivially easy to configure. Instead of using a crappy React boilerplate that stitches together a thousand fragile, poorly-maintained npm packages, you just install one tiny binary implemented in performant, reliable Go.
I previously dismissed esbuild, thinking it was too limited, until I realized that the extreme performance difference allows for far, far more simplification than I had initially appreciated... and I'm not just talking about the lack of 10k npm dependencies.
When a build process is slow and we can't make it fundamentally faster, we start piling on workarounds that keep growing tentacles of complexity and optimisation challenges: incremental builds, caching, different pipelines for development vs. production. We've been living with this stuff for so long that it's easy to forget that when a full build is humanly instantaneous, you can just throw all of this complexity away; you don't need it... That's when I realized I could use esbuild: I don't need those "features".
esbuild changes the game. Before, we had a dynamic language with a compile step as slow as a C++ project's: all the disadvantages of a dynamic language combined with the disadvantages of a compiled language. In the browser you have no choice, but using that setup on the server with Node or on mobile with React Native was questionable. Now, with instant builds, you've got a dynamic language back - and one that mops the floor with Ruby and Python in terms of performance. You have just one language to hire for, all your developers can work on or understand any part of the code, and you can share code between server and client. The time for universal JavaScript is here.
Agreed. I added it to a large project for a client whose development builds were crawling, and it is far and away the easiest build tool to use, and it blows all the others out of the water when it comes to speed. We still use webpack for prod builds since we need to support legacy browsers (I know... but corporate), but development is blazingly fast.
I started extracting code from my project into a shared npm library. esbuild takes 70ms to bundle all files into a single index.js, then I also need to run tsc (TypeScript) to emit the declaration files, and that takes ~10 seconds. esbuild is over two orders of magnitude faster than tsc! Granted, they are not doing the same type of work, but still…
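For reference, this kind of two-step build can be sketched as below; the entry point and output paths here are assumptions about a typical library layout, not the commenter's actual project:

```shell
# Bundle the library with esbuild (the fast ~70ms step)
npx esbuild src/index.ts --bundle --outfile=dist/index.js

# Emit only the .d.ts declaration files with tsc (the slow ~10s step)
npx tsc --declaration --emitDeclarationOnly --outDir dist
```

Running both lets esbuild own the JS output while tsc only produces type declarations, so the slow tool does as little work as possible.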
I really love esbuild. It is amazingly fast, and my node_modules folder is so clean.
But not being able to transpile to ES5 makes it a hard choice to use in client-facing projects, where IE11 is not totally dead (yet).
It's doable to integrate with swc (another great tool on the list), but then the build chain is longer and there are more things to take care of (such as sourcemaps).
Even if IE11 is technically not EOL'd by Microsoft, they will stop supporting it in their 365 services on Aug 17 this year, so I'm curious what kind of businesses are still using it.
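A sketch of what the esbuild-then-swc chain described above might look like; the file names are illustrative, and it assumes an `.swcrc` with `jsc.target` set to `"es5"`:

```shell
# esbuild bundles fast, but it can only lower syntax to ES2015, not ES5
npx esbuild src/index.ts --bundle --sourcemap --target=es2015 --outfile=dist/bundle.js

# swc (via @swc/cli) then transpiles the bundle down to ES5
npx swc dist/bundle.js -o dist/bundle.es5.js --source-maps
```

The extra step is exactly where the sourcemap bookkeeping mentioned above comes in: each tool emits its own map, and they have to be chained correctly to point back at the original sources.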
I've been meaning to check out esbuild. Question to those who have used it: what's the catch, other than flexibility? If you've got a standard TypeScript/JSX/bundling setup, is there any disadvantage at all to switching to esbuild?
Flow + OCaml is a great example of this approach failing IMHO. I ran into tons of issues that were difficult to debug myself, and things like the regex format and config file conventions being different were also problematic. Compile times are also annoying.
Something undervalued is the debuggability of using a single language. There are so many times I need to debug some tooling, and being able to just open the stack trace and place a breakpoint/console.log is incredibly valuable. I do it all the time for webpack/babel.
Having to switch to a new toolchain to debug something is tedious, and it's often impossible to reach the same convenience as JS-based tools.
I think the future is taking small pieces of JS-based tools and adding native extensions to speed up parts of them.
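For what it's worth, the debugging workflow described above usually amounts to running the tool's JS entry point under the Node inspector; the path below is the typical npm layout, so adjust it for your project:

```shell
# Pause on the first line of webpack itself, then attach Chrome DevTools
# via chrome://inspect (or attach a debugger from your editor)
node --inspect-brk ./node_modules/webpack/bin/webpack.js --config webpack.config.js
```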
> being able to just open the stack trace and place a breakpoint/console.log is incredibly valuable. I do it all the time for webpack/babel.
This is something I achieve by using multi-language-aware build systems and/or vendoring. I can then add a log statement to a compiler implemented in C which emits binary shader code that in turn gets parsed by Rust, and immediately see results after running the usual test target. No need to limit myself to a single language.
> There are so many times I need to debug some tooling, and being able to just open the stack trace and place a breakpoint/console.log is incredibly valuable. I do it all the time for webpack/babel.
That sounds more like an indictment of Webpack than anything else.
>The other assumption going away is that JavaScript tools must be built in JavaScript. The potential for type safety and 10x-100x performance speedup in hot paths is too great to ignore...
The thing is, what the "best" tool is, is relative to your requirements. I much prefer "Use the right tool for the job", as it makes it less cargo-culty and emphasizes that it's not always right.
For example, if your build process is very slow and you for some reason have to have it be very fast, esbuild is probably the right tool.
But if instead your goal is to build a large community of developers who can change their own tooling, webpack is probably the right tool as it's by JS developers, for JS developers.
Overall I agree with you, just nitpicking a bit I guess :)
Exactly - when it comes to transpilers, it's unlikely that you can eat your own dog food anyway. As in, compiling/transpiling/whatevering webpack with webpack for webpack development would make no sense as far as I can tell.
> compiling/transpiling/whatevering webpack with webpack for webpack development would make no sense as far as I can tell.
Could you explain further why you think so? Dogfooding tooling and bootstrapping runtimes with themselves is a fairly common practice, at least for discovering issues, bugs and future features for your own project.
Supposedly, since webpack is written in JS, they would also benefit from using newer features than what's available in stable NodeJS: they could use webpack to transpile future ES versions to compatible code so it runs on more NodeJS versions.
Then the fact that those features can be used for frontend or backend code matters less, as new JS features are usually not geared to a specific environment but to JS as a whole.
webpack is web-oriented though. While what you are saying is right, for a project which aims to keep the web as its focus, this would perhaps not be the best way forward. When we also look at the codebase, they are not using webpack for webpack development; they use CommonJS imports instead. The "modern JS" part of webpack comes from Babel, and Babel uses TS (though that's still in progress).
So dogfooding here isn't used by major projects, and I think we can see a pattern.
For a formatting tool I would agree with you more, but then again, wouldn't formatting the code of the tool itself be a distraction? Maybe, maybe not - it depends - but the benefits are less clear for tooling.
It does make sense with TypeScript, as it's a language. It does not, however, make sense with "tooling" (IMHO - there may of course be obvious stuff I'm missing).
Tooling may also be written in a transpiled language, but again, as a developer who's very experienced with JS, I don't see a big win there.
Someone mentioned Flow + OCaml as a bad example and how it doesn't work out for them, but that's more than tooling. At that point you are not using tools written in other languages to build JS, you are just using other languages.
Flow failed because Microsoft enabled TypeScript by default in VS Code. It had nothing to do with OCaml in my opinion.
However, TypeScript is now suffering from long compile-times because it targets JS. If it was implemented in a language with more control over threading etc. I think it could be faster.
What tooling are you using that is slow? Sometimes it's all down to configuration and "you could do it that way but it takes eons..."
For example, on a recent project I added prettier alongside eslint and tried eslint-plugin-prettier; however, it was very slow (multiple minutes for an eslint run). I discovered it's basically a deprecated approach, and the right way is to run eslint on its own (with a ruleset so it ignores what prettier handles) and prettier on its own. Then each command runs lightning fast and there is no slowdown. Their docs even go into this issue under Notes on this page:
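The "run them separately" setup might be sketched like this; the commands are assumptions about a typical project, not the exact one described:

```shell
# Lint only - eslint-config-prettier (added to the ESLint config's "extends")
# disables the stylistic rules that overlap with prettier
npx eslint .

# Format checking is prettier's job alone
npx prettier --check .
```

Keeping the two tools in separate commands means neither one has to run the other as a plugin on every file, which is where the slowdown came from.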
Let me put it like this: your second paragraph would take me 2-3 hours (as a backend dev). Does that put things into perspective?
It's never about "I am an expert and can do it quickly".
What frontend devs seem not to realize is that backenders and sysadmins occasionally have to do something quickly on the frontend. And when I visit webpack's site, copy-paste a config, adjust a few variables and run a command... yeah, it could be faster. Much faster. Not to mention that it rarely works the first time around, and the error messages absolutely do not tell me what I'm doing wrong.
---
This isn't intended to flame anything or anyone, mind you.
---
But please understand that for many of us the frontend tooling is a bitter disappointment: we intend to do a quick tweak and move on, it of course disagrees, and I lose half my workday. And as you can guess, the next time it will take me half a workday again, because enough time will have passed that I'll have forgotten those lessons. Why can't the thing just be intuitive?
Can I do it better after I become an expert? Absolutely.
But that's the whole point: I don't want to become an expert. I want to install the tool, execute 2-3 incantations and get my work done. And most of the web dev tooling fails miserably at that task.
(btw, Git is guilty of the same for like 90% of its commands.)
> .. occasionally do something quickly on the frontend. And when I visit webpack's site and copy-paste a config and then adjust a few variables and run a command.
This fails quite often due to subtle differences not only in major but also minor versions of tools such as webpack or babel and because there are so many variables (nodejs version, npm modules etc) in that equation.
Not my problem, is it? I have to use the same tools as the rest of my team and those tools are as fickle as a teenager's mood.
There are ways around those problems, but nobody from the core team has even attempted them (one example: have a stable set of sub-commands that get translated to the modified-in-every-version internal APIs; is that really so hard? almost every commercial dev has to do it).
But no, let's throw the devs under the bus every time we change our minds.
See, that's why a lot of people hate the JS ecosystem, and that's why this perception will not change anytime soon.
All languages have idiosyncrasies. I'd be an idiot to hate on JS in particular, because literally every language I've ever used has strange and weird deficiencies. But that's not it. The ecosystem and the tooling (and for some languages, like Erlang/Elixir, the runtime too) make all the difference to novices or casual dippers.
If that experience is bad, then the JS ecosystem will remain the same old meme of youngsters experimenting at the expense of everybody else like it has been viewed for 6+ years now.
---
Don't think I am blaming you for anything; I just ranted and vented. A lot of devs lack the insight to look from the outside and say "hey, if I were coming here for the first time I'd be puzzled by this" -- and that grows into a big problem when enough time has passed.
It came out the same year as NodeJS as well, so the whole JS ecosystem we know nowadays didn't even exist yet. That is, using JS for a tool like that hadn't crossed anyone's mind yet.
Closure compiler is much older than that though, it was written for Gmail (and used by most Google web properties). Closure is just the name of the OSS release.
If you want to create these kinds of tooling, does that mean you have to create your own parser first? That seems like duplicated effort across multiple environments.
Is there a way to unify these? I.e., just one parser that all the languages that want to implement JS tooling could share.
I know this sounds more efficient, but this kind of attitude has serious downsides too. There is no "best" of each type of thing - it's subjective - so it's not necessarily duplicated effort to write your own parser. If you always use 3rd-party libraries, you also have less of a holistic view; it's hard to see different ways to do things. This is basically why the author of esbuild didn't use something like the official TypeScript compiler: it simply has different goals and could never be as fast.
What a philosophical take on something so technically strict -- namely language parsing. Yes, there absolutely should be one best way of doing it. We should get it done and finally move on to more productive activities.
I disagree, but Evan Wallace (who wrote esbuild) puts it better than I could:
> Everything in esbuild is written from scratch.
> There are a lot of performance benefits with writing everything yourself instead of using 3rd-party libraries. You can have performance in mind from the beginning, you can make sure everything uses consistent data structures to avoid expensive conversions, and you can make wide architectural changes whenever necessary. The drawback is of course that it's a lot of work.
> For example, many bundlers use the official TypeScript compiler as a parser. But it was built to serve the goals of the TypeScript compiler team and they do not have performance as a top priority. Their code makes pretty heavy use of megamorphic object shapes and unnecessary dynamic property accesses (both well-known JavaScript speed bumps). And the TypeScript parser appears to still run the type checker even when type checking is disabled. None of these are an issue with esbuild's custom TypeScript parser.
> There are a lot of performance benefits with writing everything yourself instead of using 3rd-party libraries.
In my production experience correctness always trumps speed.
And language parsing is not a trivial task. Not every programmer can write a program to do it well (and fast). I know that so far I am not one of the programmers who can, for example.
I prefer to defer to the expertise of those who do it for a living and/or their academic pursuits (as long as they write clean and maintainable code).
So while the general point you quoted can absolutely be viable in many situations -- I've written stuff from scratch no small number of times, so I can definitely relate -- the original point here was language parsing, where I believe your (and Evan's) take prioritizes idealism over reality. And the reality is, most programmers aren't good at this stuff, and that's okay.
As for him and esbuild, I have only huge respect. But he's one of the few, not one of the many. He should be doing that, and he did it very well (I use esbuild and adore it).
> So while the general point you quoted can absolutely be viable in many situations [...] the reality is, most programmers aren't good with this stuff, and that's okay.
I understand where you are coming from, I'm not suggesting this is some widely applicable principle (perhaps the emphasis in my previous comment was suggestive of that).
Absolutism never works in this arena: code reuse and libraries will always have their place and can have their own advantages (e.g. shared correctness), but sometimes tearing it all down and writing from scratch also has its place, and I think it is undervalued by comparison - although I agree it takes a certain kind of person, with not only enough experience and wisdom but also a level of care and attention to detail that few possess.
A quick search for "rust", "kotlin" and "cpp" on npmjs.com shows a lot of packages. Not all are language tools, but I see stuff like linters, interpreters, compilers and lots of stuff related to WASM compilation and codegen.
It’s probably selection bias (as I tend to not enjoy working with node.js software, and would probably actively pick alternatives), but I can’t think of any in my bubble (Go, Rust, Python, distributed systems, ...).
JS is the dominant language on the web, but languages which can compile/bundle/etc. faster than JS can will make for a more pleasant developer experience. Especially when working on a large project.
Interesting list, but I don't quite get the idea. Is it implying JavaScript shouldn't be used to work on JavaScript, just for the sake of it? To me it almost sounds as arbitrary as "JS tooling not including the letter T in the project's name". What's the motivation of this list?
Nope, the list is implying "here is a resource of tooling built for JS but NOT made in JS"; not sure where you got the other implication from.
Just as "awesome lists" about security are not implying that you should hack other people, and so on.
I think the motivation is to show that just because you want to write a tool for the JS ecosystem, you don't have to use JS if you don't want to. And I like the idea, I think more languages and their communities should cross-pollinate, we can all learn a lot from each other, even though some like types and some do not.
They are much faster and more memory-efficient than tools written in JavaScript.
Part of it is the language (e.g. Rust is obviously much better than JavaScript for algorithmic code) and part is the fact that someone using another language is likely motivated to do a good job.