franciscop · 4 years ago
I really hope Bun at least becomes a big player. Node.js has been very sluggish about implementing new features, to the point where most people just don't trust it much anymore on this front. Deno's direction is interesting but IMHO doesn't align well with devs' interests. Bun, by contrast:

> Out-of-the-box .env, .toml, and CSS support (no extra loaders required).

This makes a lot of sense. Node.js should be "sherlocking" (integrating into the core) the most popular/common features devs use. It's crazy to me that, after 10+ years of `.env` being a common abstraction for managing environment variables, you still need to install a separate library because Node.js doesn't know how to read the file itself. Same with YAML or TOML.
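To illustrate what "knowing how to read the file" would mean, here's a minimal sketch of the kind of parsing that dotenv-style libraries do; `parseEnv` is a hypothetical helper for illustration, not an actual Node.js API:

```javascript
// Hypothetical sketch of .env parsing: KEY=VALUE lines, with
// blank lines, "#" comments, and optional surrounding quotes.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks/comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    const key = trimmed.slice(0, eq).trim();
    let value = trimmed.slice(eq + 1).trim();
    value = value.replace(/^(['"])(.*)\1$/, "$2"); // strip matching quotes
    vars[key] = value;
  }
  return vars;
}

const parsed = parseEnv('PORT=3000\n# a comment\nNAME="my app"\n');
```

Real dotenv implementations handle more edge cases (multiline values, escapes, expansion), but the core is about this small, which is the point of the complaint.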

Same with features like fetch(), which took 5 years from the issue being opened to being implemented in Node.js (and not for lack of collaborators, but for lack of willingness to merge it).

I'm happy though that Node.js is finally approving web features, and that the move to ESM has been finished, but they are moving so slowly that I can def see a focused small team (might be too much for one person) overtaking Node.js.

andrew_ · 4 years ago
There's value to a system where everything is bolt-on as well.
t0astbread · 4 years ago
I have zero experience designing systems like that but I think the sweet spot would be a standard library that's modular and detached from the core. You could pull it in after setting up the core, replace it or upgrade it in parts but it's still maintained by one entity with a consistent level of quality.
franciscop · 4 years ago
Sure, but I'd argue that value is marginal: 90%+ of people building websites want a straightforward HTTP server that they can build and customize with plain JS, and for some deeper cases, yes, that bolt-on system is useful as well.
davnicwil · 4 years ago
The problem with introducing something like .env support out of the box is breaking backwards compatibility in a silent way, i.e. without any code changes, and in a way that'd still run the same in dev, pass tests on CI etc., and then just affect prod.

I know this shouldn't ever happen, but you can well imagine plenty of legacy/badly configured setups where it would. More pertinently, where it would no matter how loudly you warn about it in release notes etc.

When you're as mature and widely used a platform as Node, you just can't risk things like this, unfortunately, no matter how much more convenient it would be for the vast majority of users.

franciscop · 4 years ago
That would be easily solved by adding a Node.js version or range to your package.json that specifies where the program is supposed to run, and treating major versions as breaking. There's a balance to be had here; avoiding breaking anything at all costs is surely not balanced either, and it's starting to add a lot of cruft that will need to be maintained long-term.

The solution is not to "warn loudly" here: specify a Node.js version in your package.json, and your code will continue working on that version. Upgrade the Node.js engine version, and then it's on the person upgrading to make sure that nothing breaks. That's how virtually all platforms work (except the web itself, but that's not "versioned", so it's fair).
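For reference, package.json already has an `engines` field for exactly this kind of declaration, though by default it's only advisory in npm (it fails the install only with `engine-strict` enabled; some other package managers enforce it):

```json
{
  "engines": {
    "node": ">=18 <19"
  }
}
```

The argument above is essentially that Node.js itself could honor this range and keep old behavior for code that declares an old major.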

It's a bit more troublesome when changing core packages and considering dependencies, but the same could apply there.

atwood22 · 4 years ago
Apple breaks stuff all the time when updating their APIs. If you hold yourself to decisions made long in the past, then your platform will undoubtedly become crusty and stale. Maybe Node maintainers are ok with that, but it shouldn't be surprising when people go to greener pastures.
anonydsfsfs · 4 years ago
Node already has a solution for this: core modules are prefixed with "node:" to distinguish them from third-party modules. https://fusebit.io/blog/node-18-prefix-only-modules/?utm_sou...
inglor · 4 years ago
Node is a small team too, but what features in particular are we missing?
franciscop · 4 years ago
I explained some right in my comment: currently missing are things like native support for .env, .toml and .yaml. Support for ESM and fetch() was missing for way too long until recently.

Currently I don't see big gaps between web and Node anymore, but I still find the APIs a bit messy: e.g. to work with files I have to import all three of `node:fs`, `node:fs/promises`, and `node:path`. It'd be nice if at least `node:fs/promises` ALSO included the non-promise APIs from `node:fs`, like `createReadStream()`, to avoid having to import another core module for that. Also, WebCrypto should be exposed as a global `crypto` variable, like `fetch()` and `URL` are, for compat with the browser.

Just some ideas/examples off the top of my head; not much research was put into this, so def take it with a pinch of salt.
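To make the multi-module grumble above concrete, here's a small sketch of promise-based file work plus a read stream, which today pulls in `node:fs/promises`, `node:fs`, and `node:path` side by side (`node:os` is only used here to get a temp directory):

```javascript
// Promise APIs live in node:fs/promises, but createReadStream
// does not, so a mixed workflow needs both, plus node:path.
import { mkdtemp, writeFile, readdir } from "node:fs/promises";
import { createReadStream } from "node:fs"; // not re-exported by node:fs/promises
import { join } from "node:path";
import { tmpdir } from "node:os";

const dir = await mkdtemp(join(tmpdir(), "demo-"));
await writeFile(join(dir, "a.txt"), "hello");

const names = await readdir(dir); // promise API from node:fs/promises
let text = "";
for await (const chunk of createReadStream(join(dir, "a.txt"), "utf8")) {
  text += chunk; // stream API from node:fs
}
```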

stephen · 4 years ago
This is probably naive, but I'd love to see one of the Node.js competitors, i.e. Bun or Deno, innovate on the "it takes ~forever to load node_modules every time I run a test/script/etc" problem.

I.e. Bun is improving "bun install" time for projects with large node_modules, and that's great, but what about, after it's on disk, having to parse+eval all of it from scratch every time I run a test?

As admitted, this is a very naive ask due to the interpreted/monkey-patching nature of the JS language, but I'll hand wave with "something something v8 isolate snapshots" + "if accomplishing this requires restrictions on what packages do during module load, deno is rebooting npm anyway..." and hope someone smarter than me can figure it out. :-)

qbasic_forever · 4 years ago
Deno doesn't use node_modules so you won't have a problem there. You specify each dependency as an import URL (either in the code or in an import map) and it grabs them directly (also using a local cache directory). There's no need in the deno world to even use a package manager.
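For example, a Deno import map might look like this (the esm.sh URL is just one illustrative CDN choice):

```json
{
  "imports": {
    "lodash": "https://esm.sh/lodash@4.17.21"
  }
}
```

With that map in place, `import _ from "lodash"` in your code resolves to the pinned URL, and Deno caches the fetched module locally.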
stephen · 4 years ago
Ah yeah, you're right that Deno doesn't have a node_modules, but AFAIU it still downloads dependencies to "somewhere on disk" and then, every time your code runs, it re-evals all of them from scratch.

So, admittedly I was using "node_modules" as a shorthand for "the code that makes up my dependencies", and that AFAIU Deno has not implemented this "use a v8 snapshot to cache preloaded/pre-evald dependencies" optimization.

I.e. I want something like:

https://danheidinga.github.io/Everyone_wants_fast_startup/

ch4s3 · 4 years ago
> There's no need in the deno world to even use a package manager

That doesn't sound like a thing I would want.

t0astbread · 4 years ago
Bun and Deno both look really impressive and I hope development in this space continues at the massive pace it's happening right now. But I've tried using both on a small greenfield project recently (nothing fancy - just a static website with some custom build logic made by stitching together some templating engines and libraries) and I ended up reaching for Node again.

As mentioned in the article Bun still has a lot of incompatibilities with Node. For example, as far as I could tell scripts designed to be run with npx don't work right now. And I'm not sure what to make of binary lock files. Sure, it's more efficient but how do you know a `bun add` didn't change something it absolutely shouldn't have changed?

Deno has really fast TypeScript compilation and file watching built in which is awesome. I always loathed using tsc together with nodemon or some custom file watcher thing. And permissions are great. But the package index is more or less the most valuable asset of Node and Deno doesn't provide (and doesn't aim to provide) compatibility with most of it. Also, while I do like the idea of using URLs for dependency management, the way Deno does it with import maps and lock files feels extremely convoluted and too easy to mess up for me. NPM is more intuitive.

hayd · 4 years ago
" Deno doesn't provide (and doesn't aim to provide) compatibility with most of it"

which packages did you have issues with/need that didn't work? my understanding was with node compat flag deno supported quite a bit...

t0astbread · 4 years ago
At the time it was the "tailwindcss" package IIRC. I also just tried it again on the latest iteration of that codebase and ran into some import problems, seemingly related to "markdown-it" (version 13.0.1). First I got "SyntaxError: Cannot use import statement outside a module"; when I changed the package.json file to add `"type": "module"`, as requested in the error message, I got a panic. If you want, I can open an issue for that.

Additionally, I'm currently using at least one function in "fs/promises" that doesn't seem to be supported yet ("opendir") and the "vm" module to evaluate some (trusted) JS.
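For context, this is the `opendir` usage in question: a minimal sketch that async-iterates a directory handle from `node:fs/promises` (the temp-dir setup is just scaffolding for the example):

```javascript
// fs.promises.opendir returns a Dir handle that can be
// async-iterated entry by entry (each entry is an fs.Dirent).
import { mkdtemp, writeFile, opendir } from "node:fs/promises";
import { join } from "node:path";
import { tmpdir } from "node:os";

const dir = await mkdtemp(join(tmpdir(), "opendir-demo-"));
await writeFile(join(dir, "x.txt"), "");

const seen = [];
const handle = await opendir(dir);
for await (const entry of handle) {
  seen.push(entry.name); // the handle is closed automatically after iteration
}
```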

Most of those issues can be worked around, I just decided to go with Node instead for the time being.

matesz · 4 years ago
Some of Bun's server speed can be attributed to uWebSockets, which has node bindings [1]. Of course this is just a small detail, since Bun is a huge project.

[1] https://github.com/uNetworking/uWebSockets/discussions/1466#...

joshxyz · 4 years ago
alex got good points here as usual.

it's been a long time since i've used node's built-in http module since I've found uwebsockets.js.

matesz · 4 years ago
Good to hear there are others successfully using uwebsockets.js! We are using it in a very early-stage project, not in production yet. Can you share your experience using it?
dgreensp · 4 years ago
Installing a gigabyte of NPM packages (or whatever it is that's being sped up) a little bit faster, but a little bit worse, is not that interesting IMO.

I actually use Deno right now because I only use a handful of third-party dependencies in my project, and I can use Deno’s bundler instead of webpack (even though that’s not its intended use). I’d rather simplify away the stuff that’s slow and complicated rather than making it faster and more complicated (less correct, less compatible).

endorphine · 4 years ago
When talking about JavaScript performance, aren't we essentially comparing Bun to V8?

If so, wouldn't it be very hard to beat V8's performance, given that there has been so much engineering effort for many years towards making it fast?

Or are we taking into account the speed of the package managers and bundlers as well?

Disclaimer: I haven't seen the benchmarks, if any.

jesse__ · 4 years ago
It's _mostly_ talking about the speed of the surrounding tools. They wrote a package manager that speaks NPM (presumably in C++ or zig instead of JS) which unsurprisingly completely dominates NPM in performance.

The other stuff he benches is built into the runtime ... HTTP requests, copying files, and a webserver Bun ships with. Presumably the Bun team wrote these tools with performance in mind, but the article doesn't compare features. I'm skeptical the webserver in particular is at feature parity with the ones he benched against, which makes the numbers look pretty watery to me.

It does not address the runtimes of JavaScriptCore (WebKit/Bun) vs V8 (Node/Deno) which, as you pointed out, are probably very similar.

ksbrooksjr · 4 years ago
For the simplest benchmarks the performance disparity might come down to the difference between JavaScriptCore (which Bun uses) and V8 (which Node uses), but for anything non-trivial there's a significant amount of overhead introduced by the Node runtime.

Someone wrote a barebones V8 wrapper called just-js, which is based on V8 just like Node, but it crushed Node in the TechEmpower benchmarks [1].

[1] https://www.techempower.com/benchmarks/#section=data-r21&tes...

senttoschool · 4 years ago
>When talking about JavaScript performance, aren't we essentially comparing Bun to V8?

Bun is powered by JavaScriptCore, WebKit's JavaScript engine. WebKit is developed by Apple. Safari typically outperforms Chrome on benchmarks.

xtian · 4 years ago
Benchmarks are shown on the project’s homepage
SahAssar · 4 years ago
Bun is currently lacking workers, which at least for me is a dealbreaker. I'd also think it'd be interesting to compare against justjs when it comes to speed, since that is a very small wrapper around V8 (much smaller than node/deno) and manages to score extremely high on techempower benchmarks (top 25 on all, first/second spot on a few): https://just.billywhizz.io/blog/on-javascript-performance-01...
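For reference, this is the kind of worker usage that was the dealbreaker; a minimal sketch with Node's worker_threads, using an eval'd inline worker (inline workers run as CommonJS, hence the `require`):

```javascript
// Spawn a worker thread and await a single message from it.
import { Worker } from "node:worker_threads";

const worker = new Worker(
  `const { parentPort } = require("worker_threads");
   parentPort.postMessage(21 * 2); // runs on a separate thread`,
  { eval: true }
);

const result = await new Promise((resolve, reject) => {
  worker.on("message", resolve);
  worker.on("error", reject);
});
```

In a real project the worker body would live in its own file and be passed as a path/URL instead of an eval string.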
tiffanyh · 4 years ago
Looks like Just-JS author is looking into swapping V8 for JSCore.

I’d be super interested in seeing just how much faster Just-JS would get using JSCore (like Bun uses).

https://Twitter.com/justjs14/status/1557856790897106944#m

SahAssar · 4 years ago
Interesting! A small wrapper switching the engine might be the most reasonable "real-world" benchmark there is.
eatonphil · 4 years ago
Just to clarify, Bun isn't a wrapper around V8. It's a wrapper around JSC.

I don't think Jared's benchmarks were benchmarking Bun and Node so much as V8 and JSC.

I don't think benchmarking Bun against justjs is going to change much.

SahAssar · 4 years ago
Considering the performance difference between node, deno and justjs I don't think that's right.

There is clearly a big difference between different wrappers in how they handle different things. For example, some of them delegate most of HTTP handling to a native lib, while IIRC Node does a lot of that in its JS stdlib.

jesse__ · 4 years ago
FWIW I read it as the opposite. I think most of the stuff he benched was explicitly not benching V8 against JSC. This is all me reading between the lines .. maybe you know more about JS runtimes than I do and can confirm/deny my theories?

Package manager perf: the Bun team (presumably) wrote a C++ or Zig package manager that's integrated and speaks NPM. I guess you could argue that this benches their fast one against npm's V8 runtime, but... that's a bit of a stretch for me.

Copying large files: I'd be surprised if any of the benched distributions rely on the javascript runtimes to copy files on disk.

HTTP Requests: Maybe distributions actually call into JSC/V8 for these .. I have no idea. I'd guess not, but this one sounds the most plausible case for "benching V8 against JSC".

verdverm · 4 years ago
What is up with Bun's binary lockfile? How does one check dependency versions and warn of security vulns? How can someone think a binary format is a good idea?

(edit): https://github.com/oven-sh/bun#why-is-it-binary (though this does not answer the last question, and only partially the second)

saghm · 4 years ago
Honestly, I can sort of see it making sense. The only time I need to look into my lockfile is to see the exact version of something I'm pulling in, and generally I think I would be fine doing something like `lock-file-tool --show-me xxxx`. In Rust, there's a `cargo tree` command to see the tree of your exact resolved dependencies for when you want to see the whole thing expanded, so I generally don't use the lockfile for that anyhow. I certainly don't ever update my lockfile by hand, so there'd be no loss of usability in that regard.

That said, I'm not super convinced that performance would necessitate this; if parsing the text file was really that slow, I think you could instead have a separate binary file created whenever the lockfile is changed that has a serialized representation of the lockfile along with a checksum to ensure that it's not out-of-sync, then hash the lockfile before using the binary representation. I guess it's possible that if the tooling is brittle or people try to edit their lockfile by hand, this might end up detecting an out-of-sync lockfile more often than not, but at that point I think the issue isn't really with the lockfile format.
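A rough sketch of the checksum scheme described above (the function names and cache shape are made up for illustration): hash the text lockfile, store the hash alongside the serialized parse, and only trust the binary cache when the hashes match:

```javascript
// Keep the human-readable lockfile authoritative; cache a
// pre-parsed form keyed by the text file's content hash.
import { createHash } from "node:crypto";

function hashLockfile(text) {
  return createHash("sha256").update(text).digest("hex");
}

// On install: serialize the parsed lockfile with its source hash.
function buildCache(lockfileText, parsed) {
  return { hash: hashLockfile(lockfileText), parsed };
}

// On load: use the cache only if the text lockfile is unchanged.
function loadFromCache(lockfileText, cache) {
  if (cache && cache.hash === hashLockfile(lockfileText)) {
    return cache.parsed; // fast path: skip re-parsing the text
  }
  return null; // out of sync (e.g. hand-edited): fall back to re-parsing
}

const text = 'left-pad: "1.3.0"\n';
const cache = buildCache(text, { "left-pad": "1.3.0" });
```

The out-of-sync case only hits people who hand-edit the lockfile, which, as argued above, is rare anyway.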

bduffany · 4 years ago
There's an option to output a yarn-compatible lockfile. In practice, I think this means you'd need a branch protection rule to disallow a change to the binary lockfile without updating the yarn lockfile. I'm not sure that complexity is worth the performance gain of the binary format, personally. I think Bun should have an option (maybe in bunrc) to always use the human-readable format, though that detracts from the "batteries included" nature a bit.