ogoffart · 2 years ago
I'm interested in the WebGPU feature.

With Slint [1] we're working on a framework that lets you build a desktop GUI in JavaScript/TypeScript without bringing in a browser/webview. Currently we do it by shipping native binaries through napi-rs so we can bring up a window using the platform's native API, and then we use some hacks to merge the event loops.

But if Deno supports bringing up a window directly, this means we could just ship wasm instead of a native binary for every platform. I also hope the event loop integration would be simplified.

Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)

[1] https://slint.dev

Edit: I got excited a bit too early. The WebGPU feature doesn't include an API to launch a window. One still needs to rely on an extra library binary.

FrostKiwi · 2 years ago
Big fan of custom rendering approaches, but wouldn't such a design system sidestep any and all accessibility tools? There would be nothing for a screen reader to hook into; text and interfaces would be neither native nor DOM-based.
pygy_ · 2 years ago
Slint uses https://github.com/AccessKit/accesskit to provide cross-platform a11y.
crowlKats · 2 years ago
It actually does: it's not part of the WebGPU API itself, but it's separate: https://deno.com/blog/v1.40#webgpu-windowing--bring-your-own...
CyberDildonics · 2 years ago
This doesn't really seem related; it seems like you're saying "maybe we will use this, here is a link to our commercial GUI library".

Also, why would you want to make a GUI even more bloated by integrating a browser to get WebGPU and WebAssembly (and why WebGPU instead of just WebGL)? That might be easy for library makers, but why would someone want a giant library to make a GUI when they already have Electron, if they don't care about 350MB binaries just to pop up a window?

themerone · 2 years ago
They are talking about using WebGPU for rendering to the screen without going through a browser.

WebGPU and wasm without a web browser would be a nice way to distribute portable sandboxed code.

The reason for WebGPU is that it has semantics closer to modern APIs like Metal or DirectX.

PoignardAzur · 2 years ago
Have you talked to the Dioxus people recently?

They're working on a project called Dioxus Blitz; from what I'm told, they're trying to implement a minimal browser target that provides some basic DOM-like features and renders with wgpu.

It's not exactly what you're hoping for, but you might find common ground.

https://github.com/DioxusLabs/blitz

(Also, the Linebender project is working on Masonry, with FFI as a medium-term goal.)

jkelleyrtp · 2 years ago
Our v2 of Blitz is using Firefox itself to resolve CSS and Google's advanced WGPU renderer for high-speed text and vector graphics.

https://github.com/jkelleyrtp/stylo-dioxus

tmiku · 2 years ago
I poked around your website because I was curious if the name was a band reference. Love the Spiderland photo of the staff!
user3939382 · 2 years ago
I totally get the idea of bringing JS into new environments because it is so ubiquitous. On the other hand, I hate the language and wish we could replace it in the browser, the opposite direction. I'd be much happier with ClojureScript or PureScript or something being the standard with the ecosystem to go with it.
kybishop · 2 years ago
Typescript takes away a significant amount of the pain for me; the only hold up after that was getting an environment set up to compile it. Deno supporting typescript without any configuration is incredible.
tracker1 · 2 years ago
So use WASM with the language of your choice and best of luck to you with that.
dingdingdang · 2 years ago
Would love to see the compile situation fixed - the generated executables are ~90MB+ at this stage and do not allow compression without erroring out. Deploying a la Golang is not feasible at that size, but could well be down the line if this dev branch is picked up again!

The exe output grew from ~50MB to ~90MB between 2021 and 2024: https://github.com/denoland/deno/discussions/9811 which means Deno is worse than Node.js's pkg solution by a decent margin.

bartlomieju · 2 years ago
Hi, Bartek from the Deno team here. We are actively looking into improving this situation and from initial investigations we were able to shave off roughly 40% of the baseline size, with more improvements possible in the future. Stay tuned for upcoming releases.
dingdingdang · 2 years ago
Tuned I am, happy to hear this is getting attention. Improvements in this domain would also enable Deno to be a more serious contender in the App space opened up via https://github.com/webui-dev/deno-webui and others.
wvh · 2 years ago
Currently the generated binary is not static anyway, so you still need some parts of the system installed to run code. To be more precise, you can't use a "from scratch" container image base, but need to use something that at the very least has libgcc installed, such as the distroless "cc" image (gcr.io/distroless/cc).
brundolf · 2 years ago
I'm not sure using deno compile as a way of deploying to a controlled environment has much benefit anyway. Unlike some other languages/runtimes, only a single system dependency is really needed (a new-enough deno installation) to run your code

In my view, deno compile is more about shipping command line tools to people with all sorts of personal environments (which may not have deno at all)

tracker1 · 2 years ago
I've been really happy with using Deno as a general scripting runtime... A shebang at the top and external dependencies are loaded to a shared path on run.

I do wish there was support for Linux distributions based on musl (Alpine) directly for smaller containers.

In general I like the Deno approach better than Node. Would be cool to see the UI tooling fleshed out. A Material- or Fluent-based component library where Deno can be used like Flutter would be very cool indeed.

ijustlovemath · 2 years ago
I'm not sure what your requirements are, but I've had a good amount of success with converting Node.js libraries to native libraries by embedding a CommonJS module into the binary, then running the actual code through QuickJS. Much smaller binaries.

If you really are pressed for space, you could use upx, or store 7z compressed code and embed the 7z library to decompress before passing it along to QuickJS.

Here's a proof of concept: https://github.com/ijustlovemath/jescx

dns_snek · 2 years ago
Isn't QuickJS order(s) of magnitude slower than V8? That doesn't seem like a practical tradeoff to make outside of embedded.
littlestymaar · 2 years ago
> Deploying ala Golang is not feasible at that level but could well be down the line if this dev branch is picked up again!

While I agree with you that it's not optimal and should get fixed, 90MB doesn't sound like it can stop you from deploying either.

tracker1 · 2 years ago
It adds up though... Personally, I'm fine with Deno in the path and scripts with a shebang in practice.

If each of my scripts were a separate build output from Deno, that would be several GB of space.

ecmascript · 2 years ago
When I download a modern game it's like 700GB, so I dunno why people complain about 100MB self-contained deploys for JavaScript. Most of it is the internationalization libraries anyway.

I find it pretty ridiculous, since I go to a website today and it's at least 15MB each time I refresh, but 100MB on the server is a problem? Dude, c'mon.

atraac · 2 years ago
So the fact that everything around us is poorly made forbids him from complaining that a certain solution is as bad as everything else? Shouldn't we criticize and point out stuff that is bad, no matter whether it's as bad as something else? It's because of attitudes like yours that we get 15MB of JavaScript on every website, 500GB games, and UIs that take seconds to load.
0x457 · 2 years ago
Modern games aren't 700GB; there are a very few very large games. One of them is large on purpose, to make you delete other games.

Updates for games are large because they aren't "just updated files", but a giant blob that can't be rearranged since it's optimized for loading order later.

Giant self-contained deploys like that are bad because their use case is CLI/GUI tools shipped to the end user. To give you context: my /usr/bin has ~900 binaries with a total size of 109MB.

declaredapple · 2 years ago
100MB at 1Gbps is ~0.8s to load. This especially sucks for FaaS and CI/CD, and makes it difficult to use for on-demand use cases that are latency sensitive.
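
A quick sanity check of that number (sizes and link speed taken from the comment, nothing else assumed):

```typescript
// Back-of-envelope transfer time for a binary over a network link.
function transferSeconds(sizeMB: number, linkGbps: number): number {
  const bits = sizeMB * 1e6 * 8;   // megabytes -> bits
  return bits / (linkGbps * 1e9);  // divide by line rate in bits/s
}

console.log(transferSeconds(100, 1)); // 0.8, i.e. the ~0.8s above
```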

On top of that, it's likely 100MB that's going to be competing for memory*; in FaaS environments especially, that's a ton of resources, and you increase the latency to load it, especially over network storage. This also inflates container sizes and just becomes a pain in general.

*In server environments you often don't want to be using swap, and the OS will likely consider it high priority anyway.

I could only agree with you if you have a single monolithic server, don't use containers, don't need to distribute it to others, and don't frequently need to run a CI/CD pipeline. Only then are there not a ton of downsides to a 10mb vs 100mb executable.

> When I download a modern game it's like 700GB so I donno why people complain about 100mb self contained deploys for javascript.

People have been complaining about game sizes for years. "You complain about a 100GB game, I don't know why you also complain about a 100mb self-executable for a javascript runtime" doesn't really make sense.

konart · 2 years ago
You download it to your game rig, not some cloud solution with limited resources that you pay for. Plus you may want to have more than one instance of the service running.
surajrmal · 2 years ago
This is whataboutism. Different sizes are acceptable for different people based on context. I worry about 10s of kilobytes for things I work on for instance.
xtreme · 2 years ago
There are size limits when deploying on serverless or edge infrastructure so developers have to care about that. The providers also typically charge by compute seconds * memory consumed so a larger executable costs real money as well.
pachico · 2 years ago
I'm not a big fan of JavaScript, but I admit I stayed away from it because I terribly dislike Node.js and npm.

I was forced to start coding again in JS some weeks ago and I wanted to try Deno. I must say it's been a very smooth and fast experience so far. Very well done!

nonethewiser · 2 years ago
Can you comment on what you prefer about it? I find npm/js pretty smooth and the rough edges of Deno seem to kill the purported improvements at this point. That was just my gut-take several months ago and I was already steeped in the node/npm ecosystem so I'm curious about your perspective.
tracker1 · 2 years ago
No Byzantine build config files, as a start. Dependencies are downloaded to a shared location outside your project on demand, instead of via a separate npm install step.

TypeScript and esm/cjs usage without crazy syntax (writing modules, consuming cjs at least).

Linting and formatting in the box.

Can do shell scripts without package.json and npm install, just a shebang line at the top.

These are some of the things I like better.

pachico · 2 years ago
For starters Deno is much, much faster in both installation and runtime.

I am not sure if I'll have the chance to exploit other features it has (besides the built-in dotenv support).

Not so many years ago, even installing npm wasn't straightforward.

DanielHB · 2 years ago
NPM used to be really bad due to the lack of lockfiles and how it handled diamond dependencies, added to the propensity of JS projects to have a ton of deps...

It has mostly been sorted out by all package managers. The node_modules debacle is still a hotly contested topic, it creates a lot of problems, but it also solves a lot of them compared to alternative approaches.

Then you have install performance which is mostly fine by now in all package managers, but if you really have problems with it you can use pnpm or yarn2.

As the Python ecosystem grows and dependency trees move away from "Django only", you can see them having the same types of problems that JS used to have.

Quothling · 2 years ago
We use TypeScript for virtually everything at my place of work. Not so much because it's a great language for a lot of the backend (we do use some C++ for bottlenecks), but because it's so much more productive to use a single language when you're a small team. Not only can everyone help each other, we can also share resources between the front end and the back end, and we have several in-house libraries to help us with things like OData querying and APIs, since there aren't really any public packages that are in any way useful for that. I guess we probably should've gone with something other than OData, but since we use a lot of Microsoft Graph APIs, which despite their name are OData, it made sense at the time.

We don't have trouble with Node or NPM, and when we onboard new people, they tend to step right into our setup and like it. Granted, we've done some pretty extensive and very opinionated skeleton projects that you have to use if you want to pass the first steps in our deployment pipeline. This took a little bit of effort, and it's still a team effort to keep our governance up to something we all agree on, but with those in place, I've found it's genuinely a nice way to work.

An example of how strict we are: you can't have your functions approved without return types, and you certainly can't change any of the linting or TypeScript configs. Similarly, you can't import third-party NPM packages which aren't vetted first, and we have no auto-update on third-party packages without a two-week grace period and four sets of human eyes. I'm not going to pretend that all of these choices are in any way universal, but it's what we've come together and decided works for us.

Anyway, you're certainly not alone in your opinion, but I think a lot of the bad reputation of Node and NPM comes from the amount of "change management" you need to do to "limit" the vast amount of freedom the tools give you into something that will be workable for your team. Once you get there, however, I've found it to be much nicer to work with than things like NuGet and dotnet, requirements.txt and Python, Cargo and Rust, and a lot of others. I do have a personal preference for yarn, but I guess that's mostly for nostalgic reasons. I also very much appreciate that things like PyPI are going down something similar to the NPM route.

mattferderer · 2 years ago
Curious to hear your opinion on ODATA after using it on your projects. Pros, cons, anything.
wldlyinaccurate · 2 years ago
I deployed my first non-trivial Deno app to production in 2023. There were some teething issues with learning to keep the lock file in sync, especially in a repo with multiple entry points each with separate lock files. Some of the granular permissions stuff didn't work how I expected, to the point where I almost gave up and just allowed network to/from all hosts. But overall the experience was good, and I have positive feelings towards Deno. I look forward to seeing where they take it.
tracker1 · 2 years ago
I tend to put a shebang line on Deno scripts intended to be run... This has the permissions right there in the script.

Alternatively, like in my ~/bin, I'll put a shell script without a file extension to make it easier on myself.

olestr · 2 years ago
You can join the waitlist in the sneak peek of JSR linked at the end of the article: https://jsr.io/waitlist

I'm curious what the Deno team is building here.

throwitaway1123 · 2 years ago
Like others have said, it's going to be a new package registry. It was unofficially announced at SeattleJS Conf 2023: https://www.youtube.com/watch?v=Dkqs8Mcxbvo&t=424s
webdood90 · 2 years ago
I didn't see any major reason why they were creating another registry.

I thought their whole thing was to maintain backwards compatibility, not introduce new, redundant standards?

vwkd · 2 years ago
Here's the specific talk https://youtube.com/watch?t=822&v=dipwQfcV0AU (CascadiaJS channel) or https://youtube.com/watch?t=822&v=5DX49vzLfqw (Deno channel)
cdoremus · 2 years ago
Here's what they have to say on the JSR site:

Why Choose JSR?

A True TypeScript-First Environment: Efficient type checking and no transpilation hassles—write in TypeScript and deploy directly.

Performance and Usability at the Forefront: With integrated workspaces and seamless NPM integration, JSR puts usability first.

Secured and Accessible Modules: All modules in JSR are exposed over HTTPS, ensuring your code is always secure.

Open Source, Community-Driven: Built by developers, for developers, JSR is shaped by the real-world needs and contributions of the JavaScript community.

esperent · 2 years ago
JSR = JavaScript Registry (from the site). It seems fairly clear this is a package registry, i.e. an NPM alternative.
simultsop · 2 years ago
But they claimed they would never need one, to the extent that they would never build one. There was only deno.land as a place to discover libraries and what the community builds.
rapnie · 2 years ago
Hmm, curious too. Maybe, given the few hints, an alternative to npmjs.com registry.
andyferris · 2 years ago
I assume it's an npm competitor? Likely with a different technical design.
kebman · 2 years ago
"Jupyter, the open source notebook tool, added support for JavaScript and TypeScript using Deno. This means data science, visualization, and more can all be done using modern JavaScript and TypeScript and web standards APIs."

I love this! At the same time, who would want to do this, given Python's excellent support for numbers and mathematics? And what about Haskell?
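
That said, a lot of light data work is perfectly pleasant in a TS cell; for instance (numbers invented for illustration):

```typescript
// Hypothetical notebook cell: quick summary stats over a sample,
// the sort of thing usually reached for in Python.
const samples = [3, 1, 4, 1, 5, 9, 2, 6];
const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
const peak = Math.max(...samples);

console.log({ mean, peak }); // { mean: 3.875, peak: 9 }
```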

inbx0 · 2 years ago
The people who already know JS/TS and would like to occasionally do something interesting with a piece of data. With Python, most of my time goes into googling how that list filtering / mapping syntax went again or some other basic level stuff that I do every day with JS. I know Python, but I'm not fluent in it. And I will likely never be fluent in it, because my Python use cases are so infrequent.
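
For example, the chained array style that's muscle memory for JS/TS folks (sample data invented):

```typescript
// Filtering and mapping with plain array methods, no comprehension
// syntax to look up: keep values over 20, then double them.
const orders = [12, 55, 7, 99, 30];
const bigDoubled = orders
  .filter((n) => n > 20)
  .map((n) => n * 2);

console.log(bigDoubled); // [ 110, 198, 60 ]
```
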
skybrian · 2 years ago
Maybe you want to share examples of how to do things in TypeScript using a notebook? Although, Observable [1] is another way to do that, for JavaScript at least.

[1] https://observablehq.com/

maelito · 2 years ago
Simply because using the same language for both Web and server is incredible.
tracker1 · 2 years ago
Easier dependencies.
apatheticonion · 2 years ago
Deno is such a great project. I would love to see greater support for embedding it into a Rust host process.

I'm writing a JavaScript bundler and need a Node.js runtime to execute plugins. Deno's executable has fantastic Node support (at least, good enough for my use case), however the deno_core crate is super barebones and difficult to embed.

At this stage I can't simply add the deno runtime into my Rust application, I need to copy/paste internal crates from the Deno executable and wire them up myself (without documentation on how).

I'd love to see expansion for my use case - Deno could become the "plugin runtime" for the JS tooling world if it had a nice embed story.

Right now I am just going with a Nodejs child process that I talk to from the Rust host process using stdio. In my tests, the stdio approach has 10x the communication latency when compared to an embedded Deno runtime (that adds ~1 second per round trip message in a project with 100k assets)
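
For anyone curious, the plugin side of such a stdio bridge can be sketched like this (the message shape and hook names are invented, not Deno's or any bundler's actual protocol):

```typescript
// Hypothetical wire format: one JSON message per line over stdin/stdout.
// The Rust host writes a request line and reads back a response line.
interface PluginRequest { id: number; hook: string; payload: unknown }
interface PluginResponse { id: number; result: unknown }

// Newline-delimited framing keeps message boundaries trivial to parse.
function encode(msg: PluginResponse): string {
  return JSON.stringify(msg) + "\n";
}

// A real plugin would dispatch on req.hook; this just echoes the payload.
function handle(line: string): string {
  const req: PluginRequest = JSON.parse(line);
  return encode({ id: req.id, result: req.payload });
}
```

Every round trip pays serialization plus pipe latency on both sides, which is where the 10x gap versus an in-process embedded runtime comes from.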

rtcode_io · 2 years ago
Deno Deploy dropped from 35 GCP regions to just 12: https://news.ycombinator.com/item?id=39127598
vorticalbox · 2 years ago
Sure, but is that a bad thing? These regions probably see next to zero use, and cutting them frees up money to be used on other things.
qprofyeh · 2 years ago
They’re operating like a startup, which, like you say, should be perfectly fine. You can’t pretend to be a giant when you don’t have the bank account for it.

Though they’re also selling a developer ecosystem. A lock-in. Are you willing to bet your company’s own software tools on a vendor that could go bankrupt from their other (hosting) business? The question is if Deno hosting dies, will Deno as a platform still thrive?

re-thc · 2 years ago
> Sure, but is that a bad thing? These regions probably see next to zero use

If it's a startup looking for growth, then definitely. Future prospects may not sign up because the region they need isn't available.

maeil · 2 years ago
If anyone from Supabase is reading here, your main marketing page still says Edge Functions run on 29 regions. Since they run off of Deno Deploy, it seems like this needs updating.
kiwicopple · 2 years ago
thanks - will update. It is actually hosted on our own side but I don’t think we are in 29 regions and clearly forgot to update this when we migrated. I’ll figure out the exact number and create a PR