aazaa · 6 years ago
> ... Deno is (and always will be) a single executable file. Like a web browser, it knows how to fetch external code. In Deno, a single file can define arbitrarily complex behavior without any other tooling.

> ...

> Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission. The browser provides APIs for accessing cameras and microphones, but users must first give permission. Deno provides analogous behaviour in the terminal. The above example will fail unless the --allow-net command-line flag is provided.

The Deno feature that seems to draw the most fire is dependency management. Some skeptics may be latching onto the first point without deeply considering the second.

Deno is just doing the same thing a browser does. In principle, there's nothing that JavaScript running in Deno's sandbox can do that it couldn't do in a browser. So the security concerns seem a little over the top.

The one caveat is that once you open the sandbox on Deno, it appears you open it for all modules. But then again, that's what NPM users do all the time - by default.

As far as criticisms around module orchestration, ES modules take care of that as well. The dependency graph forms from local information without any extra file calling the shots.

This seems like an experiment worth trying at least.

bgdam · 6 years ago
See the thing about the sandbox is that it's only going to be effective for very simple programs.

If you're building a real world application, especially a server application like in the example, you're probably going to want to listen on the network, do some db access and write logs.

For that you'd have to open up network and file access pretty much right off the bat. That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.

danShumway · 6 years ago
> That combined with the 'download random code from any url

What protection does NPM actually give you?

Sure, they'll remove malware as they find it, but it is so trivially easy to publish packages and updates to NPM, there effectively is no security difference between an NPM module and a random URL. If you wouldn't feel comfortable cloning and executing random Github projects, then you shouldn't feel comfortable installing random NPM modules.

> and run it immediately

NPM packages also do this -- they can have install scripts that run as the current user, and they have network access that allows them to fetch, compile, and execute random binaries off the Internet.
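
For anyone unfamiliar: the hook is just a field in package.json, which npm executes automatically at install time (the package name and script here are made up):

```json
{
  "name": "innocent-looking-package",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "node ./fetch-and-run.js"
  }
}
```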

From a security point of view, Deno is just making it clear up-front that you are downloading random code snippets, so that programmers are less likely to make the mistake of trusting a largely unmoderated package repository to protect themselves from malware.

I lean towards calling that a reasonably big security win on its own, even without the other sandboxing features.

jeswin · 6 years ago
> That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.

What deno does is move package management away from the framework distribution. This is great - one thing I hate about node is that npm is the default and you get only as much security as npm gives you. (You can switch the npm repo, but it's still the overwhelming favourite because it's officially bundled.)

Deno can eventually give you:

  import lib from 'verified-secure-packages.com'
  import lib from 'packages.cloudflare.com'
So you'll be able to pick a snippet repository based on your risk appetite.

half-kh-hacker · 6 years ago
Both the network and disk access permissions are granular, which means you can allow-write only to your logs folder, and allow net access only to your DB's address.
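
Concretely, a server following that principle might be launched with something like this (hostnames and paths are illustrative; the allowlist forms of `--allow-net` and `--allow-write` are what the Deno manual documents):

```shell
# Illustrative only: grant write access solely to ./logs and network
# access solely to the database host. Any other file write or outbound
# connection fails with a permission error.
deno run --allow-write=./logs --allow-net=db.example.com:5432 server.ts
```
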
teleclimber · 6 years ago
> For that you'd have to open up network and file access pretty much right off the bat.

For the network access I have an issue[0] open that asks to separate permissions to listen on the network from permission to dial. Also, along the way I want to have the ability to let the OS pick the port as an option.

Permissions are meant to work by whitelisting. So you wouldn't open access to the whole system just to talk to your DB, or to operate on some files.

[0] https://github.com/denoland/deno/issues/2705

ricardobeat · 6 years ago
Maybe this will develop into a standard of multi-process servers (real micro services you could say), where the permissions are only given to a slice of the application.
mrkurt · 6 years ago
Sometimes it's ok to think "this project isn't for me" and just leave it be. The cynical-security-concern act is boring.
littlestymaar · 6 years ago
For the use-case you describe, you're just going to need network access: no file access and no process forking needed, which is a big reduction in attack surface.

Moreover, I don't know how granular the network permission is, but if its implementation is smart, you could block almost all outbound network access except connections to your DB and the few APIs you may need to contact.

fao_ · 6 years ago
> means it's going to be much less secure than the already not-that-secure NPM ecosystem.

I have only the bare minimum of like, experience with nodejs. Would you mind fleshing out why that is so?

kybernetikos · 6 years ago
> For that you'd have to open up network and file access pretty much right off the bat.

I think that overall you're right, but it's worth noting that deno can restrict file system access to specific folders and can restrict read and write separately. It's plausible to me that you could have a web server that can only access specific folders.

_hl_ · 6 years ago
I don't think running a public web server application is one of the envisioned use cases here. It looks like a tool for quickly and dirtily getting some job done. But I agree that to get something useful done, you probably need to open up a bunch of permissions, so you're still running arbitrary code on your machine.
nine_k · 6 years ago
It's always a good idea to run in a container, which limits the ports you can listen on, directories allowed for writing and reading, and can have its own firewall to limit outgoing connections.

If you don't need the firewall, you can just run in a chroot under a low-privilege user.

I mean, if you do otherwise, you are not following best practices and the voice of reason.

skybrian · 6 years ago
The manual looks pretty sketchy, but it seems you can limit file access to certain files or directories and that could be used to just give it access to the database and log files.
diffrinse · 6 years ago
Looking at the flags, one can envision future updates providing flags for scoping to directories, PIDs, domain/IP ranges
amolo · 6 years ago
I don't think that's very accurate. You really need to go watch the first Deno video made by Ryan Dahl at JSConf.
TACIXAT · 6 years ago
If I am building a real world application I'm going to vet the libraries I use.
epr · 6 years ago
Simple solution to the dependency management (spitballing): a directory of files where the filename is the module name. Each file is simply:

  <url><newline><size in bytes><newline><hash>
And then in an application:

  import { serve } from deno.http.server;
If you want nested levels so deno.X.X wouldn't be a ton of files you could possibly just do nested directories so deno/http/server would equate to deno.http.server.

Most people would want the option to do dependency management on a per-project basis as well. Simply allow a command-line parameter to provide one or more other directories to source from first (besides presumably the global one for your installation).

If we wanted the file to be generated automatically, maybe something like this:

  import { serve } from deno.http.server at "https://deno.land/std@0.50.0/http/server.ts";
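
The lookup for such a scheme would be only a few lines. A sketch in TypeScript (the file format and names follow the proposal above; nothing here is a real Deno API):

```typescript
// Each registry file contains: <url>\n<size in bytes>\n<hash>
interface ModuleRecord {
  url: string;
  size: number;
  hash: string;
}

// Parse one registry file's contents into a record.
function parseModuleRecord(contents: string): ModuleRecord {
  const [url, size, hash] = contents.trim().split("\n");
  if (!url || !size || !hash) {
    throw new Error("malformed module record");
  }
  return { url, size: Number(size), hash };
}

// Map a dotted module name like "deno.http.server" to its registry
// path, e.g. <root>/deno/http/server, per the nested-directory idea.
function registryPath(root: string, name: string): string {
  return root + "/" + name.split(".").join("/");
}
```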

SahAssar · 6 years ago
Until someone thinks that it should follow redirects which probably leads to the same thing that got apt: https://justi.cz/security/2019/01/22/apt-rce.html

Not saying that makes it a bad idea, but importing/downloading trusted code over http(s) is not simple even if the protocol sorta is.

peartips · 6 years ago
> This seems like an experiment worth trying at least.

Yup! I am really excited about Deno and curious about how popular it will be in a few years.

VWWHFSfQ · 6 years ago
I guess I'm wondering why Deno is targeting V8 instead of Servo? Maybe I'm mistaken, but Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

[0] https://servo.org/

[1] https://wiki.mozilla.org/Quantum/Stylo

dralley · 6 years ago
>Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

Servo is absolutely not production ready. A couple of particular pieces of Servo, such as Stylo and WebRender, can be considered production-ready, but not so much the project as a whole.

Gaelan · 6 years ago
Servo uses Firefox's SpiderMonkey, which is written in C++, as its JavaScript implementation.
Cldfire · 6 years ago
Servo is an experimental project designed to build and test components that can be integrated into Firefox. It relies on SpiderMonkey for JS.
petercooper · 6 years ago
If you're getting into Deno and want to keep up with new stuff from the ecosystem on a regular basis, we're now publishing https://denoweekly.com/ .. issue 2 just went out minutes after the 1.0 release. I've been doing JavaScript Weekly for 487 issues now, so this is not a flash in the pan or anything :-D

Of course, Deno has an official Twitter account as well at https://twitter.com/deno_land :-)

ArtWomb · 6 years ago
I suppose .land is the new .dev now ;)

Am curious how Parallelism could be handled in the runtime? Besides exposing WebWorkers, would shared memory be a possibility? V8 looks like its heading toward a portable WebAssembly SIMD accelerator.

>>> Promises all the way down

Async / await is a great pattern for render loops by resolving to continuous window.requestAnimationFrame calls. Here is a nifty example computing a Buddhabrot and updating new data quite smoothly:

http://www.albertlobo.com/fractals/async-await-requestanimat...

lenkite · 6 years ago
Root certificate not trusted for https://denoweekly.com/ on both chrome and firefox.
gpm · 6 years ago
Maybe they fixed this in the last 2 hours, but it works for me (firefox, linux).
petercooper · 6 years ago
Hmm, interesting! Thanks for the report. I just ran it through Qualys SSL Labs and everything passed. (We got a B though because we still support TLS 1.1.)

It's a multi-domain Let's Encrypt X3 certificate and I believe most LE users will be in a similar boat now.

notjustanymike · 6 years ago
Keep up the good work, JS weekly is a wonderful resource.
wintorez · 6 years ago
Good to see you here!
swyx · 6 years ago
right on top of it Peter! nice
petecoop · 6 years ago
Great :)
nojvek · 6 years ago
> TSC must be ported to Rust. If you're interested in collaborating on this problem, please get in touch.

This is a massive undertaking. TSC is a moving target. I occasionally contribute to it. It’s a fairly complex project. Even the checker + binder (which is the core of TS) is pretty complex.

One idea that comes to mind is to work with the TypeScript team so that tsc uses only a subset of JS, such that it can be compiled down to WebAssembly and LLVM can spit out a highly optimized binary. This not only benefits Deno, but the rest of the internet.

TSC has done some great architectural changes in the past like doing mostly functional code, rather than lots of classes.

The target we should be aiming for is a powerful typed language like TypeScript that compiles very quickly to WebAssembly and can run in guaranteed sandbox environments.

torb-xyz · 6 years ago
There already exists an experimental compiler that takes a subset of TypeScript and compiles it to native[1]. It might be able to target wasm instead of asm.

Also: If I'm not entirely mistaken, Microsoft initially planned to have a TypeScript-specific interpreter in Internet Explorer. This also might indicate that something like that could be possible.

1: https://www.microsoft.com/en-us/research/publication/static-...

Tade0 · 6 years ago
I wonder how possible it would be to just use this:

https://github.com/swc-project/swc

It's still not feature-complete, but there aren't any alternatives written in Rust that I know of.

a_humean · 6 years ago
SWC does not do any typechecking. It is equivalent to babel.
WorldMaker · 6 years ago
This does seem like a dangerous side-path unrelated to the Deno project's needs.

From the description, it doesn't sound like Deno needs the type information for V8 optimizations (I thought they had explored that, but I don't recall, and the description here is unclear), so maybe switch to more of a two-pass system: a simple-as-possible "type stripper" (like Babel's, perhaps) for execution, leaving tsc compilation for type checking as a separate background process. Maybe behind some sort of "production mode" flag, so that type errors stop debug runs but in production you assume you can strip types without waiting for a full compile?

Maybe even writing a type stripper in Rust isn't a bad idea, but definitely trying to capture all of tsc's functionality in Rust seems like a fool's errand.

nojvek · 6 years ago
Typescript already has transpile-only mode that lets it run without performing those checks and just emit.

I use it with ts-node all the time for my docker images that require fast startup.

node -r ts-node/register/transpile-only xyz.ts

grandinj · 6 years ago
v8 has the ability to snapshot a program just after it loads, but before it executes. If you snapshot after doing some kind of warmup, to trigger the right optimisations, you get something that should fire up ready to go, which is probably the main problem - the compiler being repeatedly invoked and parsed from javascript and compiled on the fly.
z3t4 · 6 years ago
One problem with taking the v8 snapshot and using it as a binary executable is that it will probably be much slower than running v8 live. Although the startup time will be 10x faster, the runtime will be 10x slower.
WorldMaker · 6 years ago
The notes here mention that V8 snapshots also didn't provide the speed-up/optimization Deno was hoping for.
karyon · 6 years ago
There is https://github.com/AssemblyScript/assemblyscript. It's not using llvm, but it's compiling a subset of typescript to webassembly.
baxuz · 6 years ago
I see the sass / node-sass fiasco all over again...
leetrout · 6 years ago
This is referring to lib-sass being in C?
bgdam · 6 years ago
The dependency management is highly questionable for me. Apart from the security concerns raised by others, I have huge concerns about availability.

In its current form, I'd never run Deno in production, because dependencies have to be loaded remotely. I understand they are fetched once and cached, but that will not help me if I'm spinning up additional servers on demand. What if the website of one of the packages I depend on goes down, just as I have a huge spike in traffic?

Say what you want about Node's dependency management, but at least I'm guaranteed reproducible builds, and the only SPOF is the NPM repositories, which I can easily get around by using one of the proxies.

fsloth · 6 years ago
Why can't you download all the packages you use actually with your source code? That's how software has been built for decades...

I'm a desktop developer so I understand I'm the dinosaur in the room but I've never understood why you would not cache all the component packages next to your own source code.

Since this is straightforward to do, I presume there is some tradeoff I've not thought about. Is it security? Do you want to get the latest packages automatically? But isn't that a security risk as well, as not all changes are improvements?

Cthulhu_ · 6 years ago
For Node, the main tradeoff is number and size of files. Usually the distribution of a node module (that which is downloaded into node_modules) contains the source, documentation, distribution, tests, etc. In my current project, it adds up to 500MB already.

They would do well to have an option to optimize dependencies for vendoring.

softawre · 6 years ago
You're right. We call this "vendoring" your dependencies. And it's a good way to do things.
emerongi · 6 years ago
You can commit your node_modules folder into your repository if you'd like.
IshKebab · 6 years ago
That is exactly what NPM does.
progx · 6 years ago
So build your own npm?
batmansmk · 6 years ago
Hi! The response to your fears is in the announcement: "If you want to download dependencies alongside project code instead of using a global cache, use the $DENO_DIR env variable." Then it will work like node_modules.
bgdam · 6 years ago
Ah, in this case, I would then have to commit my dependencies into my VCS to maintain reproducible builds. I'm not sure I like that solution very much either. I've seen node_modules in multiple GBs, and I'm sure Deno's dependency sizes are going to be similar.
fermienrico · 6 years ago
I think the primary way to manage dependencies should be in a local DIR and optionally, a URL can be specified.

The default in Deno is a questionable choice. Just don't fuck with what works. The default should be safest, followed by developers optionally enabling less safe behaviors.

thayne · 6 years ago
That might work for some projects, but can quickly blow up the size of the repo.

I don't think it is an unsolvable problem. For example, other solutions could be using a mirror proxy to get packages instead of fetching directly from the source, or pre-populating the deno dir from an artifact store. It would be nice to have documentation on how to do those, though.

ryanseys · 6 years ago
It's just a URL right? So could you not mirror the packages to your own server if you're so concerned, or better yet import from a local file? Nothing here seems to suggest that packages must be loaded from an external URL.
bgdam · 6 years ago
> or better yet import from a local file

And this is different from NPM how? Except that I've now lost all the tooling around NPM/Yarn.

PudgePacket · 6 years ago
Then you either vendor as others have said, or use the built in bundle tool to produce a single js file of all your code including dependencies.

https://deno.land/manual/tools/bundler

CGamesPlay · 6 years ago
Several responses to your concern but all of them seem to say "you can cache dependencies". How does Deno find dependencies ahead of runtime? Does Deno not offer dynamic imports at all?

If I have an application that uses, say, moment.js and want to import a specific locale, typically this is done using a dynamic import.

merpnderp · 6 years ago
You'd deploy builds that include all the dependencies. This isn't Node, where you have to download all deps on your production server; they are built into your deployed files.
bananabreakfast · 6 years ago
Ever used Go? Not that different.
stryan · 6 years ago
Go supports "vendor" folders for storing dependencies locally after initial download. That combined with Go Modules means you can handle everything locally and (I believe) reproducible.
29athrowaway · 6 years ago
Reproducible builds, sure. Security? that's a different story.

https://github.com/ChALkeR/notes/blob/master/Gathering-weak-...

- node ships with npm

- npm has a high number of dependencies

- npm does not implement good practices around authentication.

Can someone compromise npm itself? probably, according to that article.

Deleted Comment

Aeolun · 6 years ago
How is this different from requiring npm to be up to install packages?
didip · 6 years ago
I like what Deno is selling. URL-based import paths are great; I don't know why people are dismissing them. It is easy to get up-and-running quickly.

Looks like my personal law/rule is in effect again: The harsher HN critics are, the more successful the product will be. I have no doubt Deno will be successful.

gintery · 6 years ago
Your law is hilarious. I tend to check the comments before reading a post: if they say the idea is terrible, I know I should read it.
sradman · 6 years ago
The GoLang-like URL import and dependency management are indeed an innovation in simplicity while simultaneously offering better compatibility with browser JavaScript.

Perhaps the HN-hate is not about simplified greenfield tech as much as it is about breaking established brownfield processes and modules.

Aldo_MX · 6 years ago
My concern is what happens when popular-library.io goes down or gets hacked?

Or how about attack vectors like DNS poisoning? or government-based firewalls?

I know there's this[1], but somehow I still feel uneasy because the web is so fragile and ephemeral...

At the very least I would like to have the standard library offline...

[1] https://github.com/denoland/deno/blob/master/docs/linking_to...

mekster · 6 years ago
> what happens when popular-library.io goes down or gets hacked?

What is anyone going to do about it? Anything has a chance of getting hacked or goes down just when you need it, be it GitHub, npmjs.org...

Blaming the tool for not having a protection against DNS poisoning is a bit far fetched.

dbrgn · 6 years ago
You can always download the scripts and host them yourself, right?
TechBro8615 · 6 years ago
Do you also share these concerns about golang? Isn’t it basically the same system?
kreetx · 6 years ago
This seems to be the case, yes. It's like the critics unconsciously know it's better, and that is where their energy comes from.
mekster · 6 years ago
More like a "there can't be better stuff than what I'm accustomed to and like" feeling.
oblio · 6 years ago
> It is easy to get up-and-running quickly.

Almost all successful, mainstream techs are like that. From a purely technical perspective, they are awful (or were awful at launch); they were just adopted because they were easy to use. When I say awful, I mean for professional use in high impact environments: financial, healthcare, automotive, etc.

Examples: VB/VBA, Javascript, PHP, MySQL, Mongo, Docker, Node.

Few people would argue that except for ease of use and ubiquity, any of these techs were superior to their competitors at launch or even a few years after.

After a while what happens is that these techs become entrenched and more serious devs have to dig in and they generally make these techs bearable. See Javascript before V8 and after, as an example.

A big chunk of the HN crowd is high powered professional developers, people working for FAANGs and startups with interesting domains. It's only normal they criticize what they consider half-baked tech.

tolmasky · 6 years ago
Forget the (reasonable) security and reliability concerns people have already brought up with regard to importing bare URLs. How about just the basic features of dealing with other people's code: how am I supposed to update packages? Do we write some separate tool (but not a package management tool!) that parses out import URLs, increments the semver, and... cURLs to see if a new version exists? Like if I am currently importing "https://whatever.com/blah@1.0.1", do I just poll to see if "1.0.2" exists? Maybe check for "2.0.0" too just in case? Is the expectation that I should be checking the blogs of all these packages myself for minor updates? Then, if you get past that, you have a huge N-line change where N is every file that references that package, and thus inlines the versioning information, instead of a simple and readable one-line diff that shows "- blah: 1.0.1, + blah: 1.0.2".
emerongi · 6 years ago
Another thing is that with package.json every dependency can say which versions of its dependencies it works with. This lets you update a dependency that is used by other dependencies and only have a single version (most up to date) of it. Some package managers also let you completely override a version that one of your dependencies uses, allowing you to force the use of a newer version.

With Deno, both of these use cases seem way harder to satisfy. None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency", instead they link to a hardcoded URL. The best chance of overriding or updating anything is if Deno supports a way to shim dependencies, but even then you might need to manually analyze your dependency tree and shim a whole bunch of URLs of the same library. On top of that, as soon as you update anything, your shims might be out of date and you need to go through the whole process again. To make the whole process easier, Deno could compare checksums of the dependencies it downloads and through that it could show you a list of URLs that all return the same library, but this would be like the reverse of package.json: instead of centrally managing your dependencies, you look at your dependencies after they have been imported and then try to make sense of it.
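
That "reverse package.json" at the end could be sketched like this (pure illustration; the checksums are assumed to come from whatever hash Deno's cache already records for each download):

```typescript
// Given the checksum of each downloaded module, group URLs that
// returned identical contents, so you can see which imports are
// really the same library fetched from different places.
function groupByChecksum(downloads: Map<string, string>): Map<string, string[]> {
  // downloads: url -> checksum of the fetched contents
  const groups = new Map<string, string[]>();
  for (const [url, checksum] of downloads) {
    const urls = groups.get(checksum) ?? [];
    urls.push(url);
    groups.set(checksum, urls);
  }
  return groups;
}
```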

pknopf · 6 years ago
> None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency"

That's a real problem that needs to be solved.

Also, what happens when two libs, A and B, depend on different versions of lib C? Does each get its own scoped instance of C?

mrkurt · 6 years ago
The Deno solution is either:

* A deps.ts that handles all external dependencies and re-exports them

* Import maps

Neither of these really give you a way to do the equivalent of "npm update". But I almost never want to update all my packages at once.
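
For reference, the deps.ts convention is just an ordinary module that centralizes the URLs, so a version bump is a one-line diff again. A minimal illustration (the URL and version are only examples):

```typescript
// deps.ts -- the only place a version string appears.
export { serve } from "https://deno.land/std@0.50.0/http/server.ts";

// Elsewhere in the app, every file imports via the local module:
// import { serve } from "./deps.ts";
```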

tolmasky · 6 years ago
You don’t like checking for security updates?
mileycyrusXOXO · 6 years ago
I prefer a unix approach where each tool does a single thing.

The runtime does runtime things and someone can build a package manager to do package manager things.

The benefit of having runtime + package management bundled into one is you have an opinionated and standard way of doing things - the downside is if the package manager stinks then the whole ecosystem stinks.

mekster · 6 years ago
Just come up with a convention like host a file called version.ts that lists all available versions. Brute forcing for available versions sounds dumb.
tolmasky · 6 years ago
Yes, that is of course the point. There's an infinite number of possible later versions to check for. The suggestion to poll for new versions using cURL was sarcastic. These "conventions" you speak of actually get handled by... package managers! If not, everyone who hosts a package needs to "know" about magic files and make sure they follow a spec that isn't enforced by anyone and doesn't break immediately, but rather much later, when someone tries to update. It's like everyone managing their own database row with an informally agreed-upon database schema.
c-cube · 6 years ago
Maybe a convention will arise, that you do all the imports in one file (basically like a `package.json` file) and import that from the rest of your code? It seems hackish to me but could work.
CGamesPlay · 6 years ago
This is explicitly listed as best practice by Deno [1], but it doesn't handle the updating problem at all.

[1] https://deno.land/manual/linking_to_external_code#it-seems-u...

austincheney · 6 years ago
You have to stop thinking in terms of NPM, where it takes 1000000000 packages to do anything. A Deno application is designed to be distributed as a single file. You can override the default behavior to have NPM-like stupidity, but if that is really your goal, why bother moving to Deno in the first place?
tolmasky · 6 years ago
Forget 10000000 packages. Many languages often make use of 10s of packages. If I have several projects, each with around 10 packages, and no automated way to just check if all my projects’ respective dependencies have security updates that could be applied, it seems to go against the stated security goal.

Separately I’m not sure what is enforcing this “small dependency graph” aside from making it hard to import things I guess. I wouldn’t be surprised if you end up with the normal behavior of people coming up with cool things and other people importing them.

seleniumBubbles · 6 years ago
Congratulations on the 1.0 release! I've been using Deno as my primary "hacking" runtime for several months now, I appreciate how quickly I can throw together a simple script and get something working. (It's even easier than ts-node, which I primarily used previously.)

I would love to see more focus in the future on the REPL in Deno. I still find myself trying things in the Node.js REPL for the autocomplete support. I'm excited to see how Deno can take advantage of native TypeScript support to make a REPL more productive: subtle type hinting, integrated tsdocs, and type-aware autocomplete (especially for a future pipeline operator).

indemnity · 6 years ago
Seconded, a Deno TS REPL would be amazing, but they probably have a few bigger fish to fry yet :)
IggleSniggle · 6 years ago
> bigger fish to fry

> fish

I see what you did there, and I approve.

damagednoob · 6 years ago
I evaluated replacing ts-node with Deno, but if I use -T and install ts-node globally, that seems equivalent to Deno to me.

I think stepping outside the npm ecosystem is going to be a bigger issue than people think.

Pyrodogg · 6 years ago
Repl.it recently announced a Deno REPL https://repl.it/languages/deno
MuffinFlavored · 6 years ago
I really wish they had docker-compose / Terraform support. Just not sure at what point that becomes "free" hosting.
feihcsim · 6 years ago
i wonder if it's conceivable to ever write typescript in a REPL
regularfry · 6 years ago
There are both ocaml and haskell repls, so it can be done with languages whose type systems are the focus. Not sure if there's anything specific about typescript that would make it hard, though.
pedalpete · 6 years ago
Does anyone else see the import directly from URL as a larger security/reliability issue than the currently imperfect modules?

I'm sure I'm missing something obvious in that example, but that capability terrifies me.

batmansmk · 6 years ago
I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway. You can even depend on a non-npm repo (GitHub, URLs...) from an npm-based package.

If you want to "feel" as safe, you have import maps in deno, which works like package.json.

Overall, I think Deno is more secure because it cuts out the middleman (npm), and you can make an npm-style mirror with low effort; a simple fork will do. That means you can not only precisely pin which code you want, but also make sure nobody knows you use those packages.

Take it with an open mind, a new "JSX" or async programming moment. People will hate it, then will start to see the value of this design down the road.

pimterry · 6 years ago
> I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway

npm installs aren't the same as installing from a random URL, because:

* NPM (the org) guarantees that published versions of packages are immutable, and will never change in future. This is definitely not true for a random URL.

* NPM (the tool) stores a hash of the package in your package-lock.json, and installing via `npm ci` (which enforces the lockfile and never updates it in any case) guarantees that the package you get matches that hash.

Downloading from a random URL can return anything, at the whims of the owner, or anybody else who can successfully mitm your traffic. Installing a package via npm is the same only the very first time you ever install it. Once you've done that, and you're happy that the version you're using is safe, you have clear guarantees on future behaviour.
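
For what it's worth, Deno does have a lock-file mechanism along these lines: `deno cache --lock=lock.json --lock-write` records a hash per URL, and `deno run --lock=lock.json` rejects sources that no longer match. The check itself is simple; here is a sketch in plain TypeScript (using Node's crypto just for the hashing; Deno's actual implementation differs):

```typescript
import { createHash } from "node:crypto";

// Hex-encoded SHA-256 of a module's source text.
function sha256Hex(source: string): string {
  return createHash("sha256").update(source).digest("hex");
}

// Lockfile-style check: accept the fetched source only if its hash
// matches the hash pinned when the dependency was first reviewed.
function verifyPinned(fetchedSource: string, pinnedHash: string): boolean {
  return sha256Hex(fetchedSource) === pinnedHash;
}
```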

ryanbrunner · 6 years ago
One advantage of having a centralized repository is that the maintainers of that repository have the ability to remove genuinely malicious changes (even if it's at the expense of breaking builds). Eliminating the middle man isn't always a great thing when one of the people on the end is acting maliciously.
mrkurt · 6 years ago
It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

It's also exactly what the websites you visit do. ;)

eyelidlessness · 6 years ago
> It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

This is definitely false. For all the problems with the NPM registry and the Node dependency situation, an NPM package at a specific version is not just at the whims of whatever happens to be at the other end of a URL at any given moment it's requested. This is a huge vulnerability that the Node/NPM world does not currently have.

xg15 · 6 years ago
> It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

I'm not sure how this works in detail here, but at least in NPM you got a chance to download packages, inspect them and fix the versions if so desired. Importantly, this gave you control over your transitive dependencies as well.

This seems more like the curl | bash school of package management.

Edit: This is explained in more detail at https://deno.land/manual/linking_to_external_code and indeed seems a lot more sane.

> It's also exactly what the websites you visit do. ;)

Well yes, and it causes huge problems there already - see the whole mess we have with trackers and page bloat.

Roboprog · 6 years ago
Sure do. I wonder if they have a checksum mechanism like browsers do?

You can add an “integrity” attribute to script tags in the browser.

https://developer.mozilla.org/en-US/docs/Web/Security/Subres...
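For readers who haven't used it, the attribute looks like this (the URL and hash value are placeholders, not a real library):

```html
<script
  src="https://example.com/lib.js"
  integrity="sha384-<base64 digest of the expected file>"
  crossorigin="anonymous"></script>
```

The browser refuses to execute the script if the fetched bytes don't hash to the declared value. Deno grew a roughly analogous mechanism via lockfiles (`deno cache --lock=lock.json --lock-write` records a hash per fetched module, if I recall the flags correctly).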

james-mcelwain · 6 years ago
One advantage of urls is that you can link to a specific git sha, tag, or branch for a dependency, e.g. on github.
bgdam · 6 years ago
It's not just about integrity. The URL may very well serve what it claims to serve, so checksums would match, but it's the direct downloading and running of remote code that is terrifying.

This is pretty much like all the bash one-liners piping and executing a curl/wget download. I understand there are sandbox restrictions, but are the restrictions on a per dependency level, or on a program level?

If they are on a program level, they are essentially useless, since the first thing I'm going to do is break out of the sandbox to let my program do whatever it needs to do (read fs/network etc.). If it is on a per dependency level, then am I really expected to manage sandbox permissions for all of my projects dependencies?
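To the question above: at the time of writing, the restrictions are per process, not per dependency, though some flags accept whitelists that narrow the grant (the flag values below are illustrative):

```shell
# The whole process shares one permission set; any imported module
# can use whatever the process was granted.
deno run --allow-net=example.com --allow-read=./data server.ts
```

So a transitive dependency of server.ts can also reach example.com and read ./data; the sandbox draws no boundary between your code and its imports.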

Saaster · 6 years ago
Does Deno have some built in way to vendor / download the imports pre-execution? I don't want my production service to fail to launch because some random repo is offline.
PinkMilkshake · 6 years ago
PudgePacket · 6 years ago
You can also use the built in bundle command to bundle all of your dependencies and your code into a single, easily deployable file. https://deno.land/manual/tools/bundler.
afiori · 6 years ago
Deno caches local copies and offers control over when to reload them. In terms of vendoring, you can simply download everything yourself and use local paths for imports.
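A sketch of the relevant commands (as of Deno 1.0; the script name is hypothetical):

```shell
deno cache server.ts               # fetch and cache all remote imports
deno run --cached-only server.ts   # error out rather than hit the network
deno run --reload server.ts        # force re-download of dependencies
```

The cache location is controlled by the `DENO_DIR` environment variable, so the cached modules can be checked in or baked into a deploy image.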
doctoboggan · 6 years ago
I assume you would just download the packages and serve them yourself.
ecares · 6 years ago
Especially since HTTPS is not enforced! https://github.com/denoland/deno/issues/1063
fendy3002 · 6 years ago
CMIIW, but wouldn't enforced HTTPS mean you can't use intranet or localhost URLs?
jppope · 6 years ago
More than likely, programming as a whole will get better because of this...

Do you trust this thing? If not, you're better off developing it yourself, or working with something you do trust.

tvbusy · 6 years ago
Deno requires that you explicitly give the process the permissions it has. I think that's much better than praying that a package hasn't gone rogue, like with node. If you don't trust a remote script, run it without any permissions and capture the output. Using multiple processes with explicit permissions is much safer.
dceddia · 6 years ago
I'm wondering about the practicality of importing from URLs. I didn't see it addressed, but an import like this will be awfully hard to remember.

    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
Anyone know if there are alternatives or a plan for this aside from "use an IDE to remember it for you"?

mrkurt · 6 years ago
The convention is to make a `deps.ts` and re-export what you need. Like this: https://deno.land/x/collections/deps.ts
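The pattern in miniature (using the std URL from the question above):

```typescript
// deps.ts: the single place where versioned remote URLs live.
export { serve } from "https://deno.land/std@0.50.0/http/server.ts";

// Elsewhere, e.g. in server.ts, import from the local file instead:
// import { serve } from "./deps.ts";
```

Upgrading a dependency then means editing one URL in deps.ts, much like bumping a version in package.json.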

I don't find versioned URLs much more difficult to work with than package@<version> though.

speedgoose · 6 years ago
You are not alone, this is very unsafe in my humble opinion.
austincheney · 6 years ago
How is it any different than how it works in the browser?
frank2 · 6 years ago
Does it also terrify you when code running in a browser does it?
Saaster · 6 years ago
The code running in my browser isn't a multi-tenant production server, with access to the filesystem and DBs.