rwmj · 3 years ago
Sort of related to this (it wouldn't solve this issue, but I think it would improve overall ecosystem security & quality):

I do wish curl would split the curl backend protocols (http, ftp etc) into separate loadable modules so that we (downstream distro packagers) can reduce the total attack surface through packaging changes.

For example, we could have:

  /usr/lib/libcurl/curl-http.so
  /usr/lib/libcurl/curl-ftp.so
packaged separately as "libcurl-http" and "libcurl-ftp". Curl clients which only want HTTP would be set to depend on "libcurl-http" only, so the FTP support wouldn't even be on the system unless a package needs it.

There is already a way to whitelist protocols in curl (CURLOPT_PROTOCOLS(3)), but that requires modifying existing programs; I think the packaging split could be used in addition.

There are lots of weird modules in curl (telnet, gopher, pop3 -- which, don't get me wrong, I think is great!) but they should not be part of the default install of most Linux distros.

jackblemming · 3 years ago
Do you have any evidence that this would provide security improvements besides the heuristic of reducing surface area? Have there been past exploits?
rwmj · 3 years ago
CVE-2013-0249 (in the qemu curl driver) was a case where the entire qemu process could be exploited because of a bug in curl's SASL code, which could be reached remotely by redirecting an http[s]:// URL initiated by qemu. This was fixed by changing qemu to use CURLOPT_PROTOCOLS(3) (as I mentioned in my initial posting above) so that curl wouldn't follow redirects into SASL-using connections starting from an initial HTTP request.

IMHO it would be a lot better, instead of having to change every possible curl client, to have some kind of distro-level limit on what code might be run by curl.

Of course I would still say if you're using curl, you really must use CURLOPT_PROTOCOLS or have a good excuse why not. The above change is just a backstop.

gary_0 · 3 years ago
Would there still be a batteries-included curl package that some software/packages could depend on if they don't want to worry about which protocols are supported? I'd hate for users who throw an FTP/whatever link at a tool that used to "just work" to suddenly get a "protocol not found" error.
rwmj · 3 years ago
I imagine yes you'd want a backwards compatible package that pulls in everything. Packages that require that might(?) be a red flag.

Again it's about hardening the default, not making it secure in every imaginable configuration.

dwwoelfel · 3 years ago
You can build curl without support for ftp.
rwmj · 3 years ago
Yes you're quite right that you can do this when building curl:

  ./configure --disable-ftp
But then you end up with a libcurl that can never support FTP clients. However, FTP is still a useful protocol in some circumstances -- perhaps very limited these days, but still used. I think it's better to expose this through a module system reflected in the distribution packages. It makes things much more visible.

fulafel · 3 years ago
I would be very miffed if my application suddenly autodetected a proxy (whether installed by the IT dept or an adversary) and decided to break the end-to-end security property of TLS that my app relied on. But maybe in the best case this would have been limited to tunneling (HTTP CONNECT) proxies that pass TLS traffic through as-is?

But I guess this is just speculation now that it wasn't implemented.

benmmurphy · 3 years ago
I don’t think curl supports proxies that terminate TLS, so it shouldn’t interfere with TLS end-to-end security.
tyingq · 3 years ago
I would guess curl would add a --autoproxy switch or similar.
stjohnswarts · 3 years ago
Yeah, I think the curl programmers are far too knowledgeable to have proxy discovery set as a default.
password4321 · 3 years ago
What libraries does curl use?
tyingq · 3 years ago
There's a list here: https://curl.se/docs/libs.html

Edit: Summary:

  SSL/TLS: OpenSSL, mbed, GnuTLS, NSS TLS, wolfSSL
  decompression: zlib
  ldap: OpenLDAP LDAP support
  kerberos/spnego: heimdal, MIT Kerberos
  http2: nghttp2
  async dns: c-ares
  idn domains: libidn
  scp/sftp: libssh2

Macha · 3 years ago
Note this isn't complete; it lists only 5 of the 14 TLS libraries that cURL can be built with: https://daniel.haxx.se/blog/wp-content/uploads/2021/02/Scree... (source: https://daniel.haxx.se/blog/2021/02/09/curl-supports-rustls/ )

Of course, it does include the two that cover 95+% of uses in OpenSSL and GnuTLS.

ape4 · 3 years ago
At least libproxy itself doesn't have any dependencies

https://libproxy.github.io/libproxy/ says "no external dependencies within libproxy core (libproxy plugins may have dependencies)"

mistrial9 · 3 years ago
open-source curl, or corporate curl?

Apple and MSFT build their own replacements for TLS and use those, others? part of a trend to replace and extend GPL software, and control critical network functions on their platforms ?

but then, others seem to want to make TLS, too ?

    [ Windows native SSL/TLS, schannel ]
    [ Apple OS native SSL/TLS, secure-transport ]
    [ GNU TLS ]
    [ mbedtls, wolfssl, mesalink, bearssl, rust-tls, nss]

ushakov · 3 years ago
He’s concerned about adding 1 dependency, while you add thousands to your node_modules without the slightest doubt

What a class act

eCa · 3 years ago
To be fair, most other projects aren’t as far up-river as curl. Any project they depend on becomes at least as important as curl itself.
blowski · 3 years ago
Exactly, everything’s a trade-off. And the trade-offs will be different for a very widely used system package like curl compared to, say, a small in-house ERP system or low-traffic website.
fweimer · 3 years ago
For a generally usable implementation, libproxy needs to be able to process proxy auto-config (PAC) files, which are small (or not so small) JavaScript programs. These programs are expected to parse the URL and decide which proxy server to use (or whether to connect directly). It's not declarative at all.

Therefore, libproxy is not a trivial dependency.
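For readers who haven't seen one: a PAC file defines a JavaScript function FindProxyForURL(url, host) that the client evaluates for every request. A minimal illustrative example (the host and proxy names are invented):

```javascript
// Minimal PAC file: route internal hosts directly, everything else via a
// proxy. Real PAC files can also call engine-provided helpers such as
// shExpMatch() and isInNet(), which makes them even less declarative.
function FindProxyForURL(url, host) {
  // Internal hosts bypass the proxy entirely.
  if (host === "localhost" || host.endsWith(".corp.example")) {
    return "DIRECT";
  }
  // Everything else goes through the proxy, falling back to direct.
  return "PROXY proxy.corp.example:8080; DIRECT";
}
```

Because this is arbitrary code, anything that wants to honour PAC has to embed (or shell out to) a JavaScript engine just to answer "which proxy?" -- which is the point being made above.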

vokspon · 3 years ago
True but curl wouldn't need to implement a particularly optimised JavaScript runtime to handle PAC programs, there would be no point in e.g. bundling V8 or ChakraCore with it.
manfre · 3 years ago
I appreciate when app maintainers set a standard for their dependencies. I wish it wasn't so rare.

Node needs a static dependency checker to prevent using anything that has isodd or similar useless modules in its dependency graph.

ldoughty · 3 years ago
Well, yes....

In my general experience, "System Admins" have a security-first mindset along with years of focusing on how to maintain a secure system AND blessing from management to be slower for end-product reliability and security of business assets/data.

Developers, *who can certainly have this mindset and skill set as well*, are typically forced to be more business-oriented, with management-dictated objectives/deadlines that more strictly limit their ability to invest the time, so you get "the best I can do in the allotted sprint/deadline time, minus the other committed work I need to do".

So inevitably the application will let something through, and it becomes the System Admins' responsibility to protect the company... And that seems to be okay with most businesses... Developers can move fast and dirty, and the operations/admins will try to limit the scope of the blast.

It adds several hours or days of work for a developer to add secret storing/retrieval/rotation compared to just hard-coding a key in the environment, so if they have 2 days to implement a feature, they are not going to spend 2 days implementing extra safeguards unless someone speaks up and pushes back (and the other side relents)

pjmlp · 3 years ago
Which is why DevOps (sorry, system admins) then have automated tooling for detecting those kinds of workarounds, and then we get a sad developer who has to fix a broken pipeline deployment.

In case of places that take security seriously that is.

manuelabeledo · 3 years ago
To be fair, is there any other way around while working with node?

Everything in the ecosystem is a dependency.

nickjj · 3 years ago
It depends on what you're doing.

I wrote https://www.npmjs.com/package/esbuild-copy-static-files only using the Node standard library. It's a small library to make copying static files with esbuild more efficient, it avoids copying files that didn't change. It's useful for watching and copying files in development.

It was funny writing the readme file though where "No 3rd party dependencies" ended up being called out as a feature. I wouldn't have thought to call that out as a primary feature in any other language's ecosystem.

lightswitch05 · 3 years ago
I’ve written a library with zero production dependencies. Of course, I have Jest as a development dependency, which pulls in all sorts of stuff; it would be difficult to make a library with truly zero dependencies. As for not having production dependencies, this was my experience:

1. Using the https module directly was more work than I expected, especially the error handling. This made me really look forward to the new Fetch API coming out.

2. No CLI parser. It's not like parsing args is a LOT of work - but it's also something that is already solved, and having to write support for that directly was a bummer.

3. No logging library. This one was pretty easy: create a little class with logging levels. Again, this is something so common that it would have been nice to use a package for it.
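Point 3 really is only a few lines. A sketch of the kind of dependency-free leveled logger meant here (the names and API shape are mine, purely illustrative):

```javascript
// Tiny dependency-free leveled logger: messages below the configured
// level are dropped; everything else is formatted and sent to the sink.
const LEVELS = { debug: 10, info: 20, warn: 30, error: 40 };

class Logger {
  constructor(level = "info", sink = (line) => console.log(line)) {
    this.threshold = LEVELS[level];
    this.sink = sink;
  }
  log(level, msg) {
    if (LEVELS[level] >= this.threshold) {
      this.sink(`[${level.toUpperCase()}] ${msg}`);
    }
  }
  debug(msg) { this.log("debug", msg); }
  info(msg)  { this.log("info", msg); }
  warn(msg)  { this.log("warn", msg); }
  error(msg) { this.log("error", msg); }
}
```

The injectable sink is the one design choice worth copying from real logging libraries: it makes the logger trivially testable and redirectable without any further dependencies.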

rockwotj · 3 years ago
For the most part, it's also really easy to add node dependencies compared to C/C++. Is it surprising folks take the path of least resistance?
pixl97 · 3 years ago
In reverse, I've seen a banking app pull in a gigabyte of node dependencies, and I'm thinking there's no possible way they'll ever be able to validate the security of everything.
ushakov · 3 years ago
> To be fair, is there any other way around while working with node?

breaking robot noises

shp0ngle · 3 years ago
Well, my application has no tests, a badly designed API, and no documentation, too. So it's fiiine
pjmlp · 3 years ago
On most of the projects I am involved with, if the modules haven't been cleared for use, the CI/CD pipeline will fail, regardless of working on dev's machine.

If only all projects were as conscious.

wkdneidbwf · 3 years ago
this is an inane comparison.
