Posted by u/jamesfisher 3 years ago
Ask HN: Why does every package+module system become a Rube Goldberg machine?
A programming language has a "core language" plus a package/module system. In each successful language, the core language is neat-and-tidy, but the package/module system is a Rube Goldberg machine. See JavaScript/TypeScript, Python, or C/C++.

Lots of brain cycles are spent on "programming language theory". We've roughly figured out the primitives required to express real-world computation.

In contrast, we apparently have no "package management theory". We have not figured out the primitives required to express dependencies. As a result, we keep building new variants and features, until we end up with <script>, require(), import, npm, yarn, pnpm, (py)?(v|virtual|pip)?env, (ana)?conda, easy_install, eggs and wheels ...

Is it just a "law of software" that this must happen to any successful language? Or are there examples where this has not happened, and what can we learn from them? Is there a "theory of package management", or a "lambda calculus of package management" out there?

tazjin · 3 years ago
> In contrast, we apparently have no "package management theory". We have not figured out the primitives required to express dependencies

We have a good hunch. The basic theory behind Nix definitely goes in the right direction, and if we look away from all the surface-level nonsense going on in Nix, it's conceptually capable (e.g. [0]) of being a first-class language dependency manager.

For this to work at scale we'd need to overcome a couple of large problems though (in ascending order of complexity):

1. A better technical implementation of the model (working on it [1]).

2. A mindset shift to make people understand that "binary distribution" is not a goal, but a side-effect of a reasonable software addressing and caching model. Without this conceptual connection, everything is 10x harder (which is why e.g. Debian packaging is completely incomprehensible - their fundamental model is wrong).

3. A mindset shift to make people understand that their pet programming language is not actually a special snowflake. No matter what the size of your compilation units is, whether you call modules "modules", "classes" or "gorboodles", whether you allow odd features like mutually-recursive dependencies and build-time arbitrary code execution etc.: Your language fits into the same model as every other language. You don't have to NIH a package manager.

This last one is basically impossible at the current stage. Maybe somewhere down the line, if we manage to establish such a model successfully in a handful of languages and people see for themselves, but for now we have to just hold out.

[0]: https://code.tvl.fyi/about/nix/buildGo

[1]: https://cs.tvl.fyi/depot/-/tree/tvix/

jnxx · 3 years ago
And Guix has put that in a beautiful form. There are two things which make Guix special:

1. Package definitions are written in a normal, battle-proven, well-defined, general-purpose programming language with good functional-style support (Scheme).

2. There is no conceptual difference between a package definition in the public Guix system and a self-written package definition which a developer makes to build and test their own package, or to build and run a specific piece of software. The difference is as small as that between using an Emacs package and configuring that package in one's .emacs configuration file.

noobermin · 3 years ago
I think 3 is probably true for language developers, but for users, language repos feel necessary because none of the Linux distros really ship everything you need, or even a reliable fraction of it, without a bunch of dumb hacks to get things working. It's much easier just to do "pip install", so that's where the demand is.

And sure, maybe you're right that distro packaging is the "wrong model", but again, that is then a problem for the distros. Users are stuck on Ubuntu or whatever, so they don't have the option to do the "right" thing; they use the mishmash of packaging/repo systems as just the cost of doing business.

carlmr · 3 years ago
Apart from not shipping what you need, you often only get a fraction of the versions you need.

For development you may want to test your code against multiple versions of the system libraries. This is not easy using a distro package manager.

puffoflogic · 3 years ago
Nix is not capable of becoming a first class dependency manager because it does not manage dependencies. It does not attempt any version or feature resolution. Besides, saying Nix could become a package manager is like saying that C could become a package manager.
otabdeveloper4 · 3 years ago
Nix is an umbrella for a stack of various solutions.

Nixpkgs doesn't do version or feature resolution, but other tools (that are not nixpkgs) can and do.

pastage · 3 years ago
2) In Debian, a binary package has almost always been just a cache; are we thinking about Debian packaging in completely different ways? Are you perhaps talking about the aspect of doing a "build world"[0] that BSD, Gentoo and now Nix are better at?

[0] Edit: rebuilding your dependencies when you need to, and handling that seamlessly, CAN be hard on Debian. It is something that even Nix struggles with, even though Nix is best in class by far. It is also completely different from the notion of compiling a program. I wonder what you consider the goal of a package system.

tazjin · 3 years ago
> are we thinking about Debian packaging in completely different ways?

Quite likely! The whole concept of separately building a source/binary package, and then uploading/"deploying" that binary package, already violates the notion of being "just a cache" for me. There might be a tool in Debian where I can seamlessly say "do not download this package from the repository, but build it locally" - but even if so, it would surprise me if it can give me the same guarantees as Nix (i.e. guarantees about the artifact being repeatable, and being addressable by its inputs rather than a human-curated tag or version number).

> Something that even Nix struggles with

Nix only struggles with it to the degree that some of the higher-level abstractions in the existing package set (which are built on top of the fundamental model, not part of it) can be confusing/underdocumented, but conceptually this is a simple thing to do in Nix. Even practically at this point it is usually simple - unless you're dealing with Haskell or Javascript, of course.

> I wonder what you consider the goal of a package system

I want it to let me describe a desired state, and then make it so. That state is best represented as a graph, like a Merkle tree, of the instructions for the individual steps and a way to address them. An individual step is a transformation (i.e. some program) executed over some sources, yielding some result (likely in the filesystem). I want any distribution of binaries to be a result of using this addressing scheme, and looking up/copying an already built equivalent artifact for that thing. I want a text file containing the string "Hello HN" to be represented by the same abstraction that represents a specific .so file, the `emacs` package, the configuration of my entire system, the configuration of a cluster of systems and so on. I want this system to be programmable so I can work around the shortcomings that its original designers missed.
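
To make this concrete, here's a minimal sketch (in Go, with made-up types; this is not how Nix is actually implemented) of what I mean by input addressing. Each step is named by a hash over its instructions and its inputs' addresses, so a "binary cache" falls out as nothing more than a lookup table keyed by that hash:

    package main

    import (
        "crypto/sha256"
        "fmt"
        "sort"
    )

    // Step is one node in the build graph: a transformation over sources.
    // Hypothetical model for illustration only.
    type Step struct {
        Tool   string   // the program to run (itself addressable the same way)
        Args   []string // the instructions for this transformation
        Inputs []string // addresses of other steps' outputs (Merkle-style)
    }

    // Address names the step by its inputs, not by a human-curated version.
    // Identical graphs hash identically on every machine.
    func (s Step) Address() string {
        h := sha256.New()
        fmt.Fprintln(h, s.Tool)
        for _, a := range s.Args {
            fmt.Fprintln(h, a)
        }
        inputs := append([]string(nil), s.Inputs...)
        sort.Strings(inputs) // identity must not depend on incidental ordering
        for _, in := range inputs {
            fmt.Fprintln(h, in)
        }
        return fmt.Sprintf("%x", h.Sum(nil))
    }

    // build consults the cache first; "binary distribution" is just this
    // lookup hitting a remote copy of the same table.
    func build(s Step, cache map[string][]byte) []byte {
        key := s.Address()
        if out, ok := cache[key]; ok {
            return out
        }
        out := []byte("(run s.Tool over s.Inputs here)") // elided: execute the step
        cache[key] = out
        return out
    }

    func main() {
        src := Step{Tool: "fetchurl", Args: []string{"https://example.com/hello.tar.gz"}}
        bin := Step{Tool: "gcc", Args: []string{"-O2"}, Inputs: []string{src.Address()}}
        cache := map[string][]byte{}
        build(src, cache)
        build(bin, cache)
        fmt.Println(bin.Address()) // same inputs give the same address anywhere
    }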

Nix (and Guix) do large parts of this already, and are conceptually (though not yet technically) capable of doing all of it. Technically, so is something like Bazel - but its complexity and maintenance requirements make that prohibitively expensive (basically only Google can use Bazel like that, and even they have delineated areas where they "give up" on wrapping things in Bazel).

pjc50 · 3 years ago
Debian packages can have pre/postinstall scripts that mutate the overall state of the system in arbitrary ways. This is both (a) necessary to make some thing work and (b) ruins the theoretical model.
folex · 3 years ago
Yes!

I'm in a team that works on a pet prog lang for distributed systems, and we did some research on using existing package management systems. We've settled on NPM for now, but god I wish there were a better generic package manager out there.

mijoharas · 3 years ago
Not a generic package manager, but it's probably worth calling out asdf as the generic version manager[0] (maybe you're already aware of it, but it's a generic replacement for nvm, rvm, virtualenv, *vm, which supports any language based on plugins.)

Again, maybe you're already aware of it, but I think it's a nice example of genericising a concern common to many languages which sounds similar to what you're asking for (albeit unfortunately in a slightly different space).

[0] https://github.com/asdf-vm/asdf

pxc · 3 years ago
Check out Denxi. Might be the closest thing to what you're looking for today, and is informed by the state of the art in package management without totally imposing strict discipline on all packages.

https://docs.racket-lang.org/denxi-guide/index.html

drpyser22 · 3 years ago
Did you compare nix to npm? Is there a good comparison out there?
unicornmama · 3 years ago
If the community just focused on creating a great package manager instead of an Everything Monster, and “meeting developers where they are”, then Nix might do something great.
potamic · 3 years ago
What's a good source, for someone who has never used nix, to read about its conceptual model and how it is different from contemporary dependency managers?
rkrzr · 3 years ago
We wrote a blog post on how Nix helps us solve some of the problems that we encountered with other package managers like e.g. apt and pip.

https://www.channable.com/tech/nix-is-the-ultimate-devops-to...

(It doesn't go very much in depth on the conceptual model, but it touches on the main ideas)

tazjin · 3 years ago
There really isn't one that I know of. There's the original thesis[0], but it's from a different time and it focuses only on the package management (~ for a distribution) aspect.

Everything else is mostly written to teach people how to use Nix, and the more recent the thing is the more it will focus on surface-level features of the C++ implementation of Nix.

[0]: https://edolstra.github.io/pubs/phd-thesis.pdf

jakewins · 3 years ago
Have you used Go or Rust package ecosystems?

My experience is that the older gen languages you mention had to invent package management, made lots of understandable mistakes and now are in a backwards compat hellscape.

Rust and Go built their packaging story with the benefit of lessons learned from those other systems, and in my experience the difference is night and day.

tsimionescu · 3 years ago
Go package management is a mess of hacks. It doesn't even have a package repository, instead relying on source control systems to do the actual work, with special hacks for each source control system to define the artifacts that can be downloaded (e.g. using Git tags with some magic format, or Perforce commit metadata). It requires you to physically move all code to a new folder in your version control if you want to increase the major version number. It requires users of your code to update all of their import statements throughout their code whenever you move your hosting. It relies on DNS for package identity. It takes arcane magic to support multiple Go modules in the same repo.

I can go on, but it's a terrible hodge-podge of systems. It works nicely for simple cases (consuming libraries off Github), but it's awful when you get into the details. And it's not even used by its creators: Google has a monorepo, and they use their internal universal build tool to compile everything from source.

gwd · 3 years ago
> It relies on DNS for package identity.

The flip side of this is that it never has to worry about naming collisions or namespacing: Your public package name must be a URL you control.

Additionally, there is no requirement for a centralized package facility to be run. The Golang project is currently running pkg.go.dev, but that's only been in the last few years; and if they decided to get rid of it, it wouldn't significantly impact the development environment.

Finally, the current system makes "typo-squatting attacks" harder to do. Consider the popular golang package github.com/mattn/go-sqlite3. The only way to "typosquat" the package is to typosquat somewhere up the dependency tree; e.g., by creating github.com/matn/go-sqlite3 or something. You can't typosquat github.com/mattn/sqlite3, or github.com/mattn/go-sqlite, because you don't own those namespaces; whereas with non-DNS-based package systems, the package would be called `go-sqlite3`, and `sqlite3` or `go-sqlite` would be much easier to typosquat.

All those things I find really valuable; and honestly it's something I wish the Rust ecosystem had picked up.

> It requires users of your code to update all of their import statements throughout their code whenever you move your hosting.

This is a necessary cost of the item above. It can be somewhat annoying, but I believe it can be handled with a one-line change to the go.mod, and I'd much rather occasionally deal with this than give up the properties above.
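
(For the curious: if I remember the mechanics right, that one-liner is a replace directive in the consumer's go.mod; module paths here are invented for illustration:

    replace github.com/oldhost/lib => github.com/newhost/lib v1.4.2

Note that replace only takes effect in the main module's go.mod, so libraries deeper in the dependency tree do eventually need the import-path update; that's the occasional cost I mean.)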

> It requires you to physically move all code to a new folder in your version control if you want to increase the major version number.

And the benefit of this is that legacy code will continue to compile into the future. I do tend to find it annoying, but it was an explicit trade-off decided back when they were developing their packaging system.
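
(To spell out the mechanics: the folder, or alternatively a branch, exists because the major version is part of the module path itself. A hypothetical v2 go.mod looks like:

    module github.com/alice/widget/v2

    go 1.21

Consumers importing github.com/alice/widget keep resolving v1 tags untouched, which is exactly the "legacy code keeps compiling" property.)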

Packaging is a hard problem, with lots of trade-offs; I think Go has done a pretty good job.

One way in which Go and Rust have it easier than Python or Node is that the former only have to deal with developers; the latter have to deal with both developers and users, whose requirements are often at odds with one another.

jen20 · 3 years ago
Some of your criticism is reasonable, and I’m no fan of Go’s module system as a standalone artifact, but much of your criticism is unfounded.

> It requires you to physically move all code to a new folder in your version control if you want to increase the major version number.

This is untrue.

> It requires users of your code to update all of their import statements throughout their code whenever you move your hosting.

This is only true if you're not using a vanity URL, though sadly that is often the case.

> It takes arcane magic to support multiple Go modules in the same repo.

I don’t know what you’re calling arcane magic here, but we maintain repos at work with 6-7 Go modules in them without it being an issue whatsoever, and no “arcane magic” required, so I’m going to go ahead and say this is untrue too.
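
For anyone wondering what that setup looks like: since Go 1.18, a go.work file at the repo root is enough to tie the modules together for local development. Directory names invented for illustration:

    go 1.21

    use (
        ./api
        ./worker
        ./internal/tools
    )

Each listed directory carries its own go.mod and is versioned independently.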

mseepgood · 3 years ago
> It requires you to physically move all code to a new folder in your version control if you want to increase the major version number

That's simply not true. That's only one way you can do it. Another way is to create a branch.

skovati · 3 years ago
Interesting read on the official golang.org proxy DoSing git.sr.ht

https://sourcehut.org/blog/2023-01-09-gomodulemirror/

omgtehlion · 3 years ago
Idk about Go, but Rust’s cargo seems nice, clean yet powerful.

That was my impression some time ago.

But last week I attempted to compile a couple of (not very big) tools with cargo, and it ended up downloading hundreds of dependencies and gigabytes of packages.

Looks like node_modules.jpg all over again :(

junon · 3 years ago
As someone who contributed somewhat extensively to the node_modules problem early on, Cargo is definitely better than the JS ecosystem in this regard.

Further, another major difference is that you don't need those dependencies after you've built. You can blow them away. Doing that with node is not as straightforward, and in many cases, not possible.

verdverm · 3 years ago
> Idk about Go

I wrote a post highlighting Go's mod system: https://verdverm.com/go-mods/

imo, it is the best designed dependency system I know of. One of the nice things is that Go uses a shared module cache, so there is only one copy on your computer when multiple projects use the same dependency@version.

folex · 3 years ago
Having been in Rust full time for the last 2 or 3 years: it is quite a pain to set up a release process for a big Rust workspace.

Version incrementing, packaging wasms, dancing around code generation – all doable, but not standardized.

There's release-please to automate all that, but it's not an easy task to set it up in all of your repos.

Besides, if in addition to Rust projects you have projects in other languages like JavaScript, then you have to do it all twice and struggle to understand all of the package management systems provided by all of the languages you use.

A single swiss-army-knife package manager would be amazing.

pjmlp · 3 years ago
Doesn't look like it; see the module drama in Go, while Rust is growing an npm-like ecosystem of tiny crates.

Plus, neither of them handles binary library distribution the way some of the packaging models that came before them did.

tomcam · 3 years ago
Can you give me a one-liner about module drama in Go?
xiphias2 · 3 years ago
Npm has made lots of mistakes that are not so understandable: it should already work as well with ES6+TS as other systems do. Import should just work everywhere.

We should have left CommonJS behind a long time ago, while keeping backwards compatibility.

At the same time, what I see with the node+npm system is that everything is "it just doesn't work by default".

Having 10 other package managers doesn't help either; they are faster, but they don't solve this problem.

eurasiantiger · 3 years ago
Just use the .mjs extension for your code, add file extensions to your imports, and everything does just work.
SuperSandro2000 · 3 years ago
But it is a pain for a distro to upgrade a vulnerable crate dependency.
dboreham · 3 years ago
Go and Rust don't have packages. They have tooling to pull library code from git and build it in-place to be linked into a local project. That's not (really) packaging.
steveklabnik · 3 years ago
Cargo does not “pull library code from git” unless you expressly ask for it to.

And given that packages have to depend on other packages, and cannot depend on a git repository, that feature is mostly useful for testing bug fixes, private repos for leaf packages, stuff like that.

ChrisRackauckas · 3 years ago
Julia and Rust seem to have package systems that are fine and manageable. I think these are really just major problems in Javascript, Python, and C/C++ (exactly the languages you mention) because the kind of widespread OSS code sharing we have today just didn't exist back when those languages were designed. People shared code through email, but didn't expect one button to pull in 200 Github repositories, so the languages weren't built with the expectations required to make that stable.

Back when those languages were designed, you'd manually download the few modules you needed, if you downloaded any packages at all. In C you'd normally build your own world, since it came before the www times, and C++ kind of inherited that. But languages which came out later decided that we now live in a world where most of the code that is executed comes from packages, most likely packages which live on Github. So Julia and Rust build this into the language: Julia in particular, with Project.toml and Manifest.toml for fully reproducing environments; its package manager simply uses git and lets you grab the full repository with `]dev packagename`, and its registry system lets you extend it with private package registries.

I think the issue is that dependencies are central, so you can never remove an old package system: if that's where the old (and rarely updated) dependencies live, you need to keep it around. But for dependencies to work well, you need all dependencies to be resolved by the same package system. So package systems don't tend to move very fast in any language; whatever you had early has too much momentum.

Hendrikto · 3 years ago
Tying package management to GitHub seems convenient in the short term, but will be the baggage of the next generation.

I cringe hard when I see projects depending on git repos, without pinning a version or commit.

staunton · 3 years ago
It's not actually tied to GitHub. If GitHub died tomorrow, they would easily be able to move on and host the packages somewhere else. There's no way to do this without hosting. Also, there is a way to pin a version or commit. Julia for example always stores the exact commit information for all packages in the "Manifest" file. There are also straightforward ways to demand certain versions and package maintainers have the opportunity to specify compatibility and requirements precisely.
ChrisRackauckas · 3 years ago
Julia and Rust both work with versions tied to git tags. Package management isn't tied to Github but to git: packages can live on Gitlab or BitBucket, though they generally don't, and that's the choice of package developers. Because of that there are tie-ins to make Github really nice to use, but for example with Julia the only piece that is truly Github-based is the fact that the General registry lives in a Github repo (https://github.com/JuliaRegistries/General), and it could easily migrate to another git platform if it needed to.
derefr · 3 years ago
What I really want to know is why package development is such a Rube Goldberg machine. Not for programming-language packages per se, but rather for OS packages of simple programming-language packages.

Have you tried to package a random Python/Ruby/etc. CLI program, for Debian? Or how about for Homebrew? Each one involves a cacophony of PL scripts calling shell-scripts calling PL scripts, and internal/undocumented packager subcommands calling other internal/undocumented packager subcommands. It takes ~forever to do a checked build of a package in these systems, and 99% of it is just because of how spread out the implementation of such checks is over 100 different components implemented at different times by different people. It could all be vastly simplified by reimplementing all the checks and transforms in a single pass that gradually builds up an in-memory state, making assertions about it and transforming it as it goes, and then emitting it if everything works out. You know — like a compiler.
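
A sketch of what I mean, in Go with invented names (not any real packager's API): one pass threads a single in-memory state through a list of checks and transforms, each asserting or rewriting, compiler-style:

    package main

    import "fmt"

    // Pkg is the in-memory state a compiler-style packager would build up.
    type Pkg struct {
        Name    string
        Files   map[string][]byte
        Control map[string]string
    }

    // A Pass asserts something about the state and/or transforms it, instead
    // of shelling out to yet another undocumented subcommand.
    type Pass func(*Pkg) error

    func lintControl(p *Pkg) error {
        if p.Control["Maintainer"] == "" {
            return fmt.Errorf("%s: missing Maintainer field", p.Name)
        }
        return nil
    }

    func normalizePaths(p *Pkg) error {
        return nil // e.g. rewrite install paths in place (elided)
    }

    func emit(p *Pkg) error {
        return nil // serialize the final state to an archive (elided)
    }

    func main() {
        p := &Pkg{
            Name:    "hello",
            Files:   map[string][]byte{"usr/bin/hello": nil},
            Control: map[string]string{"Maintainer": "me@example.com"},
        }
        for _, pass := range []Pass{lintControl, normalizePaths, emit} {
            if err := pass(p); err != nil {
                fmt.Println("build failed:", err)
                return
            }
        }
        fmt.Println("package ok:", p.Name)
    }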

megous · 3 years ago
That's a Debian-specific quirk. There are simpler, more cohesive package management systems, like e.g. pacman, where mastery of the package maintainer's tools is not an arcane art in itself.
denton-scratch · 3 years ago
It annoys me that every gee-whizz new language needs to have its own package-management system.

There's no reason why a package-management system needs to be language-specific; dependencies are often cross-language. Hell, even some blocks of code contain more than one language.

The package-management system is responsible for deploying packages. The way a package is deployed should depend on the operating environment, not on the language. These language-specific packaging arrangements typically deploy into some private part of the file-system, organized in its own idiosyncratic way.

Using git as a repository is just nuts. Git is privately-owned, and stuffed with all kinds of unaudited junk. You can't audit everything you install by hand; so these systems force you to install unaudited code, or simply not install.

I've been using Debian derivatives for years. I appreciate having an audited repository, and an installation system that deploys code to more-or-less predictable filesystem locations.

nicoburns · 3 years ago
> Using git as a repository is just nuts. Git is privately-owned, and stuffed with all kinds of unaudited junk.

Do you mean github? Git is open source and one of the few pieces of software that works in a truly distributed fashion.

denton-scratch · 3 years ago
You're right; I meant github. Sorry.
arcturus17 · 3 years ago
> Git is privately-owned, and stuffed with all kinds of unaudited junk.

Also: what do I care that people store unaudited and insecure stuff in there?

pjmlp · 3 years ago
Because cross-platform packages aren't a thing, and not everyone wants to create a package for every platform out there.
yamtaddle · 3 years ago
The reasons language-specific package managers are popular:

1) "npm install" or "go get" or what have you, works on every platform (barring bugs), while "apt install" only works on some.

2) Most platform package managers aren't good at handling multiple versions of dependencies (neither are many language package managers, but those are easier to sandbox away with supplemental tools than system package managers are)

3) Most platform package managers lag way behind language-specific package managers, and may also lack tons and tons of packages that are available on those.

> Git is privately-owned

Git... hub, you mean?

maple3142 · 3 years ago
Probably because your Debian doesn't work on other Linux distros like CentOS, Arch Linux ..., let alone macOS and Windows. The holy grail would be a package manager that is both cross-language and cross-platform at the same time, but it seems too hard to implement in general.

Anaconda is probably the closest, as it also packages many non-Python packages. Nix is similar, but it doesn't support Windows at all (without WSL).

randomdata · 3 years ago
> Git is privately-owned, and stuffed with all kinds of unaudited junk. [...] I've been using Debian derivatives for years

Git is owned by the same owner as Linux. If you've been using Debian derivatives for years it seems you must have some trust to give that private entity? Unless the derivative you speak of is Debian GNU/k*BSD?

Furthermore, if you give trust to Debian derivative projects, why not trust their Git builds? If you trust everything else in their distribution Git is a curious omission. Do you have a personal beef with Torvalds or something?

Sebb767 · 3 years ago
> There's no reason why a package-management system needs to be language-specific; dependencies are often cross-language. Hell, even some blocks of code contain more than one language.

This sounds a lot like a case of https://xkcd.com/927/ . Languages have different ways of importing and installing dependencies, trying to create a package manager over all of those is just going to end up making things even more complex, especially if you target all platforms at once.

> Using git as a repository is just nuts. Git is privately-owned, and stuffed with all kinds of unaudited junk.

Git is fully open source. Are you confusing Git and GitHub?

tazjin · 3 years ago
> Languages have different ways of importing and installing dependencies

They're not actually different. They call things differently, and they have different methods of passing the required lookup paths/artifacts/sources to their compilers/interpreters/linkers, but in the end all of them are conceptually the same thing.
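
A toy illustration of that claim, in Go with invented values: once you strip off the spelling, "build this unit against these dependencies" has the same shape in every language:

    package main

    import "fmt"

    // Invocation is the common shape: a compiler/interpreter/linker, the
    // lookup paths it needs, and a compilation unit. Values are examples.
    type Invocation struct {
        Tool  string
        Paths []string // wherever dependencies live, however the language names them
        Unit  string
    }

    func main() {
        for _, inv := range []Invocation{
            {"gcc", []string{"-I/deps/zlib/include", "-L/deps/zlib/lib"}, "main.c"},
            {"python", []string{"PYTHONPATH=/deps/requests"}, "main.py"},
            {"javac", []string{"-classpath", "/deps/guava.jar"}, "Main.java"},
        } {
            fmt.Println(inv.Tool, inv.Paths, inv.Unit) // same shape, different spelling
        }
    }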

perrygeo · 3 years ago
I'd say we have a fairly good existing theory that explains modern package management: Conway's Law.

We self-organize into communities of practice and select the package management strategy that works best. Reaching across communities to develop a grand centralized strategy that fits everyone's needs would be __possible__, but involves significant communication and coordination overhead. So instead we fracture, and the tooling ecosystem reflects ad-hoc organizational units within the community.

Ecosystems like Rust's cargo that have batteries included from the start have an advantage: virtually all Rust developers have a single obvious path to package management because of this emergent social organization.

Ecosystems like Python's seem like the wild west: there is deep fracturing (particularly between data science and software engineering) and no consensus on the requirements or even the problems. So Python fractures further, ironically in a search for something that can eventually unify the community. And Python users feel the strain of this additional overhead every day, needing to side with a team just to get work done.

I'd argue both of these cases are driven by consequences easily predictable from Conway's Law.

MuffinFlavored · 3 years ago
Is there a name for a phenomenon/theory that states "the more people you add to a problem while asking them to solve it in an opinionated way, the more likely you are to get conflict/fragmentation/fewer people agreeing overall, as consensus gets diluted and the number of possibilities/opinions increases"?
sixstringtheory · 3 years ago
Maybe it should be called Lydgate’s Law: “You can please some of the people all of the time, you can please all of the people some of the time, but you can’t please all of the people all of the time.”
neilv · 3 years ago
The Racket module system, designed by Matthew Flatt, is great.

But to fully appreciate it, it helps to understand syntax transformation in Racket. Because the rigorous phase system enforces non-kludgy static rules about when things are evaluated, your syntax transformers and the code on which they depend can force a mess of shuffling code among multiple files to solve dependency problems... until you use submodules, with their small set of visibility rules, and then suddenly your whole tricky package once again fits cleanly in a single file.

I leveraged this for some of my embedded doc and test experiments, without modifying the Racket core. (I really, really like single-source-file modules that embed doc, test, and package metadata all in the same file, in logical places.)

jrockway · 3 years ago
You mention Python and Node, which are programming languages that, unusually, require end users to have the text of your program and all its dependencies on their own machines; the languages store some parts of your program in /usr/lib and other parts in your source directory. (npm does a little better here and at least puts the dependencies in your project directory.) Those constraints make development and packaging hard no matter what.

Python and Node both need a way to compile the code down to a single statically-linked binary like more modern languages (Go, Rust), solving the distribution problem once and for all.

There are module systems that aren't insane, like Go's module system. It uses semantic versioning to mediate version conflicts. Programs can import multiple major versions of the same module. The module requirements files ensure that every checkout of the code gets the exact same bytes of the code's dependencies. The compiler is aware of modules and can fetch them, so on a fresh workstation, "go test ./..." or "go install ./cmd/cool-thing" in a module-aware project works without running any other command first. It is actually so pleasant to use that I always think twice "do I want to deal with modules" before using a language like Javascript or Python, and usually decide "no".
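
For instance, importing two major versions side by side looks like this (module paths are hypothetical, so this won't resolve as written, but the mechanism is real):

    import (
        widget "github.com/alice/widget"      // v1, picked by semver
        widgetv2 "github.com/alice/widget/v2" // v2 is a distinct import path
    )

    // and go.mod lists both:
    //   require github.com/alice/widget v1.9.3
    //   require github.com/alice/widget/v2 v2.1.0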

npm and pip are the DARK AGES. That's why you're struggling. The community has been unable to fix these fundamental flaws for decades, despite trying in 100 incompatible ways. I personally have given up on the languages as a result. The standard library is never enough. You HAVE to solve the module problem on day 1.

briffle · 3 years ago
If npm and pip are the DARK AGES, what does that make Perl's CPAN?
jrockway · 3 years ago
I haven't used Perl for 15 years, but it was pretty miserable back then. Obviously I had my workflow and didn't run into too many problems, but I was certainly nervous about how I could share my work with other people. (Putting it into prod wasn't that bad. I don't know why, but it always went OK. CPANPLUS helped at the time.) I used to teach Perl trainings as a side job, and people would literally be in tears over @INC. It was bad.

I don't use Python much these days, but it's not as bad as Perl 15 years ago. I see blog posts like "how to set up the perfect Python dev environment with Docker" and it makes me very sad, but at least teams are getting their work done. The edge cases of Python packaging, though, are really really bad. For example, the C compiler that Python was compiled with (and C library; musl vs. glibc) affects installability of modules through pip. This really forces your Linux distribution to be the arbiter of what modules you can use, which is always too out of date for development. Also, the exact requirements (what are the sha256s and URLs of the code files that this code depends on) depends on architecture, compiler, etc. As far as I can tell, you have to run code to find the whole dependency tree. That is the fatal flaw I see with Python's packaging system.

I spent a lot of time trying to use Bazel to unify all the work my company does across languages. Python was the killer here. We do depend on a lot of C extensions, and I have a C toolchain built with Bazel (so that arm64 Mac users can cross-compile to produce Linux releases; maybe a dumb requirement, but it would be too unpopular to discriminate against certain developer devices), but getting Python built with that toolchain, and using that Python to evaluate requirements.txt didn't work. (There are bugs open; they all end with "Python has to fix this".) As a result, you don't get compatible versions of, say, numpy with this setup. Dealbreaker. Partially Bazel's fault, partially Python's fault.

(It pained me to use Bazel for Go, but it did all work. While the workflow wasn't as nice as what Go has built in, easy things were hard and hard things were possible. I had working binaries within a few hours, buildable on any supported workstation type, with object and test caching in the cloud.)