My hope is that conda goes away completely. I run an ML cluster and we have multi-gigabyte conda directories and researchers who can't reproduce anything because just touching an env breaks the world.
pyenv was problematic because you needed the right concoction of system packages to ensure it compiled Python with the right features, and we have a mix of macOS and Linux devs, so this was often non-trivial.
uv is much faster than both of these tools, has a more ergonomic CLI, and solves both of the issues I just mentioned.
I'm hoping astral's type checker is suitably good once released, because we're on mypy right now and it's a constant source of frustration (slow and buggy).
> uv is much faster than both of these tools
conda is also (in)famous for being slow at this, although the new mamba solver is much faster. What does uv do in order to resolve dependencies much faster?
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = [ "modules", "here" ]
# ///
The script now works like a standalone executable, and uv will magically install and use the specified modules. As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...
My fear is that the performance difference might add up once you use it on more and more parts of the system. I imagine it uses a lot more memory. Plus, once Fil-C gets adopted in the mainstream, it might lower the need for devs to actually fix the code; they might start just relying on Fil-C.
To be fair, systemd itself is corporate shite to begin with, and I wouldn't mind seeing it replaced with something written in a memory-safe language.
Well, the program would still halt upon a memory flaw, so there would still be a need to fix it.