> Similarly, uv does not yet generate a platform-agnostic lockfile. This matches pip-tools, but differs from Poetry and PDM, making uv a better fit for projects built around the pip and pip-tools workflows.
Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".
Without being platform agnostic, to me this is dead-on-arrival and unable to meet the "Cargo for Python" aim.
> uv supports alternate resolution strategies. By default, uv follows the standard Python dependency resolution strategy of preferring the latest compatible version of each package. But by passing --resolution=lowest, library authors can test their packages against the lowest-compatible version of their dependencies. (This is similar to Go's Minimal version selection.)
> uv allows for resolutions against arbitrary target Python versions. While pip and pip-tools always resolve against the currently-installed Python version (generating, e.g., a Python 3.12-compatible resolution when running under Python 3.12), uv accepts a --python-version parameter, enabling you to generate, e.g., Python 3.7-compatible resolutions even when running under newer versions.
This is great to see though!
I can understand it being a flag on these lower level, directly invoked dependency resolution operations.
While you aren't onto the higher level operations yet, I think it'd be useful to see if there is any cross-ecosystem learning we can do for my MSRV RFC: https://github.com/rust-lang/rfcs/pull/3537
How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today but we're slowly improving.
Thanks Ed! Your work as always is a big source of inspiration.
> Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".
Yes, we absolutely do. We don't do this today; the initial scope is intentionally limited. But in the next phase of the project, we want to extend to multi-platform and multi-version resolution.
> How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today but we're slowly improving.
This is something we talked with Jacob about quite a bit. Turns out (as you know) it's a very hard problem. For the initial release, we added a constraint: our default behavior is that if you want to use a pre-release, you _have_ to specify the package as a first-party dependency and use a pre-release marker in the version specifier. (We also support globally enabling and disabling pre-releases.) So, we basically don't support "transitive" pre-releases right now -- but we give you a dedicated error message if your resolution fails for that reason.
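To make that concrete, a minimal sketch of what that default implies (the package name and version below are hypothetical): the pre-release only resolves if you declare the package directly with a pre-release specifier, e.g. in your requirements.in:

```
# hypothetical first-party requirement; the pre-release marker is what
# opts this package (and only this package) into pre-release versions
some-package>=2.0.0rc1
```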
Any plans to tackle the Python version installation side of things and make it as seamless as rustup has? I've previously used `pyenv install` for this, but it would be nice to fold it into one tool.
> How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs.
The living version of PEP 440 has a bit on how pre-releases are handled[1]. The basic version is that the installer shouldn't select them at all, unless the user explicitly indicates that they want a pre-release. Once opted into, they're ordered by their phase (alpha, beta, rc) and pre-release increment (e.g. `.beta.1` > `.alpha.2`).
[1]: https://packaging.python.org/en/latest/specifications/versio...
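A quick way to see that ordering in practice, using the PyPA `packaging` library (a small sketch; the version strings are made up for illustration):

```python
from packaging.version import Version

# PEP 440 ordering: alphas < betas < release candidates < the final release
assert Version("1.0a2") < Version("1.0b1") < Version("1.0rc1") < Version("1.0")

# installers are expected to skip pre-releases unless explicitly requested
assert Version("1.0rc1").is_prerelease and not Version("1.0").is_prerelease
```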
How does a language ecosystem that bakes "there should be one-- and preferably only one --obvious way to do it" into the interpreter* as a statement of values end up with such a convoluted packaging story?
* Try running `python -c 'import this' | sed -n 15p`
Simple: Python is old. It predates most shiny languages by decades, and so does its packaging story. Most modern packaging tools can shine today because they could learn from the imperfections of their forerunners. Additionally, Python has a very wide field to cover, far wider than most other languages, which makes things more complicated, and that makes people more open to experiments, which leads to so many different solutions.
Python is old, but pip itself had its 1.0 release 8 months after Ruby's Bundler did and 6 years after Apache Maven, both of which are substantially better package managers than pip and both of which are also for 90s-era programming languages, not the new shiny ones.
Age is part of the problem, but it's not the whole problem. Ruby manages to have a package manager that everyone loves, and Java has settled on a duopoly. Meanwhile Python has someone making a serious attempt at a new package manager every few years.
Packaging is hard but it's not hard enough to wholly explain pip.
Lock files alone are a proven piece of technology that pretty much just works across all modern package managers, and yet the best that pip can recommend to date is `pip freeze > requirements.txt`, which is strictly inferior to what is available in other package managers because there's no hash verification and no distinction between transitive dependencies and explicit dependencies.
That pip still doesn't have an answer for lock files is a sign of a deep problem in the Python package management world, not the natural result of packaging being hard.
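For contrast, this is roughly the shape of a pip-tools lockfile entry produced with `pip-compile --generate-hashes` (the package is just an example and the hash values are placeholders); it records both artifact hashes and where the requirement came from, neither of which plain `pip freeze` output gives you:

```
requests==2.31.0 \
    --hash=sha256:<placeholder-wheel-hash> \
    --hash=sha256:<placeholder-sdist-hash>
    # via -r requirements.in
```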
Python refuses to have packages (folders don't count) or accept build steps. Everything follows from there, making it much harder to have a good solution.
Any plans to adopt Rye's approach of using standalone Python builds to cover installing different Python versions?
I feel uv should provide a way to install different Python versions to truly cover an end-to-end tool. The current approach of searching for existing virtualenvs or conda envs helps, but I was hoping to completely remove the need for another package/dependency manager.
(Taken from the docs)
If a --python-version is provided to pip compile (e.g., --python-version=3.7), uv will search for a Python interpreter matching that version in the following order:
- An activated virtual environment based on the VIRTUAL_ENV environment variable.
- An activated Conda environment based on the CONDA_PREFIX environment variable.
- A virtual environment at .venv in the current directory, or in the nearest parent directory.
- The Python interpreter available as, e.g., python3.7 on macOS and Linux. On Windows, uv will use the same mechanism as py --list-paths to discover all available Python interpreters, and will select the first interpreter matching the requested version.
- The Python interpreter available as python3 on macOS and Linux, or python.exe on Windows.
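Putting the discovery order above together with the flags quoted earlier, a minimal sketch (assuming a `requirements.in` in the current directory):

```sh
# resolve for an older target interpreter, found via the order listed above
uv pip compile --python-version=3.7 requirements.in

# or, for library authors: resolve to the lowest compatible versions
uv pip compile --resolution=lowest requirements.in
```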
> If I had to guess, that’s the path that the Astral team would take as well - expand ruff’s capabilities so it can do everything a Python developer needs. So the vision that Armin is describing here might be achieved by ruff eventually. They’d have an advantage that they’re not a single-person maintenance team, but the disadvantage of needing to show a return to their investors.
Looks awesome. I find that pip is usually pretty fast for me, and when it's not, it is mostly because it has to download so much data or wait for native libraries to compile in the background (or anything involving CUDA, which always seems to take forever). What really needs some help with speed is conda, which is just so absurdly slow for literally anything, even on ridiculously powerful machines.
What are some of the reasons that teams use conda (and related tools) today? As a machine learning scientist, I used conda exclusively in the mid-2010s because it was the only framework that could reliably manage Python libraries like NumPy, PyTorch, and so on, that have complex binary dependencies. Today, though, pip install works fine for those packages. What am I missing?
For me personally, I prefer conda because it gives me dependency resolution (mamba), virtual environments, and a package repository (conda-forge), all from one base miniconda installation. And for all of my use cases, all of those just work. Dependency solving used to be painfully slow; mamba solved that. Packages used to be way behind the latest; setting conda-forge as my default solved that.
After fiddling with different solutions for years and having to start fresh with each new Python install, I've been using nothing but miniconda for years and it just works.
> Today, though, pip install works fine for those packages.
pip install works, but pip's dependency management doesn't seem to (for PyTorch, specifically), which is why projects that offer pip + requirements.txt as one of their installation methods will often have separate PyTorch installation instructions for that method; if the same project supports conda installation, it will be a one-stop-shop install that way.
One reason to choose one over the other is the dependencies they’re bundled with. Take numpy. With PyPI, it’s bundled with OpenBLAS, and with conda, it’s bundled with Intel MKL, which can be faster. See https://numpy.org/install/#
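If you want to check which backend a given environment actually has, a small sketch:

```python
import numpy as np

# prints the BLAS/LAPACK libraries this numpy build is linked against
# (e.g. OpenBLAS for PyPI wheels, MKL for conda's defaults channel)
np.show_config()
```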
From my testing in rye it’s significantly faster in day-to-day use. There are numbers in the blog post obviously, but it’s the feeling you get using it locally that makes it much more fun to use.
I did some basic testing today and uv was around 2x faster than pip for a clean venv and cold cache on a real, decent sized project. With warm uv cache it was incredibly fast, under 10 sec.
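A rough way to reproduce that kind of comparison yourself (a sketch, assuming a project with a pinned requirements.txt; cold vs. warm depends on whether the uv cache is already populated):

```sh
python -m venv .venv && time .venv/bin/pip install -r requirements.txt
rm -rf .venv
uv venv && time uv pip install -r requirements.txt   # uv picks up ./.venv
```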
So uv uses pubgrub-rs, a Rust implementation of the PubGrub version-solving algorithm originally written for the Dart language. I suppose I shouldn't be surprised, but I always loved Pub, the Dart/Flutter packaging tool. It's the only thing out there that comes close to Cargo that I know of. Fun to see these chains of inspiration reach across languages.
Sell build-related services to companies. Imagine GitHub Actions, but it's cleanly built into your Python tooling in some reasonable way, so it's just the natural thing to use. I think it's pretty straightforward, although we'll see whether it works out for them.
npm, Inc. was an unsuccessful business that got acquired for a pittance because MS can afford to throw away a few million dollars in case it turns out that having control over npm is useful. The team working on it isn't very large but I'd still be surprised if it's actually operating at a profit.
NPM did not go well, and selling to Microsoft happened after the team there fell apart. In my view some of that is leadership issues, and some of that is pressure from a struggling business.
I read their about page and it seems they want to make Python dev more productive. Maybe they have a lot of projects using it and are tired of the tooling/packaging BS. I could definitely see someone making billions allocate under 1% of it towards fixing that.
Improving Python is especially cheap compared to the productivity that could be unleashed. Surprised it isn't done more often. Only Microsoft has shown significant interest, which is a shame. Perhaps that's changing.
A lot of uncalled-for pessimism here in the comments. If Python folks reading haven’t used ruff yet, I can highly recommend it. The Astral folks have proven themselves to me already. Looking forward to more and more of their Rust-built Python tooling.
You will notice that every package management solution from all your favourite languages will have its own set of tradeoffs.
And quite a few where the tradeoffs are minor enough to be well worth some sanity & consistency.
When it comes to package managers, as you noted, the situation is even more ironic, as nicely depicted in https://xkcd.com/1987/.
By the way, the creator of mamba[0] started his own company at https://prefix.dev/
They want to essentially leverage the conda(-forge) infrastructure to build a new cross-platform, cross-language, cargo-like package manager: pixi
[0] https://github.com/mamba-org/mamba
https://conda.org/blog/2023-11-06-conda-23-10-0-release/
Using only "Python's native tools" like pip and venv simply works so well nowadays that I wonder about the purpose of many tools like poetry etc.
npm -> Yarn was life changing performance-wise.
I wonder what pip -> uv is.
I’m loving the pace of release with this crop of speed-obsessed projects and I cannot wait for Astral to tackle typing.
Python is used by loads of scientists, academics, and other non software engineers. Those people need an easy way to use python.