epage · 2 years ago
Congrats!

> Similarly, uv does not yet generate a platform-agnostic lockfile. This matches pip-tools, but differs from Poetry and PDM, making uv a better fit for projects built around the pip and pip-tools workflows.

Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Without being platform agnostic, to me this is dead-on-arrival and unable to meet the "Cargo for Python" aim.

> uv supports alternate resolution strategies. By default, uv follows the standard Python dependency resolution strategy of preferring the latest compatible version of each package. But by passing --resolution=lowest, library authors can test their packages against the lowest-compatible version of their dependencies. (This is similar to Go's Minimal version selection.)

> uv allows for resolutions against arbitrary target Python versions. While pip and pip-tools always resolve against the currently-installed Python version (generating, e.g., a Python 3.12-compatible resolution when running under Python 3.12), uv accepts a --python-version parameter, enabling you to generate, e.g., Python 3.7-compatible resolutions even when running under newer versions.

This is great to see though!

I can understand it being a flag on these lower level, directly invoked dependency resolution operations.
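
For example, I'd expect the two flags to compose like this (a sketch based on the quoted docs; I haven't checked the exact CLI syntax):

    # resolve against the lowest compatible versions, targeting Python 3.7
    uv pip compile requirements.in --resolution=lowest --python-version=3.7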

While you aren't onto the higher level operations yet, I think it'd be useful to see if there is any cross-ecosystem learning we can do for my MSRV RFC: https://github.com/rust-lang/rfcs/pull/3537

How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today but we're slowly improving.

charliermarsh · 2 years ago
Thanks Ed! Your work as always is a big source of inspiration.

> Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Yes, we absolutely do. We don't do this today; the initial scope is intentionally limited. But in the next phase of the project, we want to extend to multi-platform and multi-version resolution.

> How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today but we're slowly improving.

This is something we talked with Jacob about quite a bit. Turns out (as you know) it's a very hard problem. For the initial release, we added a constraint: our default behavior is that if you want to use a pre-release, you _have_ to specify the package as a first-party dependency and use a pre-release marker in the version specifier. (We also support globally enabling and disabling pre-releases.) So, we basically don't support "transitive" pre-releases right now -- but we give you a dedicated error message if your resolution fails for that reason.
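
Concretely, the default behavior means something like this (hypothetical package name; the pre-release marker in the version specifier is what opts that package in):

    # requirements.in -- a first-party dependency opting into pre-releases
    # via a pre-release marker in its specifier (hypothetical name)
    some-package>=2.0.0rc1

A transitive dependency that only has pre-release versions available fails to resolve with the dedicated error instead.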

thaliaarchi · 2 years ago
Any plans to tackle the Python version installation side of things and make it as seamless as rustup has? I've previously used `pyenv install` for this, but it would be nice to fold it into one tool.
woodruffw · 2 years ago
> How are you handling pre-releases in you resolution? Unsure how much of that is specified in PEPs.

The living version of PEP 440 has a bit on how pre-releases are handled[1]. The basic version is that the installer shouldn't select them at all, unless the user explicitly indicates that they want a pre-release. Once opted into, they're ordered by their phase (alpha, beta, rc) and then by pre-release increment (e.g. `1.0b1` > `1.0a2`).

[1]: https://packaging.python.org/en/latest/specifications/versio...
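
That ordering is easy to sanity-check with the `packaging` library:

    >>> from packaging.version import Version
    >>> # phases order alpha < beta < rc < final
    >>> Version("1.0a2") < Version("1.0b1") < Version("1.0rc1") < Version("1.0")
    True
    >>> # within a phase, the numeric increment breaks ties
    >>> Version("1.0b1") < Version("1.0b2")
    True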

techwizrd · 2 years ago
PyTorch doesn't work well with platform-agnostic lockfiles. It's a constant source of issues for me when using Poetry.
jvolkman · 2 years ago
The vast majority of pypi packages are not PyTorch, however.
OJFord · 2 years ago
That is great. Fuzzing would be cool too - just completely randomise the versions within the claimed compatibility constraints.
endgame · 2 years ago
How does a language ecosystem that bakes "there should be one-- and preferably only one --obvious way to do it" into the interpreter* as a statement of values end up with such a convoluted packaging story?

* Try running `python -c 'import this' | sed -n 15p`

PurpleRamen · 2 years ago
Simple: Python is old. It predates most shiny languages by decades, and so does its packaging story. I mean, most modern packaging tools can shine today because they could learn from the imperfections of their forerunners. Additionally, Python has a very wide field to cover, far wider than most other languages, which makes things more complicated and thus people more open to experiments, which leads to so many different solutions.
lolinder · 2 years ago
Python is old, but pip itself had its 1.0 release 8 months after Ruby's Bundler did and 6 years after Apache Maven, both of which are substantially better package managers than pip and both of which are also for 90s-era programming languages, not the new shiny ones.

Age is part of the problem, but it's not the whole problem. Ruby manages to have a package manager that everyone loves, and Java has settled on a duopoly. Meanwhile Python has someone making a serious attempt at a new package manager every few years.

yen223 · 2 years ago
Because packaging is a very complex problem, and it's rare that any one packaging solution can get everything right in the first try.

You will notice that every package management solution from all your favourite languages will have its own set of tradeoffs.

lolinder · 2 years ago
Packaging is hard but it's not hard enough to wholly explain pip.

Lock files alone are a proven piece of technology that pretty much just works across all modern package managers, and yet the best that pip can recommend to date is `pip freeze > requirements.txt`, which is strictly inferior to what is available in other package managers because there's no hash verification and no distinction between transitive and explicit dependencies.

That pip still doesn't have an answer for lock files is a sign of a deep problem in the Python package management world, not the natural result of packaging being hard.
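
For contrast, pip-tools already layers both onto pip: explicit dependencies live in `requirements.in`, the compiled output annotates where each transitive one comes from, and hashes are one flag away:

    # compile explicit deps into a fully pinned, hash-verified lockfile
    pip-compile --generate-hashes requirements.in
    # pip then refuses any artifact whose hash doesn't match
    pip install --require-hashes -r requirements.txt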

lucideer · 2 years ago
This is absolutely true, but I haven't seen any language ecosystems that have gotten things wrong as often as Python.

And quite a few where the tradeoffs are minor enough to be well worth some sanity & consistency.

pjc50 · 2 years ago
Python refuses to have packages (folders don't count) or accept build steps. Everything follows from there, making it much harder to have a good solution.
stared · 2 years ago
Well, there are languages better at incorporating the Zen of Python than Python itself, vide (full disclosure: my blog post) https://p.migdal.pl/blog/2020/03/types-test-typescript/.

When it comes to package managers, as you noted, the situation is even more ironic, as nicely depicted in https://xkcd.com/1987/.

mixmastamyk · 2 years ago
One way to do things becomes impossible to maintain after thirty years unless one wants to make large, breaking changes. After Python 3:

    >>> from __future__ import braces
      File "<stdin>", line 1
    SyntaxError: not a chance

surfingdino · 2 years ago
What's your view on how Golang deals with this problem? Serious question.
ddanieltan · 2 years ago
Any plans to adopt Rye's approach of using standalone Python builds to cover installing different Python versions?

I feel uv should provide a way to install different Python versions to truly be an end-to-end tool. The current approach of searching for existing virtualenvs or conda envs helps, but I was hoping to completely remove the need for another package/dependency manager.

(Taken from the docs)

  If a --python-version is provided to pip compile (e.g., --python-version=3.7), uv will search for a Python interpreter matching that version in the following order:

  - An activated virtual environment based on the VIRTUAL_ENV environment variable.

  - An activated Conda environment based on the CONDA_PREFIX environment variable.

  - A virtual environment at .venv in the current directory, or in the nearest parent directory.

  - The Python interpreter available as, e.g., python3.7 on macOS and Linux. On Windows, uv will use the same mechanism as py --list-paths to discover all available Python interpreters, and will select the first interpreter matching the requested version.

  - The Python interpreter available as python3 on macOS and Linux, or python.exe on Windows.

nindalf · 2 years ago
Feel like I called this 11 days ago - https://news.ycombinator.com/item?id=39251014

> If I had to guess, that's the path that the Astral team would take as well - expand ruff's capabilities so it can do everything a Python developer needs. So the vision that Armin is describing here might be achieved by ruff eventually. They'd have the advantage of not being a single-person maintenance team, but the disadvantage of needing to show a return to their investors.

eigenvalue · 2 years ago
Looks awesome. I find that pip is usually pretty fast for me, and when it's not, it's mostly because it has to download so much data or wait for native libraries to compile in the background (or anything involving CUDA, which always seems to take forever). What really needs some help with speed is conda, which is just so absurdly slow for literally anything, even on ridiculously powerful machines.
rogue7 · 2 years ago
For conda there is mamba [0], a drop-in replacement that's really fast.

By the way, the creator of mamba started his own company at https://prefix.dev/

They want to essentially leverage the conda(-forge) infrastructure to build a new cross-platform, cross-language, cargo-like package manager: pixi

[0] https://github.com/mamba-org/mamba

maxnoe · 2 years ago
Since last year, conda itself is actually using the mamba solver:

https://conda.org/blog/2023-11-06-conda-23-10-0-release/
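
On conda versions just before that release, you could opt in manually:

    # opt into the libmamba solver (the default as of conda 23.10)
    conda config --set solver libmamba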

daniel_grady · 2 years ago
What are some of the reasons that teams use conda (and related tools) today? As a machine learning scientist, I used conda exclusively in the mid-2010s because it was the only framework that could reliably manage Python libraries like NumPy, PyTorch, and so on, that have complex binary dependencies. Today, though, pip install works fine for those packages. What am I missing?
blactuary · 2 years ago
For me personally, I prefer conda because it gives me dependency resolution (mamba), virtual environments, and a package repository (conda-forge), all from one base miniconda installation. And for all of my use cases, all of those just work. Dependency solving used to be painfully slow; mamba solved that. Packages used to be way behind the latest; setting conda-forge as my default solved that.

After fiddling with different solutions for years and having to start fresh with a new Python install, I've been using nothing but miniconda for years and it just works.
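
For reference, making conda-forge the default is two one-liners:

    # prefer conda-forge over the default channel
    conda config --add channels conda-forge
    conda config --set channel_priority strict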

Ringz · 2 years ago
Unfortunately, far too often: tradition.

Using only Python's native tools like pip and venv simply works so well nowadays that I wonder about the purpose of many tools like Poetry, etc.
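
The whole native workflow is just:

    # standard-library venv plus pip, nothing else to install
    python -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt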

dragonwriter · 2 years ago
> Today, though, pip install works fine for those packages.

pip install works, but pip's dependency management doesn't seem to (for PyTorch, specifically), which is why projects that offer pip + requirements.txt as one of their installation methods will often have separate PyTorch installation instructions for that method, though if the same project supports conda installation, it will be a one-stop-shop installation that way.
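
e.g., PyTorch's install selector points pip at their own index rather than PyPI (roughly; the exact index URL depends on platform and CUDA version):

    # a separate install step outside the normal requirements.txt flow
    pip install torch --index-url https://download.pytorch.org/whl/cu118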

tehnub · 2 years ago
One reason to choose one over the other is the dependencies they’re bundled with. Take numpy. With PyPI, it’s bundled with OpenBLAS, and with conda, it’s bundled with Intel MKL, which can be faster. See https://numpy.org/install/#
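
You can check which BLAS a given install was built against:

    >>> import numpy as np
    >>> np.show_config()  # prints the BLAS/LAPACK build configuration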
paulddraper · 2 years ago
Yeah, I'm curious how much uv is actually faster.

npm -> Yarn was life-changing performance-wise.

I wonder what pip -> uv is.

laborcontract · 2 years ago
Bun, ruff/uv, polars... all have been major QoL improvements.

I'm loving the pace of release with this crop of speed-obsessed projects and I cannot wait for Astral to tackle typing.

the_mitsuhiko · 2 years ago
From my testing in Rye, it's significantly faster day to day. There are numbers in the blog post, obviously, but it's the feeling you get using it locally that makes it much more fun to use.
driverdan · 2 years ago
I did some basic testing today and uv was around 2x faster than pip for a clean venv and cold cache on a real, decent sized project. With warm uv cache it was incredibly fast, under 10 sec.
eigenvalue · 2 years ago
npm seems to have gotten a lot faster lately. All that competition from yarn and now bun seems to have pushed them to focus on optimization.
ZeroCool2u · 2 years ago
So uv uses pubgrub-rs, a Rust implementation of the PubGrub version-solving algorithm originally written for the Dart language. I suppose I shouldn't be surprised, but I always loved Pub, the Dart/Flutter packaging tool. It's the only thing out there that comes close to Cargo that I know of. Fun to see these chains of inspiration reach across languages.
hprotagonist · 2 years ago
A VC-backed pip-and-more doesn't make sense to me. It's 2024: what's the revenue model when the free money printer's on the fritz?
dj_gitmo · 2 years ago
That was one of my first questions, but Anaconda exists https://www.anaconda.com/download/

Python is used by loads of scientists, academics, and other non-software-engineers. Those people need an easy way to use Python.

nateglims · 2 years ago
I knew they sold something, but I am amazed to learn they have 300 employees.
lacker · 2 years ago
Sell build-related services to companies. Imagine GitHub Actions, but it's cleanly built into your Python tooling in some reasonable way, so it's just the natural thing to use. I think it's pretty straightforward, although we'll see whether it works out for them.
pininja · 2 years ago
NPM comes to mind. I’m imagining private package management and team support https://www.npmjs.com/products
plorkyeran · 2 years ago
npm, Inc. was an unsuccessful business that got acquired for a pittance because MS can afford to throw away a few million dollars in case it turns out that having control over npm is useful. The team working on it isn't very large but I'd still be surprised if it's actually operating at a profit.
jitl · 2 years ago
NPM did not go well, and selling to Microsoft happened after the team there fell apart. In my view some of that is leadership issues, and some of that is pressure from a struggling business.
Kwpolska · 2 years ago
NPM’s got Microsoft/GitHub behind it. I doubt those features bring in any serious money, given the abundance of free alternatives.
LtWorf · 2 years ago
I'm curious to see what the Pydantic startup will do.
mixmastamyk · 2 years ago
I read their about page and it seems they want to make Python dev more productive. Maybe they have a lot of projects using it and are tired of the tooling/packaging BS. I could definitely see someone making billions allocate under 1% of it towards fixing that.

Improving Python is especially cheap compared to the productivity that could be unleashed. Surprised it isn't done more often. Only Microsoft has shown significant interest, which is a shame. Perhaps that's changing.

capital_guy · 2 years ago
A lot of uncalled-for pessimism here in the comments. If Python folks reading haven't used ruff yet, I can highly recommend it. The Astral folks have proven themselves to me already. Looking forward to more and more of their Rust-built Python tooling.