heisig · 9 months ago
I recently switched to uv, and I cannot praise it enough. With uv, the Python ecosystem finally feels mature and polished rather than like a collection of brittle hacks.

Kudos to the uv developers for creating such an amazing piece of software!

matsemann · 9 months ago
Yeah, I switched to writing Python professionally ~4 years ago, and have been low-key hating the ecosystem. Coming from a Java and JavaScript background, it was mostly npm/mvn install and it "just works". With Python, there's always someone being onboarded who can't get it to work. So many small issues: you have to have the correct version per project, then get the venv running. And then installing needs to build stuff because there's no wheel, so you need to set up a complete C++ and Rust toolchain etc., just to pull a small project and run it.

uv doesn't solve all this, but it's reduced the amount of ways things can go wrong by a lot. And it being fast means that the feedback-loop is much quicker.

gunalx · 9 months ago
I cannot share the same experiences. mvn is a buggy mess, randomly forgetting dependencies and constantly needing a full clean to not die on itself. npm and the entire JS ecosystem feel so immature, with constant breaking changes and circular dependency hell when trying to upgrade stuff.
notpushkin · 9 months ago
Python has been mostly working okay for me since I switched to Poetry. ("Mostly" because I think I ran into some weird issue once, but I've tried to recall what it was and I just can't.)

uv felt a bit immature at the time, but it sounds like it's way better now. I really want to try it out... but Poetry just works, so I don't really have an incentive to switch just yet. (Though I've switched from FlakeHeaven or something to Ruff, and the difference was heaven and hell! Pun intended.)

Wulfheart · 9 months ago
Ok, you convinced me to give it a try. Tbh, I am a casual user of python and I don't want to touch it unless I have a damn good reason to use it.
mlnj · 9 months ago
You do not need a damn good reason for this. Just try it out on a simple hello world. Then try it out on, for example, a project already using Poetry:

uv init

uv sync

and you're done

I'd say if you do not run into the pitfalls of a large python codebase with hundreds of dependencies, you'll not get the bigger argument people are talking about.

jorvi · 9 months ago
> I am a casual user of python and I don't want to touch it unless I have a damn good reason to use it.

I... what? Python is a beautiful way to evolve beyond the troglodyte world of sh for system scripts. You are seriously missing out by being so adamantly against it.

ffsm8 · 9 months ago
Now, if I hadn't read literally the same message for Pipenv/Pipfile and poetry before, too...

Python is going through package managers like JS goes through trends: classes-everywhere, hooks, signals, etc.

OutOfHere · 9 months ago
There have been incremental, evolutionary improvements brought forth by each of the packages you named; uv just goes a lot further than its predecessors. Others deserve an honorable mention too, e.g. pip-tools, pdm, hatch. It's going to be very hard for anything to top uv.
amelius · 9 months ago
But how does it work with components that require libraries written in C?

And what if there are no binaries yet for my architecture, will it compile them, including all the dependencies written in C?

matrss · 9 months ago
IMO if you require libraries in other languages, then a pure Python package manager like uv, pip, poetry, whatever, is simply the wrong tool for the job. There is _some_ support for this through wheels, and I'd expect uv to support them just as much as pip does, but they feel like a hack to me.

Instead there is pixi, which is similar in concept to uv but for the conda-forge packaging ecosystem. Nix and guix are also language-agnostic package managers that can do the job.

dagw · 9 months ago
uv is not (yet) a build system and does not get involved with compiling code, but it easily lets you plug in any build system you want. So it will let you keep using whatever system you currently use for building your C libraries. For example, I use scikit-build-core to build all of my libraries' C and C++ components with CMake, and it works fine with uv.
sirfz · 9 months ago
Yes, it'll build any dependency that has no binary wheels (or for which you explicitly pass --no-binary), as long as said package supports it (i.e. declares a build backend via setup.py/pyproject.toml). Basically, just like pip would.
freeamz · 9 months ago
the_mitsuhiko · 9 months ago
Unlike uv this tool is unlikely to solve problems for the average Python user and most likely will create new ones.
IshKebab · 9 months ago
Not a surprise. I said it before and I'll say it again, all the competing projects should just shut up shop for the good of Python. uv is so much better it's like pushing penny farthings after the safety bike has been invented.
nikisweeting · 9 months ago
That's rough for all the creators of poetry, pdm, pipenv, etc. to hear. They put in a ton of great work over the last decade, but I fear you may be right.
alwyn · 9 months ago
I really quite like pdm! I can see why maybe poetry, but especially pipenv, might be replaced with uv, but what's the value of uv over pdm beyond performance? It ticks all my boxes otherwise.
slightwinder · 9 months ago
They served their purpose for a decade, so they can be happy that they did their part to pave the way for a good successor. uv will some day find its own successor too; this is how software lives. Celebrate the life, don't cry over how it ends.
rendaw · 9 months ago
Oh yeah, pipenv which was a shoddy mess that used personal connections and reputation to get promoted on the python website and poetry where the developer did a good job dismissing requests to support common use cases (like overriding dependencies).
qwertox · 9 months ago
I've read so much positive feedback about uv, that I'd really like to use it, but I'm unsure if it fits my needs.

I was heavily invested into virtualenv until I had to upgrade OS versions, which upgraded the Python versions and therefore broke the venvs.

I tried to solve this by using pyenv, but the need to recompile Python on every patch release wasn't something I would accept, especially on boards like Raspberry Pis.

Then I tried miniconda which I initially only liked because of the precompiled Python binaries, and ultimately ended up using pyenv-managed miniforge so that I could run multiple "instances" of miniforge and therefore upgrade miniforge gradually.

Pyenv also has a plugin that allows adding suffixes to environments, so I can have multiple miniforges of the same version in different locations, like miniforge-home and miniforge-media: -home keeps all files in the home dir (which is contained in a VM image), while -media keeps them on a mounted NVMe, which is where I put projects with huge dependencies like CUDA so they don't clutter home.

It works really great, Jupyter and vscode can use them as kernels/interpreters, and it is fully independent of the OS's Python, so that OS upgrades (22.04 -> 24.04) are no longer an issue.

But I'm reading about all these benefits of uv and wish I could use it, but somehow my setup seems to have tied my hands. I think I can't use uv in my projects.

Any recommendations?

Edit: Many of my projects share the same environment, this is absolutely normal for me. I only create a new environment if I know that it will be so complex that it might break things in existing environments.

the_mitsuhiko · 9 months ago
I’m a bit confused why uv is not an option for you. You don’t need to compile Python, it manages virtualenvs for you, you can use them with Jupyter and vscode. What are you missing?
qwertox · 9 months ago
So the only difference is that Conda also isolates "system" libraries (like libcublasLt.so), or does uv also do this?

It's not that uv is not an option for me, I made this move to miniforge before uv was on my radar because it wasn't popular, but I'm still at a point where I'm not sure if uv can do what I need.

be7a · 9 months ago
Have you checked out https://github.com/prefix-dev/pixi? It's built by the folks who developed Mamba (a faster Conda implementation). It supports PyPI dependencies using UV, offers first-class support for multi-envs and lockfiles, and can be used to manage other system dependencies like CUDA. Their CLI also embraces much of the UX of UV and other modern dependency management tools in general.
datadeft · 9 months ago
I moved to uv a few months back and never looked back. I use it with venv and it works very well. There is a new way of handling environments with uv:

- uv init new-py-env

- cd new-py-env

- uv add jupyter

- uv build

These execute super fast. Not sure if this could help your situation, but it's worth being aware of them.

secondcoming · 9 months ago
The python ecosystem has become a disaster. Even reading your post gave me a headache.
mihaic · 9 months ago
I keep reading praise about uv, and every single time I never really understand what problems it addresses.

I've got a couple of quite big Django projects for which I've used venv for years, and not once have I had any significant issue with it. Speed could at times have been better, and I would have liked a full dependency lock file, but that never caused me issues.

The only thing that comes to mind is those random fails to build of C/C++ dependencies. Does uv address this? I've always seen people rave about other benefits.

chippiewill · 9 months ago
The benefit that uv adds is it's a one-stop-shop that's also wicked fast.

If you use venv then you have extra steps because you have to explicitly create the venv, then explicitly install the deps there with pip. If your project is designed for a specific python version then developers have to manage that separately (usually pyenv these days).

For people building apps, uv replaces venv, pip and pyenv, while being way faster at all three (you can usually rebuild the virtualenv and install the dependencies from scratch in under a second, because uv is faster at creating a virtualenv than venv and is very quick at relinking dependencies from its package cache).

hansihe · 9 months ago
What makes it so great for me is the effortlessness.

I often use Python for quick one-off scripts. With uv I can just do `uv init`, `uv add` to add dependencies, and `uv run` whatever script I am working on. I'm up and running in under a minute, and I feel confident that the setup isn't going to randomly break in a few weeks.

With most other solutions I have tried in the Python ecosystem, it always seemed significantly more brittle. It felt more like a collection of hacks than anything else.

aneidon · 9 months ago
You can even inline the dependencies:

https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

That plus this: https://news.ycombinator.com/item?id=42855258

Makes it pretty seamless for one-off scripts.
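
For reference, the inline metadata that guide describes is PEP 723's comment block at the top of the script. A minimal sketch (stdlib-only body here so it also runs without uv; a real script would list PyPI names under `dependencies`):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []   # e.g. ["requests"]; uv installs these on `uv run script.py`
# ///
import platform

# uv parses the comment header above (PEP 723), builds a throwaway
# environment matching it, and then executes the script in that env.
message = f"Running on Python {platform.python_version()}"
print(message)
```

So the script carries its own dependency declaration, and `uv run script.py` is all anyone needs to execute it.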

ashikns · 9 months ago
I'm in the same boat. Sure it's nice and better, but I haven't felt so much annoyance with the python ecosystem that I desperately need something better. I use VS Code and it takes care of venv automatically, so I am biased by that.
brylie · 9 months ago
As an aside, I can't praise the Wagtail CMS highly enough. It sets a high bar for usability and accessibility of the auto-generated content management UI.

The developer experience is top-notch, with excellent documentation and many common concerns already handled by Wagtail or Django. A significant amount of Wagtail-specific code is declarative, essentially describing the data model, relationships, and UI fields. The parts that you don't need stay out of the way. It's also agnostic of the type of front-end you want, with full and automatic support for headless mode with a JavaScript client, traditional Django templates for SSR, or a dynamic approach like HTMX.

Kudos to the Wagtail team!

ThibWeb · 9 months ago
ty! We have no plans to rewrite Wagtail in Rust, but I hope there are ways in which we can make the developer experience better, particularly around dependency management
ZuLuuuuuu · 9 months ago
PyCharm also added uv support in their latest versions.

We recently switched to PDM at our company because it performed very well in our tests of different package/dependency managers. Now I am rethinking whether we should switch to uv while PDM usage is still not very widespread in our company. But PDM works very well, so I am not sure whether to switch.

ThibWeb · 9 months ago
With the caveat I only have the package installers usage data for Wagtail downloads – pdm usage has fallen off a cliff, from 0.2% of downloads in January 2024, to 0.01% in January 2025. Roughly matches the uptake of uv.

Doesn’t make pdm bad in itself but that means there’ll be fewer pdm users around to report bugs, potentially fewer contributors to it too, fewer resources, etc.

ZuLuuuuuu · 9 months ago
Indeed, on one hand PDM works great, but on the other hand we wouldn't want to choose a package manager which might not be maintained anymore after a few years because there are just not many users of it.
chippiewill · 9 months ago
Back when PDM was still pushing __pypackages__ for standardisation, I think PDM made sense, but honestly I don't think it adds anything over uv, and it's just going to be slower for the most part.
BerislavLopac · 9 months ago
As much as I am glad that it looks like one solution is being more and more accepted as the golden standard, I'm a little disappointed that PDM [0] -- which has been offering pretty much everything uv does for quite some time now -- has been completely overlooked. :(

[0] https://pdm-project.org

porridgeraisin · 9 months ago
pdm actually supports using uv as the resolver

https://pdm-project.org/en/latest/usage/uv/

TOMDM · 9 months ago
For the uninitiated what is the benefit of UV over pip?

I've been working with pip for so long now that I barely notice it unless something goes very wrong.

NeutralForest · 9 months ago
- uv is aware of your dependencies: you can add/remove development dependencies, create groups of development dependencies (test, lint, dev, etc.), and add or remove those and only those at will. You can add dependencies and optional dependencies for a project as well, think my_app[cli,standard]. You don't need a different requirements.txt for each case, nor do you need to remove things by hand as you would with pip, which doesn't remove transitive deps when you uninstall a package. As a result, you can remove {conda,poetry,...} from your workflows.

- uv can install python and a virtualenv for you. Any command you run with `uv run` from the root of a repo will be aware of its environment, you don't even need to activate a virtualenv anymore. This replaces {pyenv, pyenv-virtualenv, virtualenvwrapper,...}.

- uv follows the PEPs for project config (dependencies, optional dependencies, tool configs) in the pyproject.toml, so in case uv dies, it's possible to migrate away, since the features are defined in the PEPs. Which is not the case for, say, poetry.

- uv has a lock file and it's possible to make deps platform specific (Windows, Linux, MacOS, etc). This is in compliance with a PEP but not supported by all tools.

- uv supports custom indexes for packages so you can prefer a certain index, for example your company package index or pytorch's own index (for ML work).

- it's very fast, which makes local dev seamless and is really helpful in CI/CD, where you might set up and tear down Python envs a lot.

Also, the team is responsive on Github so it's easy to get help.
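
As a sketch of what those features look like in `pyproject.toml` (project name, extras, and index URL are all hypothetical):

```toml
[project]
name = "my-app"
requires-python = ">=3.12"
dependencies = ["requests"]

[project.optional-dependencies]   # extras, installable as my-app[cli]
cli = ["click"]

[dependency-groups]               # dev-only groups (PEP 735), never published
test = ["pytest"]
lint = ["ruff"]

[[tool.uv.index]]                 # prefer a custom index, e.g. a company mirror
name = "internal"
url = "https://pypi.example.com/simple"
```

Commands like `uv add --group test pytest` and `uv sync --group lint` then operate on those groups directly.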

adrian17 · 9 months ago
Does this also replace, or work well with tox? We currently use it to run basic CI/local workflows (`tox -e lint` for all linters, `tox -e py310`, `tox -e py312` to run tests suites on chosen interpreters' environments), and to set up a local environment with package installed in-place (so that we can run `myprogram -arg1 -arg2` as if it was installed via `pip`, but still have it be editable by directly editing the repo).

With how much the ecosystem is moving, I don't know whether the way we're doing it is unusual (Django and some other big projects still have a tox.ini), obsolete (I can't find how uv obsoletes this), or perfectly fine and I just can't find how to replace pip with uv for this use case.

TOMDM · 9 months ago
Honestly this sounds more likely to replace some workflows I historically would have done with Docker.

The pain of creating a python environment that is durable across different deployments had me going for the nuclear option with full containerisation.

mbeex · 9 months ago
...

- uv tool replaces pipx etc.

- uv tree replaces pipdeptree (including the 'inverse' mode, via --invert)

- ...

rschiavone · 9 months ago
Not only is it faster, it also provides a lock file, `uvx tool_name` just like `npx`, and a comprehensive set of tools to manage your Python version, your venv and your project.

You don't need `pyenv`, `poetry` and `pipx` anymore, `uv` does all of that for you.

shellac · 9 months ago
> over pip

It's a much more complete tool than pip. If you've used poetry, or (in other languages) cargo, bundler, maven, then it's like that (and faster than poetry).

If you haven't, in addition to installing dependencies it will manage and lock their versions (no requirements.txt, and much more robust), look after the environment (no venv step), hold your hand creating projects, and probably other things.

Edit to add: the one thing it won't do is replace conda et al, nor is it intended to.

atoav · 9 months ago
The problems start as soon as your scripts should run on more than your own computer.

If you pip install something, you install it into the system Python (the binary at sys.executable). This can break systems if the wrong combination of dependencies comes together, which is why you should never install things via pip for other people unless you've asked them first.

Now how else would you install them? There is a thing called virtual environments, which lets you install pip dependencies in such a way that they only exist within the context of that virtual environment. This is what you should do when you distribute Python programs.

Now the problem is: how do you ensure that the install into the virtual environment uses specific versions? What happens when one library depends on package A at version 1.0 and another library depends on it at version 2.0? And what happens if you deploy that to an old Debian with an older Python version? Before uv, I had to spend literal days resolving such conflicts.

uv solves most of these problems in one unified place, is extremely performant, just works and when it does not, it tells you precisely why.
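
The manual dance described above, next to the uv equivalent (a sketch, assuming a pyproject.toml-based project):

```shell
# Traditional: create the venv by hand, activate it, install into it.
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt

# uv: one command creates the venv if needed and installs the exact
# versions pinned in uv.lock (resolving and writing the lock first
# if it doesn't exist yet).
uv sync
```

Because the lock file pins every transitive dependency, the same `uv sync` reproduces the same environment on that old Debian box too, as long as a compatible Python is available (uv can download one).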

BiteCode_dev · 9 months ago
The whole explanation is here: https://www.bitecode.dev/p/a-year-of-uv-pros-cons-and-should

The tl;dr is that it has a lot fewer failure modes.

montebicyclelo · 9 months ago
It brings way more to the table than just being fast, like people are commenting. E.g. it manages Python for your projects: if you say you want Python 3.12 in your project and then run `uv run python my_script.py`, it will fetch and run the version of Python you specified, which pip can't do. It also creates lock files, so you know the exact set of Python package dependencies that worked, while you specify them more loosely. Plus a bunch of other stuff...
globular-toast · 9 months ago
The only advantage over pip is it's faster. But the downside is it's not written in Python.

The real point of uv is to be more than pip, though. It can manage projects: basically CLI commands that edit your `pyproject.toml`, update a lockfile, and update your venv all in one go. Unlike earlier tools it implements a pretty natural workflow on top of existing standards where possible, but for some things there are no standards, the most obvious being lockfiles. Earlier tools used requirements.txt for this, which was quite lacking. uv's lockfile is cross-platform, although it admittedly produces noisier diffs than requirements.txt, which is a shame.

chippiewill · 9 months ago
As a straight pip replacement, yeah, it's mostly just faster. Although it does have a few breaking changes that make it more secure (it has a more predictable way of resolving packages across multiple indexes, which reduces the risk of package squatting).
jonatron · 9 months ago
Faster.
maratc · 9 months ago
Ok, and what's the advantage for the people who don't have "my pip is too slow" problem?
prashnts · 9 months ago
I can't stress enough how fast it is on resource-constrained environments like a Pi Zero.

I intend to use the system Python there, but previously Poetry would simply crash the whole Pi while installing itself.