Readit News
quickslowdown · a year ago
I highly, highly recommend uv. It resolves & installs dependencies incredibly fast, and the CLI is very intuitive once you've memorized a couple of commands. It handles monorepos well with its "workspaces" concept, it can replace pipx with "uv tool install", it handles building & publishing, and the Docker image is great: you just add a FROM line to the top and copy the bin from /uv.

I've used 'em all, pip + virtualenv, conda (and all its variants), Poetry, PDM (my personal favorite before switching to uv). Uv handles everything I need in a way that makes it so I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.

I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment to the top of the script and instructs uv which packages to install when you run it. The first time you run "uv run <script_name.py>," uv installs everything you need and executes the script. Subsequent executions use the cached dependencies so it starts immediately.
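
Roughly, the flow looks like this (the script name and packages here are placeholders, not from any real project):

  uv init --script example.py             # adds the inline metadata block
  uv add requests rich --script example.py  # records the deps in that block
  uv run example.py                       # builds a cached env and runs it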

If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install & test, I really recommend giving it a try on your next script or pet project!

actinium226 · a year ago
The script thing is great. By the way, those 'oddly formatted' comments at the top are not a uv thing; they're a new official Python metadata format, specifically designed to make it possible for third-party tools like uv to figure out and install the relevant packages.

And in case it wasn't clear to readers of your comment, uv run script.py creates an ephemeral venv and runs your script in that, so you don't pollute your system env or whatever env you happen to be in.

fluidcruft · a year ago
I generally agree, but one thing I find very frustrating (i.e. have not figured out yet) is how to deal with extras well, particularly with PyTorch. Some of my machines have a GPU, some don't, and things like "uv add" end up uninstalling everything and installing the opposite variant, forcing a resync with the appropriate --extra flag. The examples in the docs do things like CPU on Windows and GPU on Linux, but all my boxes are Linux. There has to be a way to tell it "hey, I want --extra gpu, always, on this box." But I haven't figured it out yet.
shawnz · a year ago
Getting the right version of PyTorch installed to have the correct kind of acceleration on each different platform you support has been a long-standing headache across many Python dependency management tools, not just uv. For example, here's the bug in poetry regarding this issue: https://github.com/python-poetry/poetry/issues/6409

As I understand it, recent versions of PyTorch have made this process somewhat easier, so maybe it's worth another try.

DrBenCarson · a year ago
It sounds like you’re just looking for dependency groups? uv supports adding custom groups (and comes with syntactic sugar for a development group).
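
If I'm remembering the syntax right, that looks something like this (group names are made up):

  uv add --dev pytest        # shorthand for the "dev" group
  uv add --group lint ruff   # any custom group

which ends up in pyproject.toml as:

  [dependency-groups]
  dev = ["pytest"]
  lint = ["ruff"]

and installs with `uv sync --group lint`.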
quickslowdown · a year ago
You can control dependencies per platform

https://docs.astral.sh/uv/concepts/projects/dependencies/#pl...

Not sure if it's as granular as you might need
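
For what it's worth, the docs' PyTorch example boils down to something like this in pyproject.toml (index names and CUDA version are illustrative; note that a marker can select per-OS or per-architecture, but not per-GPU-presence, which may be the sticking point here):

  [project]
  dependencies = ["torch"]

  [tool.uv.sources]
  torch = [
    { index = "pytorch-cpu", marker = "sys_platform == 'darwin'" },
    { index = "pytorch-cu124", marker = "sys_platform == 'linux'" },
  ]

  [[tool.uv.index]]
  name = "pytorch-cpu"
  url = "https://download.pytorch.org/whl/cpu"
  explicit = true

  [[tool.uv.index]]
  name = "pytorch-cu124"
  url = "https://download.pytorch.org/whl/cu124"
  explicit = true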

satvikpendem · a year ago
This happened to me too, that is why I stopped using it for ML related projects and stuck to good old venv. For other Python projects I can see it being very useful however.
ibic · a year ago
I'm not sure if I got your issue, but I can do platform-dependent `torch` installation from a specific `index` using the following snippet in `pyproject.toml`, and `uv sync` just handles it accordingly.

  [tool.uv.sources]
  torch = [{ index = "pytorch-cu124", marker = "sys_platform == 'win32'" }]

synergy20 · a year ago
I use uv+torch+cuda on Linux just fine, never used the extra flag. I wonder what the problem is here?
baby_souffle · a year ago
I didn't know that UV would now edit the script for you. That is just icing on the cake!

For the curious, the format is codified here: https://peps.python.org/pep-0723/

midhun1234 · a year ago
Can confirm this is all true. I used to be the "why should I switch" guy. The productivity improvement from not context switching while pip installs a requirements file is completely worth it.
scribu · a year ago
The install speed alone makes it worthwhile for me. It went from minutes to seconds.
BoorishBears · a year ago
I was working on a Raspberry Pi at a hackathon, and pip install was eating several minutes at a time.

Tried uv for the first time and it was down to seconds.

mcintyre1994 · a year ago
That scripting trick is awesome! One of the really nice things about Elixir and its dependency manager is that you can just write Mix.install(…) in your script and it’ll fetch those dependencies for you, with the same caching you mentioned too.

Does uv work with Jupyter notebooks too? When I used it a while ago dependencies were really annoying compared to Livebook with that Mix.install support.

uasi · a year ago
uv offers another useful feature for inline dependencies, which is the exclude-newer field[1]. It improves reproducibility by excluding packages released after a specified date during dependency resolution.
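
In a script's inline metadata it sits under a `[tool.uv]` table, roughly like this (the dependency and date are arbitrary examples):

  # /// script
  # dependencies = ["requests"]
  # [tool.uv]
  # exclude-newer = "2024-06-01T00:00:00Z"
  # ///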

I once investigated whether this feature could be integrated into Mix as well, but it wasn't possible since hex.pm doesn't provide release timestamps for packages.

> Does uv work with Jupyter notebooks too?

Yes![2]

[1] https://docs.astral.sh/uv/guides/scripts/#improving-reproduc... [2] https://docs.astral.sh/uv/guides/integration/jupyter/

para_parolu · a year ago
As a person who doesn't often work on Python code but occasionally needs to run a server or tool, I find uv a blessing. Before, I would beg people for help just to avoid figuring out what combination of obscure Python tools I needed. Now "uv run server.py" usually just works.
insane_dreamer · a year ago
uv is great and we’re switching over from conda for some projects. The resolver is lightning fast and the toml support is good.

Having said that, there are 2 areas where we still need conda:

- uv doesn’t handle non-Python packages, so if you need to use something like MKL, no luck

- uv assumes that you want one env per project. With complex projects, however, you may need different envs for different branches of your code base. Conda makes this easy: just activate the conda env you want (all of your envs can be stored in some central location outside your projects) and run your code. uv wants to use the project toml file and stores the packages in .venv by default (which you don’t want to commit, but then need different versions of). Yes, you can store your project venv elsewhere with an env var, but that’s not a practical solution. There needs to be support for multiple toml files where the location of the env can be specified inside the toml file (not in an env var).

serjester · a year ago
You may want to checkout uv’s workspaces - they’re very handy for large mono repos.
bogdart · a year ago
You can create another venv in the same folder with different name. ‘uv venv my-name’ does the thing.
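
Something like this, if memory serves (the env name and requirements file are illustrative):

  uv venv gpu-env
  source gpu-env/bin/activate
  uv pip install -r requirements-gpu.txt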
ibic · a year ago
I happened to use uv recently for a pet project, and I totally agree with you. It's really, really good. I couldn't believe its dependency resolution and fetching could be so fast. IMHO, it's the Python package manager (I don't know the most suitable name for the category) done right: everything just works, the correct way.
y1zhou · a year ago
Adding dependencies to the script directly was a game-changer. I was able to write a script for a friend with no coding background at all and everything ran smoothly on his machine. No more rabbit holes of bundling Python packages and setting up environments!
samuell · a year ago
Did you try pixi [1] too?

[1] https://pixi.sh/latest/

quickslowdown · a year ago
I have tried Pixi, I was a big fan, but at the time it involved a LOT of manually editing text files & I didn't want to deal with that. And now I have a hard time imagining installing from Conda could possibly be faster than the sheer speed of uv
crabbone · a year ago
> It solves & installs dependencies incredibly fast

If you are lucky, and you don't have to build them, because the exceptionally gifted person who packaged them didn't know how to distribute them and the bright minds running PyPI.org allowed that garbage to be uploaded and made it so pip would install that garbage by default.

> can replace pipx with "uv tool install,"

That's a stupid idea. Nobody needed pipx in the first place... The band-aid that was applied some years ago is now cast in stone...

The whole idea of Python tools trying to replace virtual environments, but doing it slightly better, is moronic. Virtual environments are the band-aid. They need to go. The Python developers need to be pressured into removing this garbage and instead working on program manifests or something similar. Python has virtual environments due to the incompetence of its authors and their unwillingness to make things right once that incompetence was discovered.

----

NB. As it stands today, if you want your project to work well, you shouldn't use any tool that installs packages by solving dependencies and downloading them from PyPI. The problem isn't the tool performing that function; it's the bad design of the index.

The reasonable thing to do is to install the packages (for applications) you need during development, figure out what you actually need, and then store the parts your package needs to work locally. Only repeat this process when you feel the need to upgrade.

If you need packages for libraries, then you need a way to install various permutations within the allowed versions: no package-installation tool today knows how to do that. So you might as well not use any anyway.

But the ironic part is that nobody in the Python community does it right. And that's why there are tons of incompatibilities, and the numbers increase dramatically as projects age even slightly.

quickslowdown · a year ago
uv addresses all of your complaints in an elegant way. It's OK if you don't understand its purpose. This list is confusing to me; it's incredibly specific and barely related to uv.

IshKebab · a year ago
Uv really fixes Python. It takes it from "oh god I have to fight Python again" to "wow it was actually fast and easy".

I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)

The only very minor issue I've had is that once or twice the package cache invalidation hasn't worked correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal, though, considering it solves so many Python clusterfucks.

javchz · a year ago
Agree. I mostly do front end in my day job, and despite JavaScript being a bit of a mess of a language, dealing with npm is way better than juggling anaconda, miniforge, Poetry, pip, venv, etc., depending on the project.

UV is such a smooth UX that it makes you wonder how something like it wasn’t part of Python from the start.

baq · a year ago
+1

…but we did have to wait for cargo, npm (I include yarn and pnpm here) and maybe golang to blaze the ‘this is how it’s done’ trail. Obvious in hindsight.

Aeolun · a year ago
More importantly, migrating from npm to pnpm to yarn to bun is very nearly seamless. Migrating in the Python ecosystem? Not anywhere close.
EdwardDiego · a year ago
Feels like you're doing it wrong if you're dealing with all of those.
zelphirkalt · a year ago
You mean off the job you have to juggle all those tools? On the job that would be kind of crazy, to allow every project its own tool chain.
dilawar · a year ago
True.

I had to give up on mypy and move to pyright because mypy uses pip to install missing types and they refuse to support uv. In the CI pipeline where I use UV, I don't have a pip installed so mypy complains about missing pip.

Of course I can do it by myself by adding typing pkgs to requirement.txt file then what's the point of devtools! And I don't want requirements.txt when I already got pyproject.toml.

Once you get used to cargo from rust, you just can't tolerate shitty tooling anymore. I used to think pip was great (compared to C++ tooling).

WhyNotHugo · a year ago
Mypy doesn't install anything by default, you're probably setting the `--install-types` flag somehow.
IshKebab · a year ago
Pyright is waaay better than Mypy anyway so I'd say they did you a favour.
tacitusarc · a year ago
I think the only big gap is the inability to alias general non-Python project scripts in uv. This forces you to use something like a justfile, when it would be much more ergonomic to keep it all in uv.
guappa · a year ago
uv belongs to a startup. They will surely introduce some wacky monetisation scheme sooner or later.

I wouldn't get too used to it.

IshKebab · a year ago
Maybe, but even if that is the case it's sooooo much better that even the worst case (fork when they try to monetise it) is way better than any alternatives.
loeber · a year ago
Strongly agree. The respectful act for other package managers would be to consider themselves deprecated and point to uv instead.
baq · a year ago
The risk is obviously uv losing funding. I kinda hope the PSF has thought about this and has a contingency plan for uv winning and dying/becoming enshittified soon after.
tootie · a year ago
Every time people have debates over the merits of languages, I always put the developer environment at the top of my list. Build tools, IDE, readable stack traces. Those things boost productivity far more than concise list comprehensions or any gimmicky syntax thing. It's why Python always felt stone-age to me despite having such lovely semantics.
albert_e · a year ago
I am sold. Sign me up.

I have never used virtual environments well -- the learning curve, after dealing with Python installation, conda/pip setup, and environment variables, was exhausting enough. I gave up multiple times, or only used them when working through step-by-step workshops.

If anyone can recommend a good learning resource - would love to take a stab again.

kyawzazaw · a year ago
Which companies run heavily on Python (either solely or for huge parts)? They should take the initiative and start blogging.
crabbone · a year ago
Uv doesn't fix anything. The fixing that Python needs is the removal of the concept of virtual environments and fixing the import and packaging systems instead.

The only thing it does is make bad things happen faster. Who cares...

IshKebab · a year ago
Well maybe "fixes" is the wrong word. It certainly fixes the bad UX caused by virtual environments.

Basically it handles the virtual environments for you so you don't have to deal with their nonsense.

But you're right it doesn't fix it in the same way that Deno did.

robertlagrant · a year ago
I totally disagree. Having a single vendor with that much power is a bad idea. If the PSF were able to focus on tooling rather than their current focus, they would be great stewards of this sort of thing. Sadly I doubt that will happen, in which case I think many options is the best approach.
IshKebab · a year ago
> If the PSF were able to focus on tooling rather than their current focus

Well yeah maybe if the PSF were able to get their shit together it wouldn't have taken a single third party vendor to do it for them. But they weren't and it did, so here we are.

kubav027 · a year ago
I am pretty happy with poetry for the near future. I prefer using Python interpreters installed by the Linux package manager. In the cloud I use the Python Docker image. Poetry recently added the option to install Python too, if I change my mind.

I have already set up CI/CD pipelines for programs and Python libraries. Using uv would probably save some time on dependency updates, but it would require changing my workflow and CI/CD. I do not think it is worth the time right now.

But if you use older environments without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch anytime it looks more appealing.

Another thing to consider in the long term is how Astral's tooling will change when they need to make money.

js2 · a year ago
> I prefer using python interpreters installed by linux package manager.

uv will defer to any python it finds in PATH as long as it satisfies your version requirements (if any):

https://docs.astral.sh/uv/concepts/python-versions/

It also respects any virtual environment you've already created, so you can also do something like this:

   /usr/bin/python3 -m venv .venv
   .venv/bin/pip install uv
   .venv/bin/uv pip install -r requirements.txt # or
   .venv/bin/uv run script ...
It's a very flexible and well-thought-out tool, and somehow it manages to do what I think it ought to do. I rarely need to go to its documentation.

> Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD.

I found it very straightforward to switch to uv. It accommodates most existing workflows.

irjustin · a year ago
I'm pretty much with you and still trying to figure out why I want to switch away from pyenv+poetry.

I get that uv does both, but I'm very happy with pyenv+poetry combo.

Old baggage, but I came from the rvm world which attempted to do exactly what uv does, but rvm was an absolute mess in 2013. rbenv+bundler solved so many problems for me and the experience was so bad that when I saw uv my gut reaction was to say "never again".

But this thread has so many praises for it so one day maybe i'll give it a try.

armanckeser · a year ago
Uv dependency solving is light years faster than poetry. If you are working on actual projects with many dependencies, poetry is a liability
WhyNotHugo · a year ago
Yeah, using the package manager is the logical choice and usually the most likely one to work.

IIRC, uv downloads dynamically linked builds of Python, which may or may not work depending on your distribution and whether linked libraries are locally available or not. Not sure if things have changed in recent times.

kylecordes · a year ago
UV is such a big improvement that it moves Python from my "would use again if I had to, but would really not look forward to it" pile to my "happy to use this as needed" pile. Without disparaging the hard work by many that came before, UV shows just how much previous tools left unsolved.
crabbone · a year ago
It doesn't do anything differently besides the speed... Why do people keep praising it so much? It doesn't solve any of the real problems... come on. The problem wasn't the tools; the problem is the bad design of the import and packaging systems, which cannot be addressed by an external tool: the language needs to change.
ptx · a year ago
What are the design problems with the imports and packaging systems? How do they need to change?
TheIronYuppie · a year ago
For scripting... HIGHLY recommend putting your dependencies inline.

E.g.:

  #!/usr/bin/env python3
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [
  #     "psycopg2-binary",
  #     "pyyaml",
  # ]
  # ///
Then -

  uv run -s file.py

maleldil · a year ago
How does this interact with your code editor or IDE? When you edit the file, where does the editor look for information about the imported third-party libraries?
AlphaSite · a year ago
Usually the VENV and import lines are enough
marcthe12 · a year ago
Do you need a wrapper script for scripts on the PATH or via execve? I would usually chmod +x the script, but I'm not sure here.
Manfred · a year ago
If you want to make it work regardless of where uv is installed, you can use the following shebang line:

  #!/usr/bin/env uv run --script

JimDabell · a year ago
Discussed here:

> Using uv as your shebang line

https://news.ycombinator.com/item?id=42855258

Since `env` doesn’t pass multiple arguments by default, the suggested line uses `-S`:

   #!/usr/bin/env -S uv run --script

tetha · a year ago
Not at a laptop to try this right now, but shouldn't this be possible with the shebang? Something along the lines of:

    #!/home/tetha/Tools/uv run

runjake · a year ago
For my use cases, uv is so frictionless it has effectively made Python tolerable for me. I primarily discovered it via Simon Willison's (@simonw) blog posts[1]. I recommend his blog highly.

1. https://simonwillison.net/tags/uv/

vslira · a year ago
I'm using exclusively uv for personal projects - and small prototypes at work - and I can't recommend it enough.

Uv makes python go from "batteries included" to "attached to a nuclear reactor"

scratchyone · a year ago
I've started slipping uv into production work projects along with an auto-generated requirements.txt for anyone who doesn't want to use uv. Hoping I can drive adoption on my team while still leaving an alternative for people who don't want to use it.
globular-toast · a year ago
You mean `uv pip compile pyproject.toml > requirements.txt`?
avidphantasm · a year ago
I have been using Python for 20 years, and have been an intermediate to advanced user of it for last 5-7 years. I use it mostly for scientific computing (so lots of Numpy, SciPy, etc.), IoT data processing, and also for some microservices that don’t need to be super fast. I publish and maintain a few packages in PyPI and conda (though I almost never use conda myself), including a C++ library with Python bindings generated by SWIG (SWIG wouldn’t be my first choice, but I inherited it).

In what I’ve done, I’ve never found things like pipenv, let alone uv, to be necessary. Am I missing something? What would uv get?

crabbone · a year ago
If you need to package for Anaconda, uv has nothing to offer you. It's a replacement for a number of PyPA tools, so it's not compatible with Anaconda tools.

The selling point of uv is that it does things faster than the tools it aims to replace, but on a conceptual level it doesn't add anything substantially new. The tools it aims to replace were borne of the defects in Python import and packaging systems (something that Anaconda also tried to address, but failed). They are not good tools designed to do things the right way. They are band-aids designed to mitigate some of the more common problems stemming from the bad design choices in the imports and packaging systems.

My personal problem with tools like uv is that, just like Web browsers in the early days of the Web tried to win users by tolerating mistakes made by site authors, they allow the solution of the essential problems in the Python infrastructure to be delayed by offering some pain relief to those who are using the band-aid tools.