I work on Rye and uv, if you have any questions :)

A lot of our core packaging development is now happening in uv [1]. Rye uses uv under the hood, so as we improve uv, Rye gets better too.

E.g., we recently added support for "universal" resolution in uv, so you can generate a locked requirements.txt file with a single resolution that works on all platforms and operating systems (as opposed to _just_ the system you're running on). And Rye supports it too in the latest release.

[1] https://github.com/astral-sh/uv
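If it helps, the commands look roughly like this (from memory; the Rye spelling of the flag is my assumption, so check --help):

    # resolve once for every platform instead of just the current one
    uv pip compile requirements.in --universal -o requirements.txt
    # or, via Rye in recent releases:
    rye lock --universal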
It works properly with PyTorch. For what it's worth, at $DAYJOB we switched from Poetry to Rye starting with version 0.15 (even before it supported uv), initially for exactly that reason.
Any ongoing work/plans to follow for cross-platform lock files?
This is one concern that would prevent the team I'm on from moving. We do ML work, so I'll use a pytorch-based project as my example. The desired state is that all dependencies live in pyproject.toml, from which you can generate a set of lockfiles (from, say, an AArch64 Mac or an AMD64 Windows workstation) for the following platform configurations (a marker-based sketch follows the list):
1. Mac with default (MPS) pytorch
2. Windows with CUDA pytorch
3. AArch64 Linux with CPU pytorch
4. AMD64 Linux with CPU pytorch
5. AMD64 Linux with CUDA pytorch
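For illustration only: the platform split itself can be written with standard PEP 508 environment markers (the project name and torch version below are placeholders), while the CUDA-vs-CPU choice still needs per-platform index/source configuration on top of this, which is the historically painful part:

    $ cat pyproject.toml
    [project]
    name = "example-ml-project"
    version = "0.1.0"
    requires-python = ">=3.10"
    dependencies = [
        "torch==2.3.*; sys_platform == 'darwin'",
        "torch==2.3.*; sys_platform == 'win32'",
        "torch==2.3.*; sys_platform == 'linux'",
    ]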
P.S. Big thanks for making Ruff, we're huge fans of linting and formatting both.
I have almost the same problem. With Poetry, I managed to work around it using this technique, involving a combination of groups and extras: https://github.com/lucaspar/poetry-torch
It's a hacky workaround, but it seems to work so far. It would be much nicer to see this solved in a better way, though!
Poetry does cross-platform lockfiles, but the absence of cross-platform lockfiles in uv is one of the reasons it benchmarks significantly faster (on top of the resolver being written in Rust).
Great work. I've switched to using Rye now, as I used to have to occasionally set up a new computer to work on a project and it was always a complete pain (pyenv+venv+pip). Now it's:

* Install Rye,
* Pull from GitHub,
* Type rye sync
Given that you guys are in charge of both uv and rye, why keep two alive at the same time? Why not just kill rye now to avoid the fragmentation and confusion that come with the burden of having to choose between the two?
You have a lot of firms that care about predictable and performant builds shifting to you, and one of the things that comes up is SBOM generation for ingestion into tools like guac (https://guac.sh/).

Your recently added ability to unpin dependencies, so devs are more encouraged to stay compatible as they develop and then generate a correct, explicit requirements.txt for reproducibility, makes both vuln management and the SBOM step (e.g. with https://pypi.org/project/sbom4python/ or https://github.com/CycloneDX/cyclonedx-python) far easier than with Poetry etc.
How does it compare to Pipenv and Poetry? I had some problems every time I used Poetry. I wanted to like it, but it often hung, took forever, and similar things.
Setting up a new project is: rye init && rye sync
Adding a dep is: rye add flask && rye sync
You can pin your python version in the pyproject.toml
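Concretely, a fresh project looks something like this (rye pin records the Python version for the project; exact behaviour can vary between releases):

    $ rye init myproject && cd myproject
    $ rye pin 3.12
    $ rye add flask
    $ rye sync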
Migrating from an established project is a little harder than it should be. Importing the requirements.txt into the pyproject.toml is not a good idea, as uv gets itself in a twist with all the low-level dependencies that exist in the requirements.txt. I've never tried it with a Poetry-made pyproject.toml; report back if you try it.
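One way around it (a sketch; the package names are placeholders): re-add only your direct dependencies instead of importing the fully pinned freeze, and let the lock step regenerate the transitive pins.

    $ rye init
    $ rye add flask sqlalchemy requests   # top-level deps only, not the whole freeze
    $ rye sync                            # transitive pins land in requirements.lock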
On the whole it's a good experience: fast and straightforward.
This is awesome. I’ve really struggled with cross-platform resolutions in my bazel setup, as our services are built for containers that run in k8s, but we also want to be able to build and run locally for scripts, tests, etc. I have a branch that uses PDM, which works much better, but there are still a ton of hacks in it. Rye looks like it could help quite a bit.
Most packages don't do that. You can get really far by assuming that all of a package's wheels have the same set of dependencies (maybe using environment markers), and that its sdist always returns the same list of dependencies (also maybe using environment markers). No, it's not perfect, but it's also what Poetry and PDM do as far as I know.
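For example, a wheel's METADATA expresses that kind of conditional dependency with a marker on the requirement itself (the wheel name and the two lines below are hypothetical, not taken from any particular package):

    $ grep '^Requires-Dist' some_pkg-1.0.dist-info/METADATA
    Requires-Dist: colorama; platform_system == "Windows"
    Requires-Dist: pywin32; sys_platform == "win32"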
I like how you ignore the "best practices" for packaging created by PyPA (i.e. pyproject.toml and friends) and just do requirements.txt cowboy stuff.
Don't get me wrong, both are hilariously bad, but I like to see more chaos in Python infrastructure. You pushing against PyPA (rightfully) delegitimizes them.
From their philosophy page (https://rye.astral.sh/philosophy/):

> No Python Binary Distributions: CPython builds from python.org are completely inadequate. On some platforms you only get an .msi installer, on some you literally only get tarballs. The various Python distributions that became popular over the years are diverging greatly and cause all kinds of nonsense downstream. This is why this Project uses the indygreg standalone builds. I hope that with time someone will start distributing well maintained and reliable Python builds to replace the mess we are dealing with today.

And here is info about those particular indygreg builds: https://gregoryszorc.com/docs/python-build-standalone/main/

It is, however, possible to choose a different Python: https://rye.astral.sh/guide/toolchains/
I've never really experienced the problem they are describing. Any official Python build I've gotten from python.org has worked. Every normal old Python I've gotten from my distribution's package manager has worked. Every Python included with an official Docker container image has worked.
I'm sure their special builds will also work just fine, but their non-standardness gives me pause. There's even a list of behavior quirks. Why are we adding quirks to something that has no problems? And the fact that the rye philosophy seems so intent on using them, turns me off from it compared to the alternatives that sensibly default to using the Python you already have available on your OS.
I'm just guessing, but I imagine the scenario goes like this:
1. Work at a company that runs some ancient version of CentOS or something.
2. The system package manager doesn't have anything newer than Python 3.6.
3. Official binaries from python.org print an inscrutable error about the glibc version when you try to run them.
4. Building Python from source requires some new system dependencies, takes forever, and then produces an even less scrutable error about a missing header file.
5. Googling that error leads to a ./configure flag that works around the issue, which upstream GCC fixed in 2017.
6. Success?
If you haven't seen that error #3 before, or dealt with "manylinux" build environments, you've dodged an entire world of pain. The same goes for scripting installs on Windows, and for the part of that page that's talking about "limiting the CPU instructions that can be used" :')
I have been in that #3 hell, almost exactly as you described, but it was always about SSL and its missing headers. On my desktop wiki, the most important section about Python is the one that contains the incantations required to compile SSL support, setting a myriad of variables and using pyenv to build a newer (3.10/3.11/3.12) Python.
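The incantation is roughly this on Debian/Ubuntu boxes (package names differ per distro, and pyenv also honours CONFIGURE_OPTS/PYTHON_CONFIGURE_OPTS if OpenSSL lives somewhere unusual):

    $ sudo apt-get install -y build-essential libssl-dev zlib1g-dev \
          libbz2-dev libreadline-dev libsqlite3-dev libffi-dev liblzma-dev
    $ pyenv install 3.12.3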
Python building, packaging and deployment has two extreme states: the king's highway and the hall of a thousand knives. If the portable Python suggestions do not make sense to you, then consider yourself lucky, because you have managed to stick to the highway.
I regularly download the Python source code, compile it with standard prod optimizations, then install to /usr/local/python${version}. This has worked extremely consistently since Python 3.7 (released in 2018). In my experience, these commands are so stable and consistent they could be automated away. What might the author's issue or underlying protest be?
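For anyone who hasn't done it, the recipe is roughly (flags from memory; --enable-optimizations is the one that matters for prod builds):

    $ ./configure --prefix=/usr/local/python3.12 --enable-optimizations --with-lto
    $ make -j"$(nproc)"
    $ sudo make install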
I've also compiled Python from source a good amount, and it usually works... until I hit something where I realize part of the standard library wasn't compiled because I was missing an optional dependency, while some third-party lib assumes it's always there because the standard distro includes it.
I think it's easy to compile Python, but it's easy to end up just having to go re-compile it for some random component that was not compiled in the "standard" fashion.
If you have a good test suite this stuff shows up quite loudly, though. At some point the core issue is more that collaborators don't want to have to compile things themselves.
(And to "automating away" as a comment... indygreg's releases _are_ this! Someone has done the work for us all)
Same. We build our own Python and have been running it for years without a single hiccup. Not sure what the big fuss is. Pyenv does the same thing.
The concern could be addressed by simply improving the docs with the most recommended compile flags. I think they are actually noted there. Also of note: our build time decreased substantially with LLVM.
Well an obvious issue is that you have to do that!
Also I think a big issue is the inconsistency between platforms. For example the official Python installer doesn't include python3.exe (frankly because the devs are idiots), but the one from the Microsoft app store does!
If you stay on one platform you wouldn't see those issues.
> No Python Binary Distributions: CPython builds from python.org are completely inadequate. On some platforms you only get an .msi installer, on some you literally only get tarballs.
I'm just guessing, but they could mean that there are no macOS/Windows binaries for security releases of older Python versions. You can't, for example, download an installer for Python 3.10.14. The last Windows installer is for Python 3.10.11 (April 5, 2023).
I tried Rye during its first days. It would (without any indication) download some custom build of Python, which was dynamically linked, so it won't work unless you're running a distribution similar to the build environment.
Linux distributions ARE NOT binary compatible; you can't just build Python on one distro and run it on another. You need to statically link everything to do that.
The ways the different Pythons from PyPI don't work can be, for example, that various optional modules are (or are not) included. For example, for my own Python installs, I build without tkinter. I have no use for this module, and it's always a burden to find the outdated X widget libraries necessary for this pile of garbage to build.
Seeing how this module is basically useless, a lot of Linux distros also exclude it from the Python they package. But PyPI builds try to include everything.
There are a few more modules like that.
Another aspect is the various defaults that help Python locate its parts or control its loading, e.g. the site module or sysconfig. For various reasons, various vendors may configure these differently. This may result in some packages being broken upon install.
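A quick way to see where a given interpreter thinks those parts live (both are stock CPython modules):

    $ python -m site
    $ python -m sysconfig | head -n 20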
That is, Python programmers are notoriously bad at creating packages and understanding how they work (also, the Wheel format is a dumpster fire of nonsense). So, a lot of developers don't understand the consequences of packaging anything that's not strictly Python source code (which, frankly, should never have been packaged! but hey, remember, Wheel? dumpster fire? So... where was I...) anyways, native libraries packaged with Python source may end up in some random place Python doesn't look at, and consequently fail to load, or other data files end up in the wrong place because the developer packaging them, after countless trials and errors, has made it work on their computer, with their particular Python installation, but they still don't know why it worked.
Similarly, if a package wants to provide header files so that other packages can compile against the native modules the package provides... oh-ho-ho, bet you didn't know that's possible, right?! Well, don't bother. It mostly doesn't work anyways. And there's a bunch more stuff like that.
As a "typical" user, you might have never encountered any of the issues above, but me, as a Python infra person who's summoned to extinguish fires started by talented Python programmers by using tools like the one in OP deals with this stuff pretty regularly. So, I can relate to the disappointment with any aspect of Python infrastructure. There has never been a case of me discovering something in Python infra and looking at it with admiration. At best it's passable, but in most instances it's hilariously bad.
Original author of Rye here: there are no official Python builds outside of macOS and Windows and the macOS builds cannot be installed programmatically. They also cannot be placed in any location other than the dedicated framework location which often causes issues for people who do not expect specific versions to be installed. Quite often installing the macOS builds of Python breaks other tools that expect a different version to be there.
I’m glad regular Python versions work for you, and you can register them with Rye. That’s very intentionally supported.
The goal of rye is to reach higher. It wants to see how close we can get to an experience you can get from node or Rust. That includes getting access to Python itself.
I have been programming Python since 2.2 and have helped countless people over the years with their Python challenges. From mentoring to tutoring to just helping on IRC and other places. So many times people had a completely broken Python installation or ran into compilation errors. Even pyenv ships patches to make Python compile cleanly in some cases.
The indygreg builds have issues, no doubt about it. In an ideal world the PSF would distribute portable Python builds for all platforms.

The PSF has nothing to do with software development.
As someone with a large project that depends on the standard readline module, that was a major hiccup when moving to Rye. Luckily there's a gnureadline package.
I love Rye. It does what it says on the tin. It makes the whole venv/Python-version/packaging process actually pleasant, and it’s invisible to someone used to Python-official usage (pyproject.toml et al). And it makes Python feel like Cargo, which is a great thing to work with too.
If like me, you've ignored poetry and friends and stuck with pip-tools (congrats!), uv (used by rye internally) is a drop in replacement.
IMHO pip-tools was always the far nicer design than poetry, pipenv etc as it was orthogonal to both pip and virtualenv (both of which have been baked into Python for many years now). I would argue Rye is the iterative, standards compliant approach winning out.
Beyond the speedups from Rust, it's nice to have some opinionated takes on where to put virtualenvs (.venv) and how to install different Python versions. It sounds small, but since wheels fixed numpy installs, sane defaults for these and a baked in pip-tools is basically all that was missing. Talking of which, what has been the point of anaconda since binary wheels became a thing?
> what has been the point of anaconda since binary wheels became a thing?
When you need python + R + some linked or CLI binary in an isolated environment. Also you will use the same tool to manage this environment across multiple OSs (e.g. no OS specific `apt`, `brew`, etc).
I still love miniconda for DS work. If you want to set up a project to process some videos using some Python libraries, you can use conda to install a specific version of ffmpeg into the project without worrying about your system installation.
Lots of random C/C++/Fortran libraries can be used directly from conda and save a massive headache.
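For instance (conda-forge package names; the version pins are illustrative):

    $ conda create -n videoproj -c conda-forge python=3.11 ffmpeg=6
    $ conda activate videoproj
    $ ffmpeg -version   # the project's ffmpeg, independent of the system one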
As somebody who tried to pick up Python after hearing there was one way to do everything…the installation and environment management experience was a train wreck.
What you heard is from the Zen of Python, a short text meant to express core ideas behind the design of the Python language. You can read it by typing `import this` in the Python interpreter. The exact sentence is:
There should be one-- and preferably only one --obvious way to do it.
This sentence was coined as an answer to a catch phrase that was used to describe the Perl programming language: There Is More Than One Way To Do It. Giving programmers more freedom to express themselves in different ways was presented as a good thing by the Perl community.
Python was partly marketed as a replacement for Perl and the sentence from the Zen of Python expresses a difference from Perl. The idea is that having different ways to do things leads to confusion and code that is harder to maintain, problems that Perl was supposed to incur according to its critics.
The sentence was true to a certain extent when it came to the Python language. I don't think it has ever been true for the Python ecosystem. For example, during the early 2000s, there was a plethora of web back-end frameworks for Python. As the Python language has since gained a lot of features, I'm not even sure that this is still true for the language itself.
Regarding package management, this has always been a weak point of the Python ecosystem. Python developers often make jokes between themselves about that. Unfortunately, I would be very surprised if this project was to put an end to this issue.
Despite all this, I encourage you to learn Python because it's a very interesting and powerful language with an extremely rich ecosystem. Yes, there are many ways to do the same thing with it. But on the other hand, there is a way to do pretty much anything with it.
I am sold. I was thinking of trying out pixi after Poetry took a whole day and still couldn't resolve deps.

> poetry took whole day and still couldn't resolve deps.
I hate doing this, but the solution is to reduce the search space for poetry to find a compatible version.
Verbosely install with poetry (-vvv) and note the package it gets stuck on. Find the currently installed version from the lock file and specify it as the minimum in your pyproject.toml file.
The time to find a solution went from 2-4 hours to <90 seconds when I did this a few months ago for a large complex work project.
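Roughly, the procedure looks like this (the package name below is just an example):

    $ poetry install -vvv                      # note which package the solver keeps backtracking on
    $ grep -A 1 'name = "scipy"' poetry.lock   # find the version that's already locked
    # then in pyproject.toml, raise that package's lower bound to the locked version, e.g.
    #   scipy = ">=1.11.4"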
Pixi is focused on the Conda ecosystem within the broader Python ecosystem. Rye is not quite what Cargo is to Rust; it's more like a faster Poetry. Both Rye and Pixi are using uv, which aspires to close the gap and become the Cargo of Python packaging. Rye will likely fold into uv at some point in the future.
I was going to complain, but I’ll ask you/yall instead: what do you mean “makes it actually pleasant”? Is it too hard to summarize? Because I don’t think I ever identified anything about Anaconda or Poetry that felt like a particular choice, at least UX-wise. And curation-wise, it seems like a hard sell to trust a smaller org over the larger established group.
In other words: what does it say on the tin?? All I can read is “good package manager is good because good and fast and good”. Maybe there’s a comparison or ethos page…
A lot of data people use Anaconda. Anaconda is sooo slow. Even on a very beefy workstation, Anaconda often needs > 10 mins to solve an environment, and often fails. I would be excited to try something without these issues.
Speed for one thing.
Rye also manages your Python version by downloading a build, and with a less finicky setup than the pyenv/pipenv virtualenv shell scripts (which take longer and are less reliable because they compile Python from source instead of downloading it).
As someone who has had to deal with his team's Python setup: installing Poetry and pipenv and compiling Python automatically on every user's machine is a lot more finicky in practice. Plus Poetry wasn't just much slower; sometimes locking took many minutes to finish, appearing to lock up.
There's also rye install / rye tool install, which works like pipx: it installs tools in a siloed virtualenv, with a run file in the Rye dir you've already added to $PATH (it also has parameters to pass in extras, such as installing DB packages for sqlacodegen, and optionally exposing their executables on your path). Rye bundles other tools from Astral, i.e. ruff, which is the new hotness for Python linting/auto-formatting/import sorting and is also fast, written in Rust.
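For example (the demo package is arbitrary; command spelling from memory):

    $ rye install pycowsay
    $ pycowsay "installed into its own isolated venv"
    $ rye uninstall pycowsay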
I feel that with rye/uv/ruff, Astral is finally making progress towards a fast, useful, all-in-one Python package manager/tool like Cargo. And they keep adding a lot of useful features; for example, ruff is slowly working towards implementing both flake8 and pylint rules, among other lints.
I love Rye. After using package managers from other languages like cargo and hex, the lack of a similar system for Python always had me yearning for more. I'm really happy to say Rye has completely solved this itch for me, and it's a real pleasure to use as someone who doesn't want to install different tools to manage venvs, Python versions and my project's dependencies. Rye just does it all.
ML researcher perspective: Conda is... dog slow, even for relatively simple tasks (clone and run a project). The recommendation nowadays is to use Mamba (iirc), but in my experience (a few years old now) it's been unstable and unable to find package solves which worked on my system / our cluster.
I've settled on just using Poetry for most things, and then using pip in a venv to install the pyproject.toml file from Poetry either in a Dockerfile or directly on a cluster. That's worked fairly well so far, even with torch/cuda (and the mess of CUDA versioning) and from macOS to Linux.
I think uv/rye is a good next step, Poetry can be a bit slow as well at times.
My philosophy is simple. If the program is intended to be distributable, just use Go. If it does not require port stuff, use Docker. If you have an IT team or someone to hand you a computer with the OS and Python version installed that everyone else in the org uses, use venv.
If you have to work with ports, you have to distribute programs, or your libraries depend on C or OS stuff, then go the consulting route, where you do not have to manage the codebase or have any commitment to it after getting paid.
It's more complicated to write machine learning software in Go than it is to write portable apps in Python. The same goes for a lot of use cases for Python outside of backend servers or similar web-related use cases. You can't really just "use Go" for a lot of the things people use Python for, at least not realistically.
Choosing a language based on its distribution capabilities is the wrong criteria. Instead, decide based on what it enables you to do, and deal with the distribution later. The distribution won't matter if your project is not successful anyway.
Indeed. Docker solved distributing and running Python programs like 10 years ago. You can even run CUDA and PyTorch in Docker nowadays. And the usual answer you see on HN every time someone brings up "just use docker" in those threads is "but I don't wanna """learn""" docker". It takes 10 minutes to get a Python container running with 0 experience in Docker.
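A minimal GPU-enabled run really is a one-liner (assuming the NVIDIA container toolkit is installed; the image tag and script name are illustrative):

    $ docker run --rm -it --gpus all -v "$PWD":/app -w /app \
          pytorch/pytorch:2.3.0-cuda12.1-cudnn8-runtime python train.py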
It looks really interesting but it is hard to really invest in yet another ecosystem that tells you to curl and pipe into bash and then tells you to eval arbitrary command output.
I try new versions of pixi from time to time because I have a project that depends on LAVIS and EasyOCR.
My default project-management tool, Poetry, has problems with PyTorch.
Right now, I use pip-tools for the project.
While Conda worked, I didn't like the tooling that much.
What is currently blocking me from using pixi is the PyPI integration (https://github.com/prefix-dev/pixi/issues/1295).
I can evaluate pixi in earnest when it is solved.
I find pixi great. If anyone uses conda, pixi is a drop-in replacement where the environment is associated with the git/project directory, similar to devbox/devenv/flox.
The story is a bit complicated. First there was conda, by the Anaconda company, written in Python. Then the open-source conda-forge ecosystem emerged as a conda channel with CI build bots. Then mamba, under the same conda-forge umbrella, appeared as a drop-in replacement for conda written in C++ (an actual drop-in: `alias conda=mamba` should work). And now conda itself uses libmamba as the solver to speed it up.
Then the author of mamba spun it off into pixi, a rewrite in Rust with a different philosophy on how environments should be located and activated, and with full compatibility with conda environments.
Conda has always supported installing packages from PyPI via pip (when something isn't available from conda channels, for example), and pixi supports PyPI packages via uv. That makes pixi fast. (There are other optimizations, outlined in their blog post, that make it much faster than even mamba.)
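In practice that looks something like this (flag spellings from memory, and the PyPI package name is a placeholder; check pixi's docs):

    $ pixi init myproj && cd myproj
    $ pixi add python=3.11 pytorch        # resolved from conda channels
    $ pixi add --pypi some-pypi-only-pkg  # resolved from PyPI via uv
    $ pixi run python train.py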
If anyone uses any non-pure python packages, then conda is the way to go. Package manager choice (conda/mamba/pixi) is secondary.
The problem with PyPI is the lack of gatekeeping. That, coupled with the lack of a standard way to package non-pure-Python packages, makes environments leaky (see the comments about errors encountered in exotic or old environments) and/or non-reproducible (especially when people distribute source only and do crazy things in setup.py to bootstrap their environments, including compilers).
In conda land, the conda-forge channel has pretty good gatekeeping to ensure quality, such as packages being constrained properly, licensed properly (PyPI maintainers sometimes didn't include the necessary license file in the distribution), environments isolated properly, etc. It's not bulletproof, as there is an official bot that maintainers can use to auto-merge changes from PyPI that have wrong version constraints, for example.
The problems that no tool can solve right now are centered around PyPI: dealing with packages not available in conda, and the fact that releasing a package virtually mandates releasing on PyPI first.
When installing a package that is available only on PyPI, some of its dependencies may still be available through conda. AFAIK, no package manager will use conda packages to fulfill the PyPI package's dependencies. You can manually add the conda packages to resolve the dependencies, at the risk of not subjecting them to the right version constraints.
And when you author an open source python package, even if your setup relies on conda channels only, you most probably would/need to release it on PyPI first (releasing on conda-forge channel virtually mandates a presence at PyPI first). Then you need non-conda tools to help you. This is why Rye would still be useful to people like me, and worth checking out.
> Rye is still a very experimental tool, but this guide is here to help you get started.
While I’m really excited about this project, I’m planning on waiting until this project is in a more mature stage. I am a big fan of everything else the Astral team has put out so I have high hopes for this.
Since Rye already uses uv behind the scenes, there won't be a lot of incompatibilities to deal with, so the migration should be trivial.
Thank you!
For similar reasons, we use https://hatch.pypa.io/latest/why/ and appreciate that it plays nicely with `uv`.
How do you pronounce "uv"?
It doesn't look like this is currently documented but I found some hints in the release notes.
Glad to hear it’s getting better finally.
Looks like there are more Python package managers than chat apps from Google?
I haven't formed an opinion on Rye yet, but conda can do "manage venvs, python versions and my project's dependencies" fine.
Sometimes you need to use a Python library.