Today I was trying to use Python to build some custom ZMK firmware, which relies on a package named west. For the life of me, I cannot figure out how to get it installed so that it's in my PATH. Why is python package management still this bad after so many years, with so many wonderful examples of how good a package manager can be?
```
- I discovered that you'd never get an answer to a problem from Linux Gurus by asking. You have to troll in order for someone to help you with a Linux problem.
- For example, I didn't know how to find files by contents and the man pages were way too confusing. What did I do? I knew from experience that if I just asked, I'd be told to read the man pages even though it was too hard for me.
- Instead, I did what works. Trolling. By stating that Linux sucked because it was so hard to find a file compared to Windows, I got every self-described Linux Guru around the world coming to my aid. They gave me examples after examples of different ways to do it. All this in order to prove to everyone that Linux was better.
- ion has quit IRC (Ping timeout)
- brings a tear to my eye... :') so true..
- So if you're starting out Linux, I advise you to use the same method as I did to get help. Start the sentence with "Linux is gay because it can't do XXX like Windows can". You will have PhDs running to tell you how to solve your problems.
```
I mean, we already have AI-automated "email trolling", where spammers use GPT-3 to produce an introduction sentence suggesting the sender at least took the time to look at my homepage before bothering me; in reality it's all automated, and they're just sending more personal-looking spam than the next guy.
Python's packaging story is rotten in its bones. I think at this point it's nearly impossible to fix (though many continue to try).
The way I see it, a solution would require:
- A cohesive, self-contained, easy to use, all-inclusive toolset, with a completely standardized manifest format for packages. And, for all that is holy, doesn't mess with the global environment so that we have to lie to it with virtual environments.
- That toolset would have to get overwhelming adoption by the community, to where it retroactively becomes The Python Package Manager (outside of legacy codebases, which would continue to languish). This would probably require endorsement by the Python developers, or for the tool to be so unassailably better that it's preferred without exception, or possibly both. Otherwise we'll continue to have: https://xkcd.com/927/
I want to emphasize that on the second point 50% adoption is not enough. 70% adoption is not enough. I'm talking 90%+ adoption for libraries and new projects; everything outside of it needs to be a rounding error, or we're still in fragmentation hell.
And even in the best case, there would be a long tail, for years to come, of packages that haven't been ported to the new tool/manifest format, where you'd have to duck out to one of the other package managers to pull them in.
I mean, you can’t write a command-line program in blub that's reliable and straightforward to install, so why would anybody need that?
The only complex thing about Python packaging is that it separates its tools into "frontend" and "backend" build tools. On the frontend, pip is essentially the cohesive, self-contained, easy-to-use, all-inclusive solution, as long as your backends support the standards [0][1].
> with a completely standardized manifest format for packages
That's wheel [2]. Everything else is deprecated now, except for sdist (which doesn't really matter as it's limited to Python-only packages, and transparently supported by all tools).
> And, for all that is holy, doesn't mess with the global environment so that we have to lie to it with virtual environments.
This would be nice, but it is completely out of the hands of the language developers. Different Linux distributions (and other OSes, like macOS) have made different decisions about the Python version(s) they bundle by default, and in many cases the bundled versions are critical for some OS-level tools to function properly. Pretty much all interpreted languages either a) have their own version of virtual environments, or b) rely on a single system-wide interpreter.
That being said, there is no need to use virtualenvs if a) you know that you need only one version of Python, and b) you have full control over your system. The best example is service deployments, where you can install a version of your choosing (or start with an appropriate base image), and install everything else on top of it.
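The deployment pattern described above can be sketched as a minimal Dockerfile (the image tag and file names are illustrative, not prescriptive):

```dockerfile
# Start from an image that pins the interpreter version you need.
FROM python:3.12-slim

# Install dependencies straight into the image: the container is
# single-purpose, so there is nothing to isolate from and no venv is needed.
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . /app
CMD ["python", "/app/main.py"]
```

Here the container itself provides the isolation a virtualenv would normally give you.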
> That toolset would have to get overwhelming adoption by the community, to where it retroactively becomes The Python Package Manager (outside of legacy codebases, which would continue to languish). This would probably require endorsement by the Python developers
Why exactly? There are two ways one can work with packages:
1. The majority of users want to be able to install packages.
This is already the case. Pip is overwhelmingly adopted by everyone, with two notable exceptions:
But both of these exceptions are very specific and narrow in scope.

> Otherwise we'll continue to have: https://xkcd.com/927/
We already don't have that situation. I agree that we did, but the standards are now very clear, and the tools are slowly moving in the right direction.
[0] https://peps.python.org/pep-0518/
[1] https://peps.python.org/pep-0660/
[2] https://realpython.com/python-wheels/
Using virtual envs or pipx or poetry or whatever is non-standard.
Packaging Python projects, however, ... don't get me started on the whole `setup.py`, `setup.cfg`, `pyproject.toml` debacle. This article has more information about it, but the fact that this is supposed to be the short version makes it even more infuriating: https://blog.ganssle.io/articles/2021/10/setup-py-deprecated...
Originally, it was meant as a central point to define your build system requirements [0]; for this purpose it needs to include only two lines:
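Something like this, for a setuptools-based project (the exact `requires` list depends on your build backend):

```toml
[build-system]
requires = ["setuptools", "wheel"]
```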
Then, package managers like pipenv and poetry started using it as a central place to store other project metadata like dependencies, description, etc. Most package managers now have their own versions of that functionality, and it is currently being standardised into a common form [1].

Finally, many other projects have started adding support for keeping their configuration in pyproject.toml. Some (like black) don't even support any other form, while others (like flake8, and until recently mypy) are resisting this trend; but it is already so prevalent that it can be considered the standard.
[0] https://peps.python.org/pep-0518/
[1] https://peps.python.org/pep-0621/
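Those two later uses can be sketched in one file: standardised PEP 621 project metadata plus third-party tool configuration (all names and versions here are illustrative):

```toml
# PEP 621 project metadata, the standardised successor to what
# pipenv and poetry each invented for themselves.
[project]
name = "example-app"
version = "0.1.0"
dependencies = ["requests>=2.28"]

# Third-party tools keeping their configuration here too.
[tool.black]
line-length = 100
```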
$ python -m pip install west
That's it, no venv, no sudo.
pip will install it in ~/.local/, so you'll need an `export PATH="$HOME/.local/bin:$PATH"` to get it onto your PATH.

But that's not the recommended way to install PyPI packages you want to use, with Debian at least. It may install dependencies that will break (for your user) some Debian packages depending on other versions of said dependencies. The easy way to go is `pipx install west`.
EDIT: --user is not necessary anymore. I think this does not alter the validity of the rest of my comment.
Does it try to install packages in your system paths and fails due to lack of permissions?
It’s a bad idea, right up there with Linux distros having python2, python3, python3.1, python3.2, python3.3, …. They give up the ease of use of a command-line application, where you can tell people what to type, for the mind-numbing complexity of a GUI application, where it takes a 1500-page book to explain how to do anything in Microsoft Word, because now you can’t tell people what to type to run and install things.
One example (but there's many): On Windows a `pip install --upgrade pip` can't work as the OS would lock the `pip` executable, while a `python -m pip install --upgrade pip` works as the OS would lock python instead.
I also encountered environments (CI runners, things like that) with `python` and the pip module installed, but no `pip` shortcut.
More: https://snarky.ca/why-you-should-use-python-m-pip/
You create a venv or bundle, list requirements in a text file, then ask it to install things for you.
And if you need custom stuff, you can just pip install a .whl file, too.
I have yet to encounter a case where it's not working as expected, so my answer would be that python isn't getting fixed because it's not broken.
wontfix, works for me
Just last week I had problems with a project that I hadn't touched in a while.
After installing Python 3 on a new computer (and making sure that pip is installed) I found that my scripts broke because "pip install" was no longer a thing. I now needed to do something like "python -m pip install".
Not a big issue, just a reminder that things are still improving.
That said, whenever there's native code involved, things can get tricky (especially in Alpine based containers with musl instead of glibc).
That does apply to pretty much everything, just yesterday I also discovered that Ruby was slowing down builds 2-3x because of sassc install being really slow after an update. Then again, the whole library it depends on is deprecated so there's additional migration pain to be had soon.
And don't even get me started on database drivers and their prerequisites in each environment!
That said, even if something like Maven seems (sometimes) better due to no native extensions, I'm still happy that Python/Ruby/Node package managers work as well as they do. Sure beats copying random code manually.
What I'm trying to say: In most production uses, the line between dev and release dependencies is so blurred that it almost doesn't exist.
Instead of solving the dependency graph up front, pip just starts installing stuff and tries to backtrack if it paints itself into a corner, but it frequently gets stuck.
If your dependencies are wheels it's not so bad; in fact, with the right software you can download the dependency list of a wheel without downloading the wheel itself, so you could do what real dependency-management software (Maven) does and download and resolve the dependency graph before installing anything.
With eggs you're SOL, because you have to run a Python script to determine the dependencies, and if you run enough of those, one will have a side effect or two.
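The wheel side of that contrast is easy to demonstrate without any packaging tooling: a wheel is just a zip archive whose dependencies sit in a static METADATA file, so they can be read without executing anything. A sketch using a tiny fake in-memory wheel (package and dependency names are hypothetical):

```python
import io
import zipfile

# Build a minimal fake wheel in memory: a zip with a dist-info/METADATA file.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "demo-1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\n"
        "Name: demo\n"
        "Version: 1.0\n"
        "Requires-Dist: requests>=2.0\n",
    )

# Dependencies are plain declarative text: no setup.py to run, no side effects.
with zipfile.ZipFile(buf) as zf:
    meta = zf.read("demo-1.0.dist-info/METADATA").decode()
deps = [line.split(": ", 1)[1] for line in meta.splitlines()
        if line.startswith("Requires-Dist")]
print(deps)  # → ['requests>=2.0']
```

This is exactly what lets a resolver fetch just the metadata and plan the whole graph before installing.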
Here’s your dumpster fire. I can’t figure out why this crappy venv thing has to exist in Python when it doesn’t exist anywhere else.
Well, there's RVM[1] for Ruby, maven-env[2] for maven, and perl5-virtualenv[3] for Perl.
[1] https://rvm.io [2] https://github.com/AlejandroRivera/maven-env [3] https://github.com/orkunkaraduman/perl5-virtualenv
C++ -> vcpkg
Ruby -> bundler
Node -> npm vendor
Go -> Go Modules
Java -> Maven
If you disagree, please explain how Ruby's bundler caching dependencies in a vendor subfolder differs from Python's venv caching dependencies in a subfolder, and why one of them is a dumpster fire while the other one "doesn't exist".
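For what it's worth, the venv side of that comparison really is just a subfolder; the standard library can create one directly. A minimal sketch:

```python
import tempfile
import venv
from pathlib import Path

# Create an isolated environment in a throwaway directory.
target = Path(tempfile.mkdtemp()) / "demo-venv"
venv.create(target, with_pip=False)  # skip the pip bootstrap to stay fast/offline

# Like bundler's vendor dir, it's just files on disk: an interpreter
# config plus a site-packages directory where dependencies would go.
print((target / "pyvenv.cfg").exists())  # → True
```

Nothing global is touched; deleting the directory deletes the environment.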
"Just do pip install" --> Whenever I hear this comment, I know I'm talking to someone who has never used scientific libraries like NumPy, SciPy, etc., and has never seen dependency versions turn into a mess because pip by default doesn't pin dependencies (Poetry does, but it is not standard).
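The difference being pointed at, sketched as a hypothetical requirements file (package names and version numbers are purely illustrative):

```
# Unpinned: pip resolves whatever versions are newest at install time,
# so two installs a month apart can produce different environments.
numpy
scipy

# Pinned (what `pip freeze` or a Poetry lock file gives you):
# numpy==1.26.4
# scipy==1.11.4
```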
Python packaging is a mess because, for some weird reason that baffles me, a large majority won't even admit there is a problem. And they will start jumping on you and calling you an idiot if you say there is. A lot of gaslighting going on here.
It's still a mess. Arguably, a worse mess than it used to be.
They provide dependency management. Pip/virtualenv never really did dependency management. There was requirements.txt, sure, but you had to manage that yourself.
Conda, for instance, is great for ML and scientific given the extra niceties for replicating environments, but most specifically because you get package sets optimized for your analysis needs.
1. A user install: `pip install [package]` and make sure "$HOME/.local/bin" is on your PATH
2. A global install: `sudo pip install [package]` - it will be installed to a dir on your path already (/usr/bin I think)
As for why pip is not ideal for installing software: it's not supposed to be. It's a Python package manager, not a software package manager. It's meant to install libraries for the Python interpreter to import, not commands for you to run. Of course, people do often use library managers to install software (npm, cargo, go...), but the experience is the same in all of them: either you install with sudo and it "just works" but might cause problems later, or you install in "local" mode, which requires you to add the package manager's directory to your PATH.