Alir3z4 · a year ago
Almost 2 decades of working with python.

I create a venv. Pip install and keep my direct deps in requirements.txt
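
In full, that workflow is roughly this (a sketch; `.venv` and `requirements.txt` are just the conventional names):

```shell
# create and activate a project-local virtual environment
python3 -m venv .venv
. .venv/bin/activate
# install the direct dependencies listed (one per line) in requirements.txt
pip install -r requirements.txt
```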

That's it. I never understood all this Python dependency management drama.

Recently, I started using pyproject.toml as well which makes the whole thing more compact.

I make lots of Python packages too. Either I go with setup.py, or sometimes I use flit for no specific reason.

I haven't ever felt the need for something like uv. I'm good with pip.

Borealid · a year ago
That doesn't work well (enough) if you have one project that requires Python <3.10 and another that requires Python >=3.10.

To really pin everything you'd need to use something like asdf, on top of poetry or a manual virtualenv.

Otherwise you get your colleagues complaining that pip install failed with mysterious errors.

nyrikki · a year ago
venvs are namespace isolation; they are like containers.

Even in huge monorepos you can just use something like a Makefile to produce a local venv via a .PHONY target, and add it to clean too
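
A minimal sketch of that kind of Makefile (target and file names are illustrative; recipe lines must start with a tab):

```make
VENV := .venv

.PHONY: venv clean

venv: $(VENV)/bin/activate

# rebuild the venv whenever requirements.txt changes
$(VENV)/bin/activate: requirements.txt
	python3 -m venv $(VENV)
	$(VENV)/bin/pip install -r requirements.txt

clean:
	rm -rf $(VENV)
```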

This is how I actually test old versions of python, with versioned build targets, cython vs ...

You can set up almost any IDE to activate them automatically too.

The way to get your coworkers to quit complaining is to automate the build env setup, not to fight dependency hell, which is a battle you will never win.

It really is one of the most expensive types of coupling.

alkh · a year ago
Would recommend you install pyenv [1]. It was very useful when my team had to update a lot of projects from <=3.10 to 3.11. [1] https://github.com/pyenv/pyenv
zahlman · a year ago
I have multiple versions of Python built from source. If I want to test what my code will do on a given version, I spin up a new venv (near instantaneous using `--without-pip`, which I achieve via a small Bash wrapper) and try installing it (using the `--python` option to Pip, through another wrapper, allowing me to reuse a single global copy).
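
Roughly (a sketch of what the wrappers shorten):

```shell
# near-instant venv creation: skip bootstrapping a copy of pip into it
python3 -m venv --without-pip .venv-test
# then reuse a single global pip to install into that venv, e.g.:
#   pip --python .venv-test/bin/python install <the-package-under-test>
```

(The `--python` option landed in pip 22.3, so one global pip installation can target any environment.)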

No matter what tooling you have, that kind of test is really the only way to be sure anyway.

If something doesn't work, I can play around with dependency versions and/or do the appropriate research to figure out what's required for a given Python version, then give the necessary hints in my `pyproject.toml` (https://packaging.python.org/en/latest/specifications/pyproj...) as environment markers on my dependency strings (https://peps.python.org/pep-0508/#environment-markers).
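
For example, in `pyproject.toml` (the `tomli` line is a classic real-world case, since `tomllib` entered the stdlib in 3.11):

```toml
[project]
dependencies = [
    # only needed on older interpreters; the marker makes the hint explicit
    "tomli >= 1.1.0 ; python_version < '3.11'",
]
```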

"Mysterious errors" in this area are usually only mysterious to end users.

marky1991 · a year ago
I don't get it, you ought to be building a different venv per project anyway.

(Of course, I don't distribute most of my projects, so I just dump them all in the global install and don't worry about it)

merb · a year ago
pyproject.toml solves this nowadays
rcxdude · a year ago
This is more or less my experience, but I think in part it took a while for pip to actually get into a usable position, hence some of the proliferation of other options.
thingification · a year ago
That might be fine in your context. People's problems are real, though. What they're almost always missing is separating the source code from the compiled output ("lock files"). Pick a tool to help with that, commit both files to your ("one's") project, problem solved.

People end up committing either one or the other, not both, but:

- You need the source code, else your project is hard to update ("why did they pick these versions exactly?" - the answer is the source code).

- You need the compiled pinned versions in the lock file, else if dependencies are complicated or fast-moving or a project goes unmaintained, installing it becomes a huge mindless boring timesink (hello machine learning, all three counts).

Whenever I see people complaining about Python dependencies, most of the time it seems that somebody just lacked this concept, didn't know how to do it with Python, or was put off by too many choices. That plus that ML projects are moving quickly and may have heavy "system" dependencies (CUDA).

thingification · a year ago
To be more concrete:

In the source code - e.g. requirements.in (in the case of pip-tools or uv's clone of that: uv pip compile + uv pip sync), one lists the names of the projects one's application depends on, with a few version constraints explained with comments (`someproject <= 5.3 # right now spamalyzer doesn't seem to work with 5.4`).

In the compiled output - i.e. the lock files (pip-tools and uv pip compile/sync use requirements.txt for this) one makes sure every version is pinned to one specific version, to form a set of versions that work together. A tool (like uv pip compile) will generate the lock files from the source code, picking versions whose declared metadata (on PyPI) says they work together.
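
A toy `requirements.in` ("source"), with `spamalyzer` as a hypothetical package name:

```
# requirements.in - direct dependencies only, loosely constrained
requests
spamalyzer <= 5.3  # 5.4 currently seems broken for us
```

Then `uv pip compile requirements.in -o requirements.txt` regenerates the fully pinned "compiled" requirements.txt, and `uv pip sync requirements.txt` makes the venv match it exactly.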

My advice: pip-tools (pip-compile + pip-sync) does this very nicely - even better, uv's clone of pip-tools (uv pip compile + uv pip sync), which runs faster. Goes nicely with:

- pyproject.toml (project config / metadata)

- plain old setuptools (works fine, doesn't change: great)

- requirements.in: the source for pip-tools (that's all pip-tools does: great! uv has a faster clone)

- pyenv to install python versions for you (that's all it does: great! again uv has a faster clone)

- virtualenv to make separate sandboxed sets of installed python libraries (that's all it does: great! again uv has a faster clone)

- maybe a few tiny bash scripts, maybe a Makefile or similar just as a way to list out some canned commands

- actually write down the commands you run in your README

PS: the point of `uv pip sync` over `uv pip install -r requirements.txt` is that the former will uninstall packages that aren't explicitly listed in requirements.txt.

uv also has a poetry-like do-everything 'managed' everything-is-glued-together framework (OK you can see my bias). Personally I don't understand the benefits of that over its nice re-implementations of existing unix-y tools, except I guess for popularizing python lockfiles - but can't we just market the idea "lock your versions"? The idea is the good part!

gkhartman · a year ago
That's been my experience too. The main complaint I hear about this workflow is that venvs can't be moved without breaking. I just rebuild my venv in each new location, but that rebuild time can add up for projects with many large scientific packages. uv solved that pain point for me, since it provides a "pip install" implementation that runs in a fraction of the time.
mrbungie · a year ago
This. And also try always to fix the version of the requirements, and that's it.

Never had a problem making reproducible builds doing so.

Fethbita · a year ago
I had issues with exactly this method. One of my dependencies was pulled off to a paid model so my project no longer worked.
Alir3z4 · a year ago
Yeah, I assume pinning the version is something everyone does? Or probably many just don't, and they're the ones with the "Python deps management is a mess" drama.

TBH, I've seen tutorials or even some companies simply do `pip freeze > requirements.txt` :shrug: which is a mess.

atoav · a year ago
Then you deploy to an old debian and everything falls apart.
Alir3z4 · a year ago
Not really.

`pyproject.toml` lets you set the minimum Python version. If it's not met, it won't install.
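
i.e. in `pyproject.toml`:

```toml
[project]
requires-python = ">=3.11"  # installers refuse to install on older interpreters
```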

Regardless, majority of the times, deployment is done via Docker.

stavros · a year ago
These days, I am extremely happy to be using uv with single scripts. Put this at the start of your script (which has to end in .py):

(EDIT: Sorry, HN doesn't like code, see the start of https://github.com/skorokithakis/calumny/blob/master/calumny... for an example)

The script will run with uv and automatically create a venv and install all dependencies in it. It's fantastic.

The other alternative, if you want to be extra sure, is to create a pex. It'll even bundle the Python interpreter in the executable (or download it if it's not available on the target machine), and will run anywhere with no other dependency (maybe libc? I forget).

simonw · a year ago
HN displays code just fine if you put four spaces in front of each line, like this:

    #!/usr/bin/env -S uv run
    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "google-api-python-client",
    #     "google-auth-httplib2",
    #     "google-auth-oauthlib",
    #     "selenium",
    #     "webdriver_manager",
    #     "pydantic",
    # ]
    # ///
    import argparse
    import datetime
I built a tool to help me do that: https://observablehq.com/@simonw/wrap-text-at-specified-widt...

Edit: Huh, two spaces works too:

  #!/usr/bin/env -S uv run
  # /// script
  # requires-python = ">=3.12"

Jtsummers · a year ago
Two spaces; Reddit is the four-space one.

  Two spaces
    Four spaces

Flimm · a year ago
Here are the official formatting guidelines: https://news.ycombinator.com/formatdoc
stavros · a year ago
Thanks Simon, I guess it was a brain fart, because I'm sure I've used this formatting a lot in the past.
d4rkp4ttern · a year ago
You can go a step further and have scripts runnable from anywhere without cloning any repo or setting up any venv, directly from PyPI: when you package your lib called `mylib`, define CLI scripts under [project.scripts] in your pyproject.toml,

  [project.scripts]
  mytool = "path.to.script:main"

and publish to PyPI. Then anyone can directly run your script via

  uvx --from mylib mytool

As an example, for Langroid (an open-source agent-oriented LLM lib), I have a separate repo of example scripts `langroid-examples` where I've set up some specific scripts to be runnable this way:

https://github.com/langroid/langroid-examples?tab=readme-ov-...

E.g. to chat with an LLM

  uvx --from langroid-examples chat --model ollama/qwen2.5-coder:32b

or chat with LLM + web-search + RAG

  uvx --from langroid-examples chatsearch --model groq/llama-3.3-70b-versatile

stavros · a year ago
Hm, I think you can just run something with 'uvx <name>' and it'll download and run it, am I misremembering? Maybe it's only when the tool and the lib have the same name, but I think I remember being able to just run 'uvx parachute'.
zahlman · a year ago
>Put this at the start of your script (which has to end in .py):

Yes, that format is specified by PEP 723 "Inline script metadata" (https://peps.python.org/pep-0723/). The cause was originally championed by Paul Moore, one of the core developers of Pip, who authored the competing PEP 722 (see also https://discuss.python.org/t/_/29905) and was advocating for the general idea for long before that.

It's also supported by Pipx, via the `run` subcommand. There's at least one pull request to put it directly into Pip, but Moore doesn't seem to think it belongs there.

stavros · a year ago
Yep! It's just that uv installs everything fast enough to run scripts comfortably.
ethbr1 · a year ago
Fwiw, at the bottom the author recommends uv in almost all situations. (Except for non-Python dependencies, in which case Pixi)
CamperBob2 · a year ago
That looks pretty awesome. What are the drawbacks to using uv? Does it get along with existing pip and conda installations?
stavros · a year ago
I don't use conda, so I don't know, but it works fine with pip, it just makes its own virtualenv somewhere. It doesn't touch anything else.
sunshowers · a year ago
One minor drawback I've noticed is that direnv doesn't come with uv support out of the box: https://github.com/direnv/direnv/issues/1250
d4rkp4ttern · a year ago
Poetry -> UV migration is missing: If you already have a project using Poetry, with a large pyproject.toml with many extras and other settings, there currently isn’t a smooth way to port this to UV. Someone can enlighten me if I missed that.
shlomo_z · a year ago
It is too fast and doesn't give you time to think. :)

Just Kidding! It's amazing. It gets along with existing installations very well.

nextaccountic · a year ago
Why does it have to end with .py?
karl42 · a year ago
It doesn't have to. But if it does not end in .py, you have to add the --script (or -s for short) flag to tell it to interpret the file as a python script.
stavros · a year ago
Otherwise uv doesn't recognize it and goes into an infinite loop.
ethbr1 · a year ago
In before 'Python dependency management isn't hard. You just use {newest Python flavor-of-the-year way} of doing {same thing that is standardized in other languages}.'

Which, fair. Python is and will always be a bazaar.

>> PDM (Edit 14/12/2024) When I shared this article online, I was asked why I did not mention PDM. The honest reason was because I had not heard of it.

Ha! On brand.

lanstin · a year ago
"There should be one-- and preferably only one --obvious way to do it."

It was originally not a bazaar but "batteries included" where the thing you wanted to do had an obvious best way of doing it. An extremely difficult property to maintain over the decades.

thesuperbigfrog · a year ago
So with Python dependency management 'there is more than one way to do it':

pip, pipenv, poetry, conda, setuptools, hatch, micropipenv, PDM, pip-tools, egg, uv, ActiveState platform, homebrew, your operating system's package manager, and many others . . .

Relevant xkcd: https://xkcd.com/1987/

zahlman · a year ago
Of the list you offer, only Poetry, Hatch, PDM and Uv actually do "Python dependency management" - in the sense of offering an update command to recalculate locked versions of dependencies, a wrapper to install or upgrade those dependencies, and a wrapper to keep track of which environment they'll be installed into.

pipenv, micropipenv and pip-tools are utilities for creating records of dependencies, but don't actually "manage" those dependencies in the above sense.

Your list also includes an installer (Pip), a build backend (Setuptools - although it has older deprecated use as something vaguely resembling a workflow tool similar to modern dependency managers), a long-deprecated file format (egg) which PyPI hasn't even accepted for a year and a half (https://packaging.python.org/en/latest/discussions/package-f...), two alternative sources for Python itself (ActiveState and homebrew - and I doubt anyone has a good reason to use ActiveState any more), and two package management solutions that are orthogonal to the Python ecosystem (Conda - which was created to support Python, but its environments aren't particularly Python-centric - and Linux system package managers).

Any system can be made to look complex by conflating its parts with other vaguely related but ultimately irrelevant objects.

zahlman · a year ago
Nobody agrees on what the entire "dependency management" process entails. If they supported everyone's use case in a standard tool, many people would be unhappy with how certain things were done, and many more people would ignore a huge fraction of it.

New workflow tools like Poetry, PDM, Hatch, uv etc. tend to do a lot of wheel reinvention, in large part because the foundational tools are flawed. In principle, you can do everything with single-purpose tools. The real essentials look like:

* Pip to install packages

* venv to create environments

* `build` as a build frontend to create your own distributions

* a build backend (generally specified by the package, and set up automatically by the frontend) to create sdists and wheels for distribution

* `twine` to put sdists and wheels on PyPI

The problems are:

* Determining what to install is hard and people want another tool to do that, and track/update/lock/garbage-collect dependencies

* Keeping track of what venvs you made, and which contains what, is apparently hard for some users; they want a tool to help make them, and use the right one when you run the code, and have an opinion about where to keep them

* Pip has a lot of idiosyncrasies; its scope is both too wide in some places and too narrow in others, it's clunky to use (the UI has never had any real design behind it, and the "bootstrap Pip into each venv" model causes more problems), and it's way too eager to build sdists that won't end up getting installed (which apparently is hard to fix because of the internal structure of the code)

* Setuptools, the default build backend, has a legacy of trying to be an entire workflow management tool, except targeting very old ideas of what that should entail; now it's an absurdly large pile of backwards-compatibility wrappers in order to keep supporting old-fashioned ways of doing things. And yet, it actually does very little in a modern project that uses it: directly calling `setup.py` is deprecated, and most of what you would pass to the `setup` call can be described in `pyproject.toml` instead; yet when you just properly use it as a build backend, it has to obtain a separate package (`wheel`) to actually build a wheel

* Project metadata is atrocious, proximately a standardization issue, but ultimately because legacy `setup.py`-exclusive approaches are still supported

buggy6257 · a year ago
This really just reads like someone who ignored literally every good practice for using python and then pikachu_shocked.jpg when he set his world on fire

Just having a virtual environment and requirements.txt alone would solve 90% of this article.

Also, with Python 3.12 on distros that mark the system environment as externally managed, you literally CAN'T install packages at the system level. Giant full-page warning saying “use a venv you idiot”

I expected something along these lines and was still disappointed by TFA

jeremyjh · a year ago
The author's complaint is that python is supposed to be a good language for people new to programming to pick up. But the default tooling manages dependencies in a way that is unsound, and this has been known for more than a decade. Yet the defaults are still terrible, and regardless of how many articles there are on "best practices" a lot of people get burned.
zahlman · a year ago
People who are new to programming have a long way to go before even the concept of "managing dependencies" could possibly be made coherent for them. And the "unsoundness" described (i.e. not having lockfile-driven workflows by default) really just doesn't matter a huge percentage of the time. I've been writing Python for 20 years and what I write nowadays will still just work on multiple Python versions across a wide range of versions for my dependencies - if it even has any dependencies at all.

But nowadays people seem to put the cart before the horse, and try to teach about programming language ecosystems before they've properly taught about programming. People new to programming need to worry about programming first. If there are any concepts they need to learn before syntax and debugging, it's how to use a command line (because it'll be harder to drive tools otherwise; IDEs introduce greater complexity) and how to use version control (so they can make mistakes fearlessly).

Educators, my plea: if you teach required basic skills to programmers before you actually teach programming, then those skills are infinitely more important than modern "dependency management". And for heavens' sake, you can absolutely think of a few months' worth of satisfying lesson plans that don't require wrapping one's head around full-scale data-science APIs, or heaven forbid machine-learning libraries.

If you need any more evidence of the proper priorities, just look at Stack Overflow. It gets flooded with zero-effort questions dumping some arcane error message from the bowels of Tensorflow, forwarded from some Numpy 2d arrays used as matrices having the wrong shape - and it'll get posted by someone who has no concept of debugging, no idea of any of the underlying ML theory, and very possibly no idea what matrix multiplication is or why it's useful. What good is it to teach "dependency management" to a student who's miles away from understanding the actual dependencies being managed?

For that matter, sometimes they'll take a screenshot of the terminal instead of copying and pasting an error message (never mind proper formatting). Sometimes they even use a cell phone to take a picture of the computer monitor. You're just not going to teach "dependency management" successfully to someone who isn't properly comfortable with using a computer.

tomnipotent · a year ago
I don't see any language in the blog post about "people new to programming to pick up".

In fifteen years of using Python, the only people I see getting burned are, conveniently, the folks writing blogs on the subject. No one I've worked with or hired seems to be running into these issues. It's not to say that people don't run into issues, but the problems seem exaggerated every time this subject comes up.

drunkenmagician · a year ago
Yes! This exactly.
sunshowers · a year ago
The design of requirements.txt is a bit outdated -- it commits the first sin of developer tooling, which is to mix manually edited and automatically generated (via pip freeze) files. Newer systems use separate lockfiles for that reason, and uv brings this state of the art to Python.
askonomm · a year ago
And that makes this, what, the 100th package management solution for Python? A big reason why Python package management sucks is because there are as many package managers as there are frameworks in JavaScript. Nobody ever knows what is the standard, and by the time that information propagates, it is no longer the standard.
otteromkram · a year ago
That's why I have a requirements folder with separate files (eg - dev.txt, prod.txt) for various installation needs. If you want to include test dependencies in development, just add it into the file like you're installing a regular requirements file:

  -r test.txt


And, to double-down, if you read the pip documentation (the second sin of software development?), you can use things other than pip freeze. Like,

   python -m pip list --not-required

That option flag is pretty nice because it lists only the packages that nothing else depends on (aka the top-level packages you actually asked for). If you do that you don't have to worry about dependency management as much.

TZubiri · a year ago
You can use 2 separate requirements files, no?

  pip install -r requirements.txt

  pip freeze > requirements.lock.txt

  pip install -r requirements.lock.txt

janice1999 · a year ago
I think 'pip freeze' was introduced later, and requirements.txt was not designed with such a use in mind. pipenv and other tools have had lockfile equivalents for a while.
zahlman · a year ago
In my experience, one of the biggest factors driving people to have the author's experience is... backwards compatibility.

The old ways of doing things have existed for much longer than the new ways, and become well established. Everyone just accepts the idea of copying Pip into every new virtual environment, even though it's a) totally unnecessary (even before the `--python` option was introduced two years ago, you could sort of get by with options like `--target` and `--python-version` and `--prefix` and `--platform` and `--abi`) and b) utterly insane (slow, wasteful, makes it more confusing when your PATH gets messed up, leads to misconceptions...). And that's before considering the things people try to do that aren't officially blessed use cases - like putting `sudo` and `--break-system-packages` on the same command line without a second thought, or putting code in `setup.py` that actually tries to copy the Python files to specific locations for the user, or trying to run Pip explicitly via its non-existent "API" by calling undocumented stuff instead of just specifying dependencies (including "extras" lists) properly. (The Pip documentation explicitly recommends invoking Pip via `subprocess` instead; but you're still probably doing something wrong if this isn't part of your own custom alternative to Poetry etc., and it won't help you if Pip isn't available in the environment - which it doesn't have to be, except to support that kind of insane use case).

Another part is that people just don't want to learn. Yes, you get a 'Giant full page warning saying “use a venv you idiot”'. Yes, the distro gets to customize that warning and tell you exactly what to do. Users will still give over a million hits to the corresponding Stack Overflow question (https://stackoverflow.com/questions/75608323), which will collect dozens of terrible answers, many of them suggesting complete circumvention of the package-management lock. It was over a year before anything significant was done about the top answer there (disclosure: I contributed quite a bit after that point; I have a personal policy of not writing new answers for Stack Overflow, but having a proper answer at the top of this question was far too important for me to ignore), which only happened because the matter was brought to the attention of the Python Discourse forum community (https://discuss.python.org/t/_/56900).

Starlevel004 · a year ago
> and requirements.txt

The proliferation of requirements.txt files is a massive reason for why Python dependency management sucks.

zahlman · a year ago
You'll be interested in relevant upcoming packaging standards PEPs: https://peps.python.org/pep-0751/ for lock files (still in discussion - have your say at https://discuss.python.org/t/_/69721), and (recently accepted, but I can't immediately point at existing tool support) https://peps.python.org/pep-0735/ for listing groups of dependencies in pyproject.toml.
TZubiri · a year ago
"You don't know what caused the breakage, and you don't know how to go back to a working environment. "

The author would be well served by using the first person and not including us in his uncertainty.

bmitc · a year ago
It still doesn't change the fact that Python's dependency management is horrible. Every other modern language has a single built-in tool for this. Python doesn't: half of the tools come with the core and half come from third parties, with many issues, conflicts, and incompatibilities. Even Poetry, which makes things much, much easier, makes decisions that are incompatible or that make managing dependencies more difficult in some cases.
liontwist · a year ago
npm ecosystem refugees trying out a new platform
stcg · a year ago
One of the biggest usability problems with Python dependencies is that the name you import might be different from the name that you use to install the package.

So if you find some script on the web that has an `import foo` at the top, you cannot just `pip install foo`. Instead, you'll have to do some research into which package was originally used. Maybe it's named `pyfoo` or `foolib`.

Compare that to, for example, Java, which does not have that problem thanks to reverse domain name notation. That is a much better system.

zahlman · a year ago
"install name == import name" cannot work in Python, because when you `pip install foo`, you may get more than one top-level package. Or you may get a single-file module. Or you may, validly per the spec, get no Python code whatsoever. (For example, you could publish large datasets separately from your data science library, as separate wheels which could be listed as optional dependencies.)

The lack of good namespacing practice is a problem. Part of the reason for it, in my estimation, is that developers have cargo-culted around a mistaken understanding of `__init__.py`.

dmart · a year ago
Venvs are so clunky and probably the biggest stumbling block for beginners. There was a proposal for a directory-based node_modules analogue which was unfortunately rejected.

I think that would have been the single biggest improvement to the Python onboarding experience ever.

zahlman · a year ago
> There was a proposal for a directory-based node_modules analogue which was unfortunately rejected.

There were many problems with the proposal. The corresponding discussion (https://discuss.python.org/t/_/963) is worth looking through, despite the length.

Installers like Pip could help by offering to install `--in-new-environment` for the first install, and Brett Cannon (a core developer) has done some work on a universal (i.e. not just Windows) "launcher" (https://github.com/brettcannon/python-launcher) which can automatically detect and use the project's venv if you create it in the right place (i.e., what you'd have to do with __pypackages__ anyway).

silverwind · a year ago
Indeed, python dug its own grave by not supporting in-directory venv.

One can emulate it with tools like poetry and uv, but that incurs a performance penalty: every script has to go through `poetry run` or `uv run`, which often adds a few hundred ms and is unsuitable for performant CLIs.

daemonologist · a year ago
For me the vast majority of the pain occurs when _updating_ dependencies. It can be an all-day chore to find and force versions of everything which will work together, and occasionally you'll have an irreconcilable conflict in subdependencies. I have in the past had to hack in a second version under a different name and update all imports in the parent dependencies to match.

I know other languages have various solutions for this (to basically have package namespaces local to a branch of the dependency tree), but I don't know how much better that experience is.

otteromkram · a year ago
Have you tried using pip? Or, reading the documentation? It has more features than just "install."

You could try running

  python -m pip check
To check dependencies. Or,

  python -m pip inspect 

To get a JSON output of your current virtual environment.

Or, update stuff automatically:

  python -m pip install --upgrade  
Or, skip worrying about dependencies:

  python -m pip install --no-deps
Or, do a dry run installation with full report on what would be installed:

  python -m pip install --ignore-installed --dry-run --quiet --report -

And, there's a lot more than that. Pip is pretty powerful and I'm surprised everyone dislikes it so much.

Hope this helps. Cheers!

zahlman · a year ago
In principle, yes. In practice, there are a lot of issues with that workflow.

For example, `pip install --ignore-installed --dry-run --quiet --report` will build sdists (and run arbitrary code from `setup.py` or other places specified by a build backend) - just so that it can confirm that the downloaded sdist would produce a wheel with the right name and version. Even `pip download` will do the same. I'm not kidding. There are multiple outstanding issues on the tracker that are all ultimately about this problem, which has persisted through multiple versions of the UI and all the evolving packaging standards, going back almost the entire history of Pip.

See for example https://github.com/pypa/pip/issues/1884 ; I have a long list of related reports written down somewhere.

A security researcher was once infamously bitten by this (https://moyix.blogspot.com/2022/09/someones-been-messing-wit...).

daemonologist · a year ago
I appreciate the suggestions, unfortunately my problems aren't really with pip. It's more that packages ship over- or under-specific version requirements (e.g. dependency A wants numpy == 1.24.1, dependency B wants numpy >= 1.24 but actually doesn't work with 1.24.1 for some obscure reason (the authors probably didn't have A installed so they only tested with 1.24.2 and 1.26), but A actually will work fine with 1.25 so you need to override the 1.24.1 requirement). Or sometimes they have perfectly accurate but incompatible requirements and there's no standard or even good way to reconcile them.
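
Some newer resolvers do let you force this kind of thing declaratively. For the numpy case above, uv (as one example) supports overrides that replace what the dependencies themselves declare - a sketch, check the uv docs for the current spelling:

```toml
# pyproject.toml: tell uv to ignore what A and B say and resolve with this numpy
[tool.uv]
override-dependencies = ["numpy==1.25.0"]
```
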
tcfhgj · a year ago
10 hours later:

Invoking SAT with clause count: 9661561

Invoking SAT with clause count: 5164645

roenxi · a year ago
Bit of a Chesterton's fence situation. One of the reasons that Python is popular at all is because the dependency management is terrible and easy to get wrong. The point is it enables relative novices to get started without engaging with important questions. Things like what environmental assumptions are being made, what dependency versions are supported, what a sustainable build pipeline looks like. They can just be vague and their project will work in the here and now - enough to do whatever they want to do.

There is a trade-off here and the equilibrium is probably deliberate. The sort of person who tries to get dependencies right is a professional programmer, and although a lot of them use Python, the language is designed for a much broader audience. Java is an example of much better dependency management by default, and in the main it is only professionals with a view to the long term using Java. Setting up a Java project needs a tutorial and ideally specialist software.

c0redump · a year ago
Can you elaborate on the difficulties of setting up Java projects? I have never worked in Java at an enterprise level, but have tinkered a bit. IntelliJ makes setting up a maven-based project pretty much one-click. But I’m guessing that the complexity that you’re referring to wouldn’t be apparent to someone like me who is only using it for small hobby projects.
roenxi · a year ago
You have to have identified that you need IntelliJ and know what a maven-based project is (indeed, know what a 'project' is, the concept is a bit technical). And there is the split between the JDK & JVM. This is all to be learned before actually approaching the challenge of adding a dependency to the project.

Compare that to a Python beginner where they install Python & use the text editor that exists on their machine & it can be a general purpose text editor rather than something specifically developed to write Java code. There might be one `pip install` along the way, but no requirement to understand a folder layout, project concept or even create a file for dependency management. There is even a reasonable chance that Python is pre-installed on their machine.

xg15 · a year ago
I don't think the OP's complaint was that python's dependency management wasn't complicated enough or doesn't involve enough cryptic XML yet. The point was that the novice will find themselves massively frustrated the moment they try to run their script on a different computer.

Indeed, the novice shouldn't have to make all kinds of intricate choices about the build system to get something running. The language designers should have provided a good set of default choices here. The problem with python is that the default choices aren't actually the good ones.