monster_truck · 4 months ago
I've been burned too many times by embracing open source products like this.

We've been fed promises like these before. They will inevitably get acquired. Years of documentation, issues, and pull requests will be deleted with little-to-no notice. An exclusively commercial replacement will materialize from the new company that is inexplicably missing the features you relied on in the first place.

woodruffw · 4 months ago
For what it's worth, I understand this concern. However, I want to emphasize that pyx is intentionally distinct from Astral's tools. From the announcement post:

> Beyond the product itself, pyx is also an instantiation of our strategy: our tools remain free, open source, and permissively licensed — forever. Nothing changes there. Instead, we'll offer paid, hosted services that represent the "natural next thing you need" when you're already using our tools: the Astral platform.

Basically, we're hoping to address this concern by building a separate sustainable commercial product rather than monetizing our open source tools.

abdullahkhalids · 4 months ago
I believe that you are sincere and truthful in what you say.

Unfortunately, the integrity of employees is no guard against the greed of investors.

Maybe next year the investors change the CEO and the entire management, and they start monetizing the open source tools. There is no way of knowing. But history tells us that there is a non-trivial chance of this happening.

wrasee · 4 months ago
The uncertainty over a future rug pull is always real, but I wonder if the actual reason for people's hesitancy is more than just that. I suspect it's closer to one of simple identity and the ownership model itself. Just the very idea that core tooling you depend on is in the hands of a commercial company is enough to make many back off, in a way they might not when the tooling is in the hands of a broader community that one can support on more equal terms.

@woodruffw I love your quote above that commits you to your open source base, and I'm rooting for you. But how about an approach that commits you to that sentence in a more rigorous and legal way, and spins off your open source tooling to a separate community-based entity? Of course, you could continue to maintain sufficient representation to make Astral's commercial products the natural progression, and otherwise the model remains the same. That would be a significant transfer of control, but it is that very transfer that would get an overwhelming response from the community and could really unblock these great tools for massive growth.

I work a lot with LLVM/Clang, and whilst I know Apple and Google are significant contributors, I feel confident that LLVM itself exists outside of them, while accepting that e.g. Apple's contributions afford it the weight to steer the project in ways that match its interests in e.g. Swift and Apple tooling.

BrenBarn · 4 months ago
It makes sense, but the danger can come when non-paying users unwittingly become dependent on a service that is subsidized by paying customers. What you're describing could make sense if pyx is only private, but what if there is some kind of free-to-use pyx server that people start using? They may not realize they're building on sand until the VC investors start tightening the screws and insist you stop wasting money by providing the free service.

(Even with an entirely private setup, there is the risk that it will encourage too much developer attention to shift to working within that silo and thus starve the non-paying community of support, although I think this risk is less, given Python's enormous breadth of usage across communities of various levels of monetization.)

physicsguy · 4 months ago
Conda said all this as well, and solved the same issues you're trying to solve, namely precompiled versions of difficult-to-build packages. It then went commercial.

In the HPC space there are already EasyBuild and Spack, which make all the compiler toolchain and C and Fortran library dependency stuff very easy. They just haven't taken off outside HPC because they aim to solve cluster management problems, but Spack is easy to self-serve.

o11c · 4 months ago
The entire reason people choose "permissive licenses" is so that the openness won't last forever. At best, the community can fork the old version, without any future features.

Only viral licenses are forever.

zemo · 4 months ago
> Basically, we're hoping to address this concern by building a separate sustainable commercial product rather than monetizing our open source tools.

jfrog artifactory suddenly very scared for its safety

zahlman · 4 months ago
Will pyx describe a server protocol that could be implemented by others, or otherwise provide software that others can use to host their own servers? (Or maybe even that PyPI can use to improve its own offering?) That is, when using "paid, hosted services like pyx", is one paying for the ability to use the pyx software in and of itself, or is one simply paying for access to Astral's particular server that runs it?
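
For context, the existing standard here is the "simple repository API" (PEP 503, plus the JSON form from PEP 691), which anyone can implement or consume; whether pyx will speak it or define its own protocol is exactly the open question. A minimal client sketch against PyPI's implementation, just to show what such a protocol looks like:

    import json
    import urllib.request

    def list_files(project: str, index: str = "https://pypi.org/simple") -> list[str]:
        # PEP 691: request the JSON form of a project's index page.
        req = urllib.request.Request(
            f"{index}/{project}/",
            headers={"Accept": "application/vnd.pypi.simple.v1+json"},
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        return [f["filename"] for f in data["files"]]

    print(list_files("uv")[:5])  # first few published artifacts for 'uv'
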
threatofrain · 4 months ago
The hosting and administration part is what’s expensive and can’t be free and open source except when someone pays for it. So is npm open source? Whether it is or isn't doesn't matter as much as whether Microsoft continues to pay the bill.
dataflow · 4 months ago
I think the only way you can alleviate people's concerns here is by legally binding the company in a way that would adequately address those concerns. Which I don't recall ever seeing any company attempt in such a scenario, probably because it obviously devalues the company as it restricts what a potential future buyer could do with it. Which is exactly why some people don't buy such pinky promises.
pbowyer · 4 months ago
This is the way it's worked for PHP's composer packages, and it's worked well.
crystal_revenge · 4 months ago
I love using uv, but having worked for a VC funded open source startup, your concerns are spot on.

As soon as there is a commercial interest competing with the open source project at the same company the OSS version will begin to degrade, and often the OSS community will be left in the dark about this. The startup I was at had plenty of funding, far too many engineers, and still removed basically every internal resource from the oss project except one person and drove out everyone working on the community end of things.

I would also recommend avoiding working for any open source startup if your goal is to get paid to contribute to a community project. Plenty of devs will take a reduced salary to work on a great community project, but most of the engineers I saw definitely got the "bait and switch" and moved immediately to commercial projects.

mnazzaro · 4 months ago
This is a valid concern, but astral just has an amazing track record.

I was surprised to see the community here on HN responding so cautiously. I've been developing in Python for about a decade now; whenever Astral does something, I get excited!

halfcat · 4 months ago
> This is a valid concern, but astral just has an amazing track record.

The issue is, track record is not relevant when the next investors take over.

JimDabell · 4 months ago
Frankly, it’s weird. You can find this business model all over the open-source world but for some reason Astral in particular is singled out for way more criticism on this than anything else I’ve seen, despite being unambiguously great contributors who have never put a foot wrong as far as I can tell.

Microsoft – who invented embrace, extend, and extinguish – own NPM, but I don’t see people wringing their hands over them in every thread that mentions NPM. But you mention Astral here or on Reddit and people line up to tell you it’s only a matter of time before they fuck people over. Why the disparity?

unmole · 4 months ago
HN comments default to cynicism.
sevensor · 4 months ago
It’s great that astral are doing what they do, but it’s important to hedge against their success. We must maintain ecosystem fragmentation, or Astral becomes a choke point where a bad actor can seize the entire Python community and extract rent from it. Like Anaconda but successful. So keep using pylint, keep using mypy, keep using PyPI. Not exclusively, but as a deterrent against this kind of capture.
nerdponx · 4 months ago
Has that ever happened in the Python ecosystem specifically? It seems like there would be a community fork led by a couple of medium-size tech companies within days of something like that happening, and all users except the most enterprise-brained would switch.

ActorNightly · 4 months ago
I agree. If any of this stuff were worthwhile to pursue, it would be merged into pip.
zahlman · 4 months ago
Pyx represents the server side, not the client side. The analogue in the pre-existing Python world is PyPI.

Many ideas are being added to recent versions of pip that are at least inspired by what uv has done — and many things are possible in uv specifically because of community-wide standards development that also benefits pip. However, pip has some really gnarly internal infrastructure that prevents it from taking advantage of a lot of uv's good ideas (which in turn are not all original). That has a lot to do with why I'm making PAPER.

For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process. It remembers entire wheels, but finding them in that cache requires knowing the URL from which they were downloaded. Because PyPI organizes the packages in its own database with its own custom URL scheme, pip would have to reach out to PyPI across the Internet in order to figure out where it put its own downloads!
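
As an illustrative sketch (not uv's or pip's actual code) of why a content-addressed cache plus hard links makes reinstalls nearly free, the whole trick is roughly:

    import hashlib
    import os
    import shutil

    CACHE = os.path.expanduser("~/.cache/demo-wheel-cache")  # hypothetical layout

    def cache_file(src: str) -> str:
        # Key the cache by content hash, not by download URL, so a lookup
        # never requires asking the index where a file originally came from.
        with open(src, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        dst = os.path.join(CACHE, digest)
        if not os.path.exists(dst):
            os.makedirs(CACHE, exist_ok=True)
            shutil.copy(src, dst)
        return dst

    def install(cached: str, site_packages: str, name: str) -> None:
        # A hard link makes the "install" a metadata operation: no bytes are
        # copied, and every environment shares the same blocks on disk.
        os.link(cached, os.path.join(site_packages, name))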

woodruffw · 4 months ago
This doesn't generalize: you could have said the same thing about pip versus easy_install, but pip clearly has worthwhile improvements over easy_install that were never merged back into the latter.
nilamo · 4 months ago
Pip's search has been broken for years, and they're uninterested in fixing it, or even removing it or replacing it with a message/link to the package index.

imo, if pip's preference is to ship broken functionality, then what is/is not shipped with pip is not meaningful.

benrutter · 4 months ago
I'm assuming by pip you mean PyPI (the package registry). I think you're making the mistake of thinking it has a lot more resources than it does, because of how much it punches above its weight.

PyPI is powered by Warehouse, which has around 3 developers maintaining it [0]. They're doing an awesome job, but the funding and resources available to them are probably substantially less than Astral could have with a paid offering.

[0] https://github.com/pypi/warehouse/graphs/contributors

lijok · 4 months ago
It’s not that complex - just try it
rmonvfer · 4 months ago
To be honest, this was just a matter of time. As a long-time Python developer, I just can't wrap my head around the lack of something like this. GitHub was going to get hosted packages for Python but never did, because it "didn't align with their strategy objectives and a reallocation of resources" [1] (or some other similar corpospeak).

Astral is a great company, and I think we can't question what they've achieved and provided to the Python community. uv is a game changer and solves one of the core issues with Python by providing a unified tool that's also fast, reliable, and easy to use. In fact, after using uv for the first time (coming from a combination of pyenv + poetry) I never wanted to go back, and this is something all of my peers have experienced too.

I'm glad it's Astral who is doing this, and of course they will have to make money one way or another (which is perfectly fine, and I don't think anyone on this forum can be against that, as long as they are actually providing real value), but I was honestly tired of the paralysis on this matter. I did try to build a registry (pyhub.net), but being just one person with almost no resources and having another full-time business made it impossible.

Anyway, congrats to the team for the effort!

[1] https://github.com/orgs/community/discussions/8542
westurner · 4 months ago
Is this problem also solved by storing software artifacts in OCI container image registries that already support SLSA-compliant TUF signatures?
miraculixx · 4 months ago
Anaconda solved the same problem ~10+ years ago already.
in9 · 4 months ago
HAHAHAH, don't even get me started on how bad Anaconda is: how slow the installer and interpreter are, how they avoided being part of the usual pip workflow, the bloated environments, the cross-platform inconsistencies, the extremely slow dependency resolution, etc. etc. etc...
tylfin · 4 months ago
Posit has solved similar problems with their Package Manager as well, the benefit being that it's hosted on-prem, but the user has to build wheels for their desired architecture (if they're not on pypi).
hexo · 4 months ago
To be honest, I'll never use uv. Python ecosystem tools should be in Python.
jpambrun · 4 months ago
This is very close-minded. It's best to avoid statements like that.

I feel like having a working Python environment is not a great prerequisite for managing your Python environment.

gryn · 4 months ago
Is this ragebait?

Most of the stuff in the Python ecosystem has a core built in C, including the language itself.

Kevcmk · 4 months ago
Definitely has never seen/used uv
cma256 · 4 months ago
Wait until you find out what language Python is written in.
LucasOe · 4 months ago
Why?
miraculixx · 4 months ago
+1
ljm · 4 months ago
This is quite a disappointing self-limitation given the improvements uv brings to the table. You’re missing out on some good stuff.
mbonnet · 4 months ago
this is the kind of statement that will get anyone who works for me PIPd
runningmike · 4 months ago
All Python packaging challenges are solved. The lesson learned is that there is not a single solution for all problems. Getting more strings attached to VC-funded companies and leaning on their infrastructure is a high risk for any FOSS community.
bastawhiz · 4 months ago
Well, I started with pip because it's what I was told to use. But it was slow and had footguns. And then I started using virtualenv, but that only solved part of the problem. So I switched to conda, which sometimes worked but wrecked my shell profile and often led to things mysteriously using the wrong version of a package. So someone told me to use pipenv, which was great until it was abandoned and picked up by someone who routinely broke the latest published version. So someone told me to use poetry, but it became unusably slow. So I switched back to pip with the built-in venv, but now I have the same problems I had before, with fewer features. So I switched to uv, because it actually worked. But the dependency I need is built and packaged differently for different operating systems and flavors of GPU, and now my coworkers can't get the project to install on their laptops.

I'm so glad all the Python packaging challenges are "solved"

Eduard · 4 months ago
I started with "sudo apt install python" a long time ago and this installed python2. This was during the decades-long transition from python2 to python3, so half the programs didn't work so I installed python3 via "sudo apt install python3". Of course now I had to switch between python2 and python3 depending on the program I wanted to run, that's why Debian/Ubuntu had "sudo update-alternatives --config python" for managing the symlink for "python" to either python2 or python3. But shortly after that, python3-based applications also didn't want to start with python3, because apt installed python3.4, but Python developers want to use the latest new features offered by python3.5 . Luckily, Debian/Ubuntu provided python3.5 in their backports/updates repositories. So for a couple of weeks things sort of worked, but then python3.7 was released, which definitely was too fresh for being offered in the OS distribution repositories, but thanks to the deadsnakes PPA, I could obtain a fourth-party build by fiddling with some PPA commands or adding some entries of debatable provenance to /etc/apt/lists.conf. So now I could get python3.7 via "sudo apt install python3.7". All went well again. Until some time later when I updated Home Assistant to its latest monthly release, which broke my installation, because the Home Assistant devs love the latest python3.8 features. And because python3.8 wasn't provided anymore in the deadsnakes PPA for my Ubuntu version, I had to look for a new alternative. Building python from source never worked, but thank heavens there is this new thing called pyenv (cf. pyenv), and with some luck as well as spending a weekend for understanding the differences between pyenv, pyvenv, venv, virtualenv (a.k.a. python-virtualenv), and pyenv-virtualenv, Home Assistant started up again.

This wall of text is an abridged excursion of my installing-python-on-Linux experience.

There is also my installing-python-on-Windows experience, which includes: official installer (exe or msi?) from python.org; some Windows-provided system application python, installable by setting a checkbox in Windows's system properties; NuGet, winget, Microsoft Store Python; WSL, WSL2; anaconda, conda, miniconda; WinPython...

anothernewdude · 4 months ago
I felt like python packaging was more or less fine, right up until pip started to warn me that I couldn't globally install packages anymore. So now I need to make a billion venvs to install the same ML and plotting libraries and dependencies that I don't want in a requirements.txt for the project.

I just want packaging to fuck off and leave me alone. Changes here are always bad, because they're changes.
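
(For what it's worth, the standard library can soften this particular pain: a single shared venv created with system_site_packages=True still sees globally installed packages, so a common ML/plotting stack doesn't need reinstalling per project. A minimal sketch; the path is hypothetical:)

    import venv

    # One shared environment that can also see system-wide site-packages.
    builder = venv.EnvBuilder(system_site_packages=True, with_pip=True)
    builder.create("/home/me/.shared-venv")  # hypothetical location
    # /home/me/.shared-venv/bin/python now resolves imports from both its
    # own site-packages and the global ones.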

seriocomic · 4 months ago
I've walked the same rocky path and have the bleeding feet to show for it! My problem is that now my packaging/environment mental model is so muddled I frequently mix up the commands...
integralid · 4 months ago
What's wrong with just using virtualenv? I never used anything else, and I never felt the need to. Maybe it's not as shiny as the other tools, but it just works.
btreecat · 4 months ago
> But the dependency I need is built and packaged differently for different operating systems and flavor of GPU, and now my coworkers can't get the project to install on their laptops.

This is why containers are great IMO.

It's meant to solve the problem of "well it works on my machine"

frollogaston · 4 months ago
Even the way you import packages is kinda wack
arkh · 4 months ago
Coming from the PHP ecosystem, this kind of package manager problem feels crazy.

Maybe Python and JS people should just use composer too.

voicedYoda · 4 months ago
You forgot the wheels and eggs
yulyavaluy · 4 months ago
We actually built a platform that eliminates all these steps; you can now reproduce GitHub repos with zero manual config in 60% of cases. Check https://x.com/KeploreAI for more info. We just launched it and are waiting for the first users to be astonished :). Let me know if you have any questions.

aledalgrande · 4 months ago
Man I used python sparingly over the years and I still had to deal with all those package manager changes. Worse than the JS bundling almost?
computershit · 4 months ago
> All python packaging challenges are solved.

This comes across as uninformed at best and ignorant at worst. Python still doesn't have a reliable way to handle native dependencies across different platforms. pip and setuptools cannot be the be-all and end-all of this packaging ecosystem, nor should they be.

_the_inflator · 4 months ago
"across different platforms"

First things first:

Import path, os

I love Python, the Zen of it, and you really need to accept the fact that there are conventions (quite a lot of them) and that bash or shell scripts are where the magic happens, like environment variables, if you know how to secure your app.

Even the self thing finally makes sense after years of bewilderment ("Wait: not even Java is that brutal to its users.")

Lately I stumbled over poetry after really getting the gist of venv and pip.

Still hesitant, because Windows doesn't play a role.

benreesman · 4 months ago
Try doing CUDA stuff. It's a chemical fire. And the money from solving it would fund arbitrary largesse towards OSS in perpetuity.
dirkc · 4 months ago
I see VC money as an artificial force propping up a project. It is not bad per se, but VC money is not a constant and it leaves a big drop at the end. If there is a big enough community that has grown around the project, that drop might be okay.
tempest_ · 4 months ago
I share your concern, but I have saved so much time with uv already that I figure I'll ride it till the VC enshittification kills the host.

Hopefully by that point the community is centralized enough to move in one direction.

NegativeLatency · 4 months ago
I've been heartened by the progress that opentofu has made, so I think if it gets enough momentum it could survive the inevitable money grab
anitil · 4 months ago
I agree, now I just use uv and forget about it. It does use up a fair bit of disk, but disk is cheap and the bootstrapping time reduction makes working with python a pleasure again

JonChesterfield · 4 months ago
I've been dealing with python vs debian for the last three hours and am deeply angry with the ecosystem. Solved it is not.

Debian decided you should use venv for everything. But when packages are installed in a venv, random cmake nonsense does not find them. There are apt-get level packages; some things find those, others do not. Names are not consistent. There's a thing called pipx, which my console recommended, for much the same experience. Also, the vestiges of 2 vs 3 are still kicking around, in the form of refusing to find a package based on the number being present or absent.
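
(One small diagnostic that makes the venv/apt split visible: ask each interpreter where it actually looks. A build tool only finds venv packages if it queries the venv's interpreter rather than /usr/bin/python3. A minimal sketch:)

    import sys
    import sysconfig

    # Run this under both /usr/bin/python3 and .venv/bin/python to see why a
    # build tool pointed at the wrong interpreter finds nothing.
    print(sys.executable)                 # which interpreter is active
    print(sysconfig.get_path("purelib"))  # its site-packages directory
    print(sysconfig.get_path("include"))  # its C header directory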

Whatever c++headerparser might be, I'm left very sure that hacking python out of the build tree and leaving it on the trash heap of history is the proper thing to do.

nemomarx · 4 months ago
From what I hear, uv is the "solved" option, and venv by hand is the old way.
Imustaskforhelp · 4 months ago
uv truly is great, and I mean they are open source and we can always fork it, just as Valkey forked Redis.

And if you mean that pyx might be hosted on uv: I think the discussion could go towards pyx being made open source, but honestly, I am pretty sure someone will look at pyx and create a pyx-API-compliant hosted server. I am still curious how pyx works and what it actually does.

chaostheory · 4 months ago
No. This is the only thing that python still doesn’t have just working. Otherwise there would be no excitement for anything new in this space.
mbonnet · 4 months ago
If Python packaging problems are solved, why is Python known for having the worst tooling ecosystem of any "modern" language?
miraculixx · 4 months ago
+1

zzzeek · 4 months ago
sorry, I guess you're new here? Here, try this Kool Aid. I think it will help you fit in. oh don't mind that "MongoDB" logo on the glass that's old
m_kos · 4 months ago
> Why is it so hard to install PyTorch, or CUDA, or libraries like FlashAttention or DeepSpeed that build against PyTorch and CUDA?

This is so true! On Windows (and WSL) it is also exacerbated by some packages requiring the use of compilers bundled with outdated Visual Studio versions, some of which are only available by manually crafting download paths. I can't wait for a better dev experience.

giancarlostoro · 4 months ago
Stuff like that led me fully away from Ruby (due to Rails), which is a shame. I see videos of people chugging along with Ruby and loving it, and it looks like a fun language, but when the only way I can get a dev environment set up for Rails is using DigitalOcean droplets, I've lost all interest. It would always fail at compiling something for Rails. I would have loved to partake in the Rails hype back in 2012, but over the years the install/setup process was always a nightmare.

I went with Python because I never had this issue. Now with any AI / CUDA stuff it's a bit of a nightmare, to the point where you use someone's setup shell script instead of trying to use pip at all.

awesome_dude · 4 months ago
Lets be honest here - whilst some experiences are better/worse than others, there doesn't seem to be a dependency management system that isn't (at least half) broken.

I use Go a lot, the journey has been

- No dependency management

- Glide

- Dep

- I forget the name of the precursor - I just remembered, VGo

- Modules

We still have proxying, vendoring, versioning problems

Python: VirtualEnv

Rust: Cargo

Java: Maven and Gradle

Ruby: Gems

Even OS dependency management is painful - yum, apt (which was a major positive when I switched to Debian based systems), pkg (BSD people), homebrew (semi-official?)

Dependency management in the wild is a major headache. Go (which I only mention because I am most familiar with it) did away with some compilation dependency issues by shipping binaries with no dependencies - meaning it doesn't matter which version of Linux you built your binary for, it will run on any Linux of the same arch; none of that "wrong libc" 'fun'. But you still have issues when two different people building the same binary need extra dependency management (vendoring brings with it caching problems: is the version in the cache up to date? Will updating one version of one dependency break everything? What fun).

ilvez · 4 months ago
Do I get it right that this issue is within Windows? I've never heard of the issues you describe while working with Linux. I've seen people struggle with macOS a bit due to brew having different versions of some library or other, mostly when self-compiling Ruby.
avidphantasm · 4 months ago
Have you tried conda? Since the integration of mamba its solver is fast and the breadth of packages is impressive. Also, if you have to support Windows and Python with native extensions, conda is a godsend.
viraptor · 4 months ago
I would recommend learning a little bit of C compilation and build systems. Ruby/Rails is about as polished as you could get for a very popular project. Maybe libyaml will be a problem once in a while if you're compiling Ruby from scratch, but otherwise this normally works without a hassle. And those skills will apply everywhere else. As long as we have C libraries, this is about as good as it gets, regardless of the language/runtime.
nurettin · 4 months ago
Have you tried JRuby? It might be a bit too large for your droplet, but it has the java versions of most gems and you can produce cross-platform jars using warbler.
chao- · 4 months ago
I'm surprised to hear that. Ruby was the first language in my life/career where I felt good about the dependency management and packaging solution. Even when I was a novice, I don't remember running into any problems that weren't obviously my fault (for example, installing the Ruby library for PostgreSQL before I had installed the Postgres libraries on the OS).

Meanwhile, I didn't feel like Python had reached the bare minimum for package management until Pipenv came on the scene. It wasn't until Poetry (in 2019? 2020?) that I felt like the ecosystem had reached what Ruby had back in 2010 or 2011 when bundler had become mostly stable.

poly2it · 4 months ago
Have you tried Nix?

https://nixos.org

bytehumi · 4 months ago
This is the right direction for Python packaging, especially for GPU-heavy workflows. Two concrete things I'm excited about:

1) curated, compatibility-tested indices per accelerator (CUDA/ROCm/CPU), so teams stop bikeshedding over torch/cu* matrices, and

2) making metadata queryable, so clients can resolve up front and install in parallel.

If pyx can reduce the "pip trial-and-error" loop for ML by shipping narrower, hardware-targeted artifacts (e.g., SM/arch-specific builds) and predictable hashes, that alone saves hours per environment. Also +1 to keeping the tools OSS and monetizing the hosted service; clear separation builds trust.

Curious: will pyx expose dependency-graph and reverse-dependency endpoints (e.g., "what breaks if X→Y?") and SBOM/signing attestation for supply-chain checks?
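
(To make "queryable metadata" concrete, here is a minimal sketch against PyPI's existing JSON API; pyx's actual endpoints aren't public, so this only shows the general idea of resolving up front:)

    import json
    import urllib.request

    def requires_dist(package: str) -> list[str]:
        # PyPI's JSON API exposes a package's declared dependencies,
        # letting a client plan a resolution before downloading wheels.
        url = f"https://pypi.org/pypi/{package}/json"
        with urllib.request.urlopen(url) as resp:
            info = json.load(resp)["info"]
        return info.get("requires_dist") or []

    print(requires_dist("torch"))  # declared dependency specifiers for torch
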
int_19h · 4 months ago
Given that WSL is pretty much just Linux, I don't see what relevance Visual Studio compiler versions have to it. WSL binaries are always built using Linux toolchains.

At the same time, even on Windows, libc has been stable since Win10 - that's 10 years now. Which is to say, any binary compiled by VC++ 2015 or later is C-ABI-compatible with any other such binary. The only reasons why someone might need a specific compiler version is if they are relying on some language features not supported by older ones, or because they're trying to pass C++ types across the ABI boundary, which is a fairly rare case.

m_kos · 4 months ago
If you have to use, e.g., CUDA Toolkit 11.8, then you need a specific version of VS and its build tools for CUDA's VS integration to work. I don't know why exactly that is and I wish I didn't have to deal with it.
morkalork · 4 months ago
This was basically the reason to use anaconda back in the day.
setopt · 4 months ago
In my experience, Anaconda (including Miniconda, Micromamba, IntelPython, et al.) is still the default choice in scientific computing and machine learning.
IHLayman · 4 months ago
Anaconda was a good idea until it would break apt on Ubuntu and make my job that much harder. That became the reason _not_ to use Anaconda in my book.

venv made these problems start to disappear, and now uv and Nix have closed the loop for me.

miraculixx · 4 months ago
Windows is the root cause here, not pip
miohtama · 4 months ago
In the past, part of the definition of an operating system was that it ships with a compiler.
int_19h · 4 months ago
When was that ever a part of the definition? It was part of the early Unix culture, sure, but even many contemporary OSes didn't ship with compilers, which were a separate (and often very expensive!) piece of software.

OTOH today most Linux distros don't install any dev tools by default on a clean install. And, ironically, a clean install of Windows has .NET, which includes a C# compiler.

simonw · 4 months ago
This is effectively what Charlie said they were going to build last September when quizzed about their intended business model on Mastodon: https://hachyderm.io/@charliermarsh/113103564055291456
pietroppeter · 4 months ago
And this fact effectively builds trust in the vision and in execution.
_verandaguy · 4 months ago
Soon: there are 14 competing Python packaging standards.

This is a joke, obviously. We've had more than 14 for years.

woodruffw · 4 months ago
Python packaging has a lot of standards, but I would say most of them (especially in the last decade) don't really compete with each other. They lean more towards the "slow accretion of generally considered useful features" style.

This itself is IMO a product of Python having a relatively healthy consensus-driven standardization process for packaging, rather than an authoritative one. If Python had more of an authoritative approach, I don't think the language would have done as well as it has.

(Source: I've written at least 5 PEPs.)

_verandaguy · 4 months ago
There are highs and lows to the system, just like with any system. Pip overall was a great package manager like 15 years ago, and a big step up from easy_install for sure (n.b., I started programming around the time easy_install was going out of fashion, so my point of view is coloured by that timing).

That said, it'd be nice if pip (or some PSF-blessed successor) adopted a model more similar to that offered by poetry (and formerly, pipenv, and now, I guess, uv), at least for package locking. `pip freeze > requirements.txt` isn't fit for purpose in the age of supply chain attacks, unfortunately, and PyPI already offers a reasonably workable backend for this model, whether the community at large agrees or not. There are objective benefits (the biggest being better code integrity guarantees) that outweigh the (largely performance-related) drawbacks.
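
(pip can already enforce this via hash-pinned requirements and --require-hashes; what's missing is a first-class locker that generates them. The core integrity property is just this, sketched with hypothetical file and digest values:)

    import hashlib

    def verify(path: str, expected_sha256: str) -> bool:
        # Recompute the artifact's digest and compare against the pinned
        # value from the lockfile before trusting or installing it.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest() == expected_sha256

    # Hypothetical lock entry: filename plus its expected digest.
    ok = verify("example-1.0-py3-none-any.whl", "ab12cd34" * 8)
    print("hash matches" if ok else "hash mismatch: refuse to install")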

callc · 4 months ago
Do you really think Python’s consensus-driven language development is better than authoritarian?

I am honestly tired of the Python packaging situation. I breathe a sigh of relief in languages like Go and Rust with an "authoritative" built-in solution.

I wouldn’t mind the 30 different packaging solutions as long as there was authoritative “correct” solution. All the others would then be opt-in enhancements as needed.

I guess a good thought experiment would be if we were to design a packaging system (or decide not to) for a new PL like python, what would it look like?

jsmeaton · 4 months ago
Astral folks that are around - there seems to be a bit of confusion in the product page that the blog post makes a little more clear.

> The next step in Python packaging

The headline is the confusing bit I think - "oh no, another tool already?"

IMO you should lean into stating this is going to be a paid product (answering how you plan to make money and become sustainable), and highlight that this will help solve private packaging problems.

I'm excited by this announcement by the way. Setting up scalable private python registries is a huge pain. Looking forward to it!

zanie · 4 months ago
Thanks for the feedback!
divbzero · 4 months ago
The combination of

– “client (uv) and server (pyx)” and

– “You can use it to host your own internal packages, or as an accelerated, configurable frontend to public sources like PyPI and the PyTorch index.”

is what really helped me understand what pyx aims to be.

IshKebab · 4 months ago
I would also put this list of issues that this fixes higher. It makes it more obvious what the point is. (And also a setuptools update literally broke our company CI last week so I was like "omg yes" at that point.)
ctoth · 4 months ago
As I said a couple weeks ago, they're gonna have to cash out at some point. The move won't be around uv -- it'll be a protected private PyPI or something.

https://news.ycombinator.com/item?id=44712558

Now what do we have here?

snooniverse · 4 months ago
Not sure what you're trying to get at here. Charlie Marsh has literally said this himself; see e.g. this post he made last September:

> "An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry."

https://hachyderm.io/@charliermarsh/113103605702842937

Astral has been very transparent about their business model.

MoreQARespect · 4 months ago
Astral doesn't really have a business model yet, it has potential business models.

The issue is that there isn't a clean business model that will produce the kind of profits that will satisfy their VCs - not that there isn't any business model that will help support a business like theirs.

Private package management would probably work fine if they hadn't taken VC money.

eldenring · 4 months ago
Cash out is a bit of a negative word here. They've shown the ability to build categorically better tooling, so I'm sure a lot of companies would be happy to pay them to fix even more of their problems.
klysm · 4 months ago
It’s not negative, it’s accurate. The playbook is well known and users should be informed.
kinow · 4 months ago
I haven't adopted uv yet; I'm watching to see what their next move will be. We recently had to review our use of Anaconda tools due to their changes, then review Qt's license changes. Not looking forward to another license ordeal.
zanie · 4 months ago
We're hoping that building a commercial service makes it clear that we have a sustainable business model and that our tools (like uv) will remain free and permissively licensed.

(I work at Astral)

__mharrison__ · 4 months ago
You know what they say: The best time to adopt uv was last year...

In all seriousness, I'm all in on uv. Better than any competition by a mile. Also makes my training and clients much happier.

lenerdenator · 4 months ago
Fortunately for a lot of what uv does, one can simply switch to something else like Poetry. Not exactly a zero-code lift but if you use pyproject.toml, there are other tools.

Of course if you are on one of the edge cases of something only uv does, well... that's more of an issue.

int_19h · 4 months ago
Given how widely popular uv is, I'm pretty sure that in the event of any impactful license change it would immediately get forked.