IceHegel · 3 years ago
If I see some JS, Go, or Rust code online I know I can probably get it running on my machine in less than 5 min. Most of the time, it's a ‘git clone’ and a 'yarn' | 'go install' | 'cargo run', and it just works.

With python, it feels like half the time I don't even have the right version of python installed, or it’s somehow not on the right path. And once I actually get to installing dependencies, there are often very opaque errors. (The last 2 years on M1 were really rough)

Setting up Pytorch or Tensorflow + CUDA is a nightmare I've experienced many times.

Having so many ways to manage packages is especially harmful for python because many of those writing python are not professional software engineers, but academics and researchers. If they write something that needs, for example, CUDA 10.2, Python 3.6, and a bunch of C audio drivers - good luck getting that code to work in less than a week. They aren’t writing install scripts, or testing their code on different platforms, and the python ecosystem makes the whole process worse by providing 15 ways of doing basically the same thing.

My proposal:

- Make poetry part of pip

- Make local installation the default (must pass -g for global)

- Provide official first party tooling for starting a new package

- Provide official first party tooling for migrating old dependency setups to the new standard

edit: fmt

kapitanjakc · 3 years ago
What you feel with Python is what I feel with JS:

managing npm and package.json, which always seems to run into dependency issues.

With Python it has always been `pip install package` and we're done.

Although I do share your pain with versioning. I once spent a week debugging an issue, only to find out that a fixed version was already available.

chrismorgan · 3 years ago
I find a difference between managing and using.

As a user of an application: `npm install` just works (same with `cargo build`). For Python, I’ll probably do `python -m venv env; . env/bin/activate` and then, well, it’s probably `pip install -r requirements.txt`, but sometimes it’ll be other things, and there are just too many options. I may well add --ignore-installed and use some packages installed locally of potentially different versions, e.g. for setting up Stable Diffusion recently (first time I’ve ever used the dGPU on my laptop) I wanted to use the Arch Linux packages for PyTorch and the likes.

For managing dependencies in an existing library or application (as distinct from starting from scratch, where you’ll have to make several extra choices in Python, and where npm is badly messed up for libraries), both npm and Python are generally fairly decent, but the whole you-can-only-have-one-version-of-a-library thing from Python lends itself more to insurmountable problems.

My personal background: many years of experience with both npm and Python, but I’ve never done all that much packaging in either. (Also many years of Rust, and I have done packaging there, and it’s so much easier than both.)

moring · 3 years ago
> With python it always has been pip install package and we are done.

The worst I encountered was a somewhat lengthy manual to build a project (cannot remember what it was), and not only did you have to manually install all required dependencies -- npm automates this by having an npm-readable list in package.json -- but after running all those commands, the last note said something like "oh BTW, whenever we wrote "pip" above we actually meant "pip3", use that instead, and if you did it wrong, there's no way to undo it."

jkrubin · 3 years ago
Once I switched to pnpm many of those problems went away for me. I manage a slew of js/ts monorepos and pnpm is a god send, not because of performance or workspaces, but because things get resolved sanely.
DangitBobby · 3 years ago
I used to feel this way in the JS ecosystem. It's been quite a while since I've encountered something insurmountable (unless it's something like Bit using bespoke packaging).
qbasic_forever · 3 years ago
Libraries that use native code in JS or C-bindings in Go are equally annoying to get going if the documentation from the author is sub-par. In the JS world you'll get pages and pages of node-gyp errors that are completely incomprehensible until you realize the author never mentioned that you needed a half dozen little development library dependencies installed. Native C/C++ library code interfacing just sucks in general because there is zero package or dependency management in that world too.
smeagull · 3 years ago
I think some of the pain has been Apple's fault. Requiring conda for some official packages means you always have two competing ecosystems on one machine - which is just asking for pain.
greymalik · 3 years ago
What packages are you referring to? I’ve been doing professional Python dev on a Mac for the past three years and have never had a reason to use conda, so I’m curious what I’m missing.
thrown_22 · 3 years ago
Installing Python only applications is trivial. What you're complaining about is all the missing code in other languages which isn't controlled by Python and depends on the OS to provide.

This is why we created Linux distributions in the first place. It is not the place of every language to reinvent the wheel - poorly.

tibbon · 3 years ago
Isn't wheel one of Python's many packaging schemes (intended to replace eggs)?

> There should be one-- and preferably only one --obvious way to do it.

Oh no…

bno1 · 3 years ago
I wish pip had some package deduplication implemented. Even some basic local environments have >100MB of dependencies. ML environments go into the gigabytes range from what I remember.
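
pip doesn't do this today, but the core idea is simple enough to sketch: hash every file across several site-packages directories and replace byte-identical duplicates with hard links. This is only an illustration (the venv paths are made up, and it assumes everything lives on one filesystem):

  import hashlib, os
  from pathlib import Path

  def dedupe(site_packages_dirs):
      """Replace byte-identical files across the given dirs with hard links."""
      seen = {}  # sha256 digest -> first Path seen with that content
      for root in site_packages_dirs:
          for path in Path(root).rglob("*"):
              if not path.is_file() or path.is_symlink():
                  continue
              digest = hashlib.sha256(path.read_bytes()).hexdigest()
              if digest in seen and not path.samefile(seen[digest]):
                  path.unlink()
                  os.link(seen[digest], path)  # point the duplicate at the original
              elif digest not in seen:
                  seen[digest] = path

  # Hypothetical venv layout:
  dedupe([".venv-a/lib/python3.10/site-packages",
          ".venv-b/lib/python3.10/site-packages"])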
smeagull · 3 years ago
Cargo allows you to share a single package directory. Having a single site-packages and the ability to import a particular version (or latest by default) would solve this.
BerislavLopac · 3 years ago
I'm not sure what you mean - unlike npm, pip installs only one copy of a dependency in each environment.
lucyferzyr · 3 years ago
As a senior dev who is just starting with Python, I just wish Python would make pip able to handle everything nicely. I hate seeing a bunch of tuts saying I should use other tools (like poetry or whatever) because pip doesn't handle X use case.

I completely agree with your feeling. I've been working with JS, PHP, and Ruby for years, and packaging is pretty straightforward in every one of them. Language versioning works in every one of them too (PHP is the most annoying, yeah, but even PHP is easier than Python). Ruby has several alternatives like rbenv or rvm, but any of them has every feature 99.9% of developers need, and they just work. I want that in Python ):

bilalq · 3 years ago
For all the complaints I hear about JS/npm, it's been so much less of a hassle than Gradle configs, Gemfiles, Composer files, and whatever python has going on.

I don't know how I'd feel about poetry being part of pip. Didn't Poetry just push out a change where they decided that randomly failing 5% of the time when running in a CI env was a good idea?

The Python ecosystem is a bit of a disaster. There are many useful packages and programs written in Python, but they're legit scary to install. I always search for Go or JS alternatives if they're available to avoid all the python headaches.

Every time I need to interact with Python, I find myself banging my head trying to figure out eggs vs wheels vs distutils vs setuptools vs easyinstall vs pip vs pip3 vs /usr/bin/pip3 vs /opt/homebrew/bin/pip3 vs pip3.7 vs pip3.8 vs pipx vs pip-tools vs poetry vs pyenv vs pipenv vs virtualenv vs venv vs conda vs anaconda vs miniconda.

rumblings · 3 years ago
After working with Java for over a year professionally, I really appreciate its dependency management system (Maven or Gradle). Whereas with Python it is always a mess (Poetry looks promising though).
isitmadeofglass · 3 years ago
> Whereas with Python it is always a mess

I’m curious because this gets repeated a lot by many people. What specific messes do you get into?

I’m asking because my experience with Python these days is always just doing “python -m venv .venv”, activating it, and then using “pip install -r requirements.txt”.

To update dependencies I do “pip install --upgrade dep”, run tests, and then do “pip freeze > requirements.txt”; this never fails me. Sometimes updating dependencies does make tests fail, of course, but that's the fault of the individual packages, not the packaging system. Even so, I'd say even that is rare for me these days.

I know this might only work for 95% of the workflows out there, but I'm very curious as to what specific messes the last 5% end up struggling with, and what makes people like you feel that it's always a mess and not just “sometimes it gets messy”.

KronisLV · 3 years ago
> After working with Java for over a year professionally, I really appreciate its dependency management system(Maven or Gradle).

Personally, it feels better in some ways and worse in others.

The whole pom.xml approach/format that Maven uses seems decent, all the way up to specifying which registries or mirrors you want to use (important if you have something like Nexus). Although publishing your own package needs additional configuration, which may mean putting credentials in text files, though thankfully this can be a temporary step in the CI pipeline.

That said, personally I almost prefer the node_modules approach to dependencies that Node uses (and I guess virtualenv to a degree), given that (at least usually) everything a particular project needs can easily be in a self-contained folder. The shared .m2 cache isn't a bad idea, it can just make cleaning it periodically/for particular projects kind of impossible, which is a shame.

I think one of the aspects that make dependencies better in JVM land is the fact that you oftentimes compile everything your app needs (perhaps without the JVM, though) into a .jar file or something similar, which can then be deployed. Personally, I think that that's one of the few good approaches, at least for business software, which is also why I like Go and .NET when similar deployments are possible. JVM already provides pretty much everything else you might need runtime-wise and you don't find yourself faffing about with system packages, like DB drivers.

That said, what I really dislike about the JVM ecosystem and frameworks like Spring is the reliance on dynamic loading of classes and the huge amounts of reflection-related code that is put into apps. It's gotten to the point where even if your code has no warnings and actually compiles, it might still easily fail at runtime. Especially once you run into issues where dependencies need different versions of the same package themselves, or alternatively your code doesn't pick up the annotations/configuration that it needs.

Thankfully Spring Boot seems like a (decent) step forwards and helps you avoid some of the XML hell, but there is still definitely lots of historical baggage to deal with.

Personally, I like Python because of how easy it is to work with once things actually work and its relatively simplistic nature and rich ecosystem... but package management? I agree that it could definitely use some work. Then again, personally I just largely have stuck with the boring setup of something like pip/virtualenv inside of containers, or whatever is the most popular/widespread at any given moment.

Of course, Python is no exception here, trying to work with old Ruby or Node projects also sometimes has issues with getting things up and running. Personally I feel that the more dependencies you have, the harder it will be to keep your application up and running, and later update it (for example, even in regards to front end, consider React + lots of additional libraries vs something like Angular which has more functionality out of the box, even if more tightly coupled).

orf · 3 years ago
For improvements I commented: Remove setup.py files and mandate wheels. This is the root cause of a lot of the evil in the ecosystem.

Next on the list would be pypi namespaces, but there are good reasons why that is very hard.

The mission statement they are proposing, “a packaging ecosystem for all”, completely misses the mark. How about a “packaging ecosystem that works” first?

I spent a bunch of time recently fixing our internal packaging repo (Nexus) because the switch from md5 hashes to sha256 hashes broke everything, and re-locking a bajillion lock files would take literally months of man-hours.

I’ve been a Python user for the last 17 years, so I’m sympathetic to how we got to the current situation and aware that we’ve actually come quite far.

But every time I use Cargo I am insanely jealous, impressed and sad that we don’t have something like it. Poetry is closest, but it’s a far cry.

dalke · 3 years ago
> Remove setup.py files and mandate wheels

What alternative is there for me?

My package has a combination of hand-built C extensions and Cython extensions, as well as a code generation step during compilation. These are handled through a subclass of setuptools.command.build_ext.build_ext.

Furthermore, I have compile-time options to enable/disable certain configuration options, like enabling/disabling support for OpenMP, via environment variables so they can be passed through from pip.

OpenMP is a compile-time option because the default C compiler on macOS doesn't include OpenMP. You need to install it, using one of various approaches. Which is why I only have a source distribution for macOS, along with a description of the approaches.

I have not found a non-setup.py way to handle my configuration, nor to provide macOS wheels.
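
For readers who haven't seen this pattern, a stripped-down sketch of what such a setup.py looks like - the package name, env var, and source paths below are illustrative, and the codegen/Cython parts are omitted:

  import os
  from setuptools import setup, Extension
  from setuptools.command.build_ext import build_ext

  # Hypothetical compile-time switch, passed through from pip via the environment.
  USE_OPENMP = os.environ.get("MYPKG_USE_OPENMP", "0") == "1"

  class my_build_ext(build_ext):
      def build_extension(self, ext):
          if USE_OPENMP:
              ext.extra_compile_args.append("-fopenmp")
              ext.extra_link_args.append("-fopenmp")
          # a real build would also run its code-generation step here
          super().build_extension(ext)

  setup(
      name="mypkg",
      ext_modules=[Extension("mypkg._speedups", ["src/_speedups.c"])],
      cmdclass={"build_ext": my_build_ext},
  )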

Even for the Linux wheels, I have to patch the manylinux Docker container to whitelist libomp (the OpenMP library), using something like this:

  RUN perl -i -pe 's/"libresolv.so.2"/"libresolv.so.2", "libgomp.so.1"/' \
      /opt/_internal/pipx/venvs/auditwheel/lib/python3.9/site-packages/auditwheel/policy/manylinux-policy.json
Oh, and if compiling where platform.machine() == "arm64" then I need to not add the AVX2 compiler flag.

The non-setup.py packaging systems I've looked at are for Python-only code bases. Or, if I understand things correctly, I'm supposed to make a new specialized package which implements PEP 518, which I can then use to boot-strap my code.

Except, that's still going to use effectively arbitrary code during the compilation step (to run the codegen) and still use setup.py to build the extension. So it's not like the evil disappears.

orf · 3 years ago
To be clear, I’m not suggesting we remove the ability to compile native extensions.

I’m suggesting we find a better way to build them, something a bit more structured, and decouple that specific use case from setup.py.

It would be cool to be able to structure this in a way that means I can describe what system libraries I may need without having to execute setup.py and find out, and express compile time flags or options in a structured way.

Think of it like Cargo.toml vs build.rs.

korijn · 3 years ago
I empathize with your situation and it's a great example. As crazy as this may sound, I think you would have to build every possible permutation of your library and make all of them available on PyPI. You'd need some new mechanism based on metadata to represent all the options and figure out how to resolve against available system libraries. Especially that last part seems very complicated. But I do think it's possible.
pabs3 · 3 years ago
> not add the AVX2 compiler flag

It is a better idea to do instruction selection at runtime in the code that currently uses AVX2. I recently wrote some docs for Debian contributors about the different ways to achieve this:

https://wiki.debian.org/InstructionSelection
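
The same idea can be approximated one level up, in the Python wrapper: probe the CPU once at import time and pick an implementation. A Linux-only toy sketch (the two function bodies are placeholders for an AVX2 build vs. a portable build):

  def _cpu_has_avx2():
      """Crude runtime check: Linux-specific, reads /proc/cpuinfo."""
      try:
          with open("/proc/cpuinfo") as f:
              return "avx2" in f.read()
      except OSError:
          return False

  def _dot_fast(a, b):      # stand-in for the AVX2-compiled extension
      return sum(x * y for x, y in zip(a, b))

  def _dot_portable(a, b):  # stand-in for the generic build
      return sum(x * y for x, y in zip(a, b))

  dot = _dot_fast if _cpu_has_avx2() else _dot_portable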

Deleted Comment

blibble · 3 years ago
> The mission statement they are proposing, “a packaging ecosystem for all”, completely misses the mark. How about a “packaging ecosystem that works” first?

I think at the point a programming language is going on about "mission statements" for a packaging tool, you know they've lost the plot

copy Maven from 2004 (possibly with less XML)

that's it, problem solved

ziml77 · 3 years ago
I tend to just give up on a package if it requires a C toolchain to install. Even if I do end up getting things set up in a way that the library's build script is happy with, I'll be inflicting pain on anyone else who then tries to work with my code.
cycomanic · 3 years ago
I know this is unpopular opinion on here, but I believe all this packaging madness is forced on us by languages because Windows (and to a lesser degree osx) have essentially no package management.

Especially installing a tool chain to compile C code for python is no issue on Linux, but such a pain on Windows.

tux3 · 3 years ago
It feels so suboptimal to need the C toolchain to do things, but have no solid way to depend on it as a non-C library (especially annoying in Rust, which insists on building everything from source and never installing libraries globally).

I make a tool/library that requires the C toolchain at runtime. That's even worse than build time: I need end users to have things like lld, objdump, ranlib, etc. installed anywhere they use it. My options are essentially:

- Requiring users to just figure it out with their system package manager

- Building the C toolchain from source at build time and statically linking it (so you get to spend an hour or two recompiling all of LLVM each time you update or clear your package cache! Awesome!),

- Building just LLD/objdump/.. at build time (but users still need to install LLVM, so you get both slow installs AND have to deal with finding a compatible copy of libLLVM),

- Pre-compiling all the C tools and putting them in a storage bucket somewhere, for all architectures and all OS versions. But then you don't have support for things like the M1 or new OS versions right away, or for people on uncommon OSes. And now I need to maintain a build machine for all of these myself.

- Pre-compile the whole C toolchain to WASM, build Wasmtime from source instead, and just eat the cost of Cranelift running LLVM 5-10x slower than natively...

I keep trying to work around the C toolchain, but I still can't see any very good solution that doesn't make my users have extra problems one way or another.

Hey RiiR evangelism people, anyone want to tackle all of LLVM? .. no? No one? :)

Deleted Comment

korijn · 3 years ago
...and ensure _all_ package metadata required to perform dependency resolution can be retrieved through an API (in other words without downloading wheels).
orf · 3 years ago
Yeah, that’s sort of what I meant by my suggestion. Requirements that can only be resolved by downloading and executing code is a huge burden on tooling
dalke · 3 years ago
What if I have a dependency on a commercial third-party Python package which is on Conda but not on PyPI?
progval · 3 years ago
> For improvements I commented: Remove setup.py files and mandate wheels.

This would make most C extensions impossible to install on anything other than x86_64-pc-linux-gnu (or arm-linux-gnueabihf/aarch64-linux-gnu if you are lucky) because developers don't want to bother building wheels for them.

mathstuf · 3 years ago
I think it'd make other things impossible too. One project I help maintain is mainly C++. It optionally has Python bindings. It also has something like 150 options to the build that affect things. There is zero chance of me ever attempting to make `setup.py` any kind of sensible "entry point" to the build. Instead, the build detects "oh, you want a wheel" and generates `setup.py` to just grab what the C++ build then drops into a place where `build_ext` or whatever expects them to be, using some fun globs. It also fills in "features" or whatever the post-name `[name]` stuff is called so you can do some kind of post-build "ok, it has a feature I need" inspection.
urschrei · 3 years ago
cibuildwheel (which is an official, supported tool) has made this enormously easier. I test and generate wheels with a compiled (Rust! Because of course) extension using a Cython bridge for all supported Python versions for 32-bit and 64-bit Windows, macOS x86_64 and arm64, and whatever manylinux is calling itself this week. No user compilation required. It took about half a day to set up, and is extremely well documented.
kevin_thibedeau · 3 years ago
Setup.py can do things wheels can't. Most notably it's the only installation method that can invoke 2to3 at runtime without requiring a dev to create multiple packages.
orf · 3 years ago
It’s lucky Python 2 isn’t supported anymore then, and everyone has had like a decade to run 2to3 once and publish a package for Python 3, so that use case becomes meaningless.
cozzyd · 3 years ago
Many "Python" packages include native code in some form either as bindings or to workaround Python being agonizingly slow. Which means you often need to call make, or cmake or some other build system anyway... unless you want to build wheels for every possible configuration a user might have (which is virtually impossible, considering every combination of OS, architecture, debug options, etc. you may want to support). Plus you need a build system to build the wheels anyway...
x3n0ph3n3 · 3 years ago
I suggested "one packaging system to rule them all." The fragmentation in this space is frustrating.
baggiponte · 3 years ago
I recommend PDM over poetry!
at_a_remove · 3 years ago
I have a terrible admission to make: one of the reason I like Python is its huge standard library, and I like that because I just ... despise looking for libraries, trying to install them, evaluating their fitness, and so on.

I view dependencies outside of the standard library as a kind of technical debt, not because I suffer from Not Invented Here and want to code it myself, no, I look and think, "Why isn't this in the standard library with a working set of idioms around it?"

I haven't developed anything with more than five digits of code to it, which is fine for me, but part of it is just ... avoidance of having to screw with libraries. Ran into a pip issue I won't go into (it requires a lot of justification to see how I got there) and just ... slumped over.

This has been a bad spot in Python for a long, long time. While people are busy cramming their favorite feature from their last language into Python, this sort of thing has languished.

Sadly, I have nothing to offer but encouragement. I don't know the complexities of packaging; it seems like a huge topic that perhaps nobody really dreamed Python would have to seriously deal with twenty years ago.

samwillis · 3 years ago
> despise looking for libraries, trying to install them, evaluating their fitness, and so on.

This is exactly why I prefer the larger opinionated web frameworks (Django, Vue.js) to the smaller more composable frameworks (Flask, React). I don't want to make decisions every time I need a new feature, I want something that “just works”.

Python and Django just work, and brilliantly at that!

5d8767c68926 · 3 years ago
Currently dealing with Flask, and the endless decision fatigue makes me sad. Enormous variations in quality of code, documentation, SO answers, etc. And that's not even considering the potential for supply chain attacks.

With Django there is a happy path answer for most everything. If I run into a problem, I know I'm not the first.

Deleted Comment

manuelabeledo · 3 years ago
This is one of the reasons why I don't quite like Node. It feels like everything is a dependency.

It seems ridiculous to me that there isn't a native method for something as simple and ubiquitous as putting a thread to sleep, or that there is an external library (underscore) that provides 100+ methods that seem to be staples in any modern language.

Python is nice in that way. It is also opinionated in a cohesive and community driven manner, e.g. PEP8.

mrweasel · 3 years ago
If requests and a basic web framework were in the standard library, you'd effectively eliminate the majority of my dependencies.

Honestly, I don't see package management being an issue for most end-users. Between the builtin venv, conda and Docker I feel that the use-cases for most are well covered.

The only focus area I really see is better documentation. Easier-to-read documentation, more precisely. Perhaps a set of templates to help people get started with something like pyproject.

It feels like the survey is looking for a specific answer, or maybe it’s just that surveys are really hard to do. In any case I find responses to be mostly: I have no opinion one way or the other.

rpcope1 · 3 years ago
Something like bottle.py would be an excellent candidate for inclusion. The real reason to avoid putting anything into the standard library is that it seems to often be the place where code goes to stagnate and die for Python.
qayxc · 3 years ago
> I view dependencies outside of the standard library as a kind of technical debt

That's an interesting position. So are you suggesting that very specialised packages such as graph plotting, ML-packages, file formats, and image processing should be part of the standard library? What about very OS/hardware-specific packages, such as libraries for microcontrollers?

There are many areas that don't have a common agreed-upon set of idioms or functionality and that are way too specialised to be useful for most users. I really don't think putting those into the standard library would be a good idea.

at_a_remove · 3 years ago
Hrm. Graph-plotting ... yes. File formats ... yes, as many as possible. Image processing, given the success of ImageMagick, I'd say yes there as well. I don't know much about ML to say.

OS-specific packages, quite possibly.

The thing about the standard library is that it is like high school: there's a lot of stuff you think you will never need, and you're right about most of it, but the stuff you do need you're glad you had something going, at least.

itake · 3 years ago
More packages in the standard library means it can run on fewer machines, and more extra junk needs to be installed.

Minimal standard library languages let you pick and choose what needs to be run. Golang is a nice happy medium since it’s compiled.

ryan29 · 3 years ago
I didn't take the survey because I've never packaged anything for PyPI, but I wish all of the package managers would have an option for domain validated namespaces.

If I own example.com, I should be able to have 'pypi.org/example.com/package'. The domain can be tied back to my (domain verified) GitHub profile and it opens up the possibility of using something like 'example.com/.well-known/pypi/' for self-managed signing keys, etc..

I could be using the same namespace for every package manager in existence if domain validated namespaces were common.
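
Nothing like this exists today, but a toy sketch of what a client-side check against such a well-known URL might look like (the endpoint layout and JSON shape are entirely hypothetical):

  import hashlib, json
  from urllib.request import urlopen

  def verify_artifact(domain, package, artifact_path):
      """Check a downloaded artifact against hashes published by the domain owner."""
      # Hypothetical format: {"packages": {"mypkg": {"sha256": ["<digest>", ...]}}}
      with urlopen(f"https://{domain}/.well-known/pypi/index.json") as resp:
          published = json.load(resp)
      with open(artifact_path, "rb") as f:
          digest = hashlib.sha256(f.read()).hexdigest()
      return digest in published["packages"][package]["sha256"]

  # verify_artifact("example.com", "mypkg", "mypkg-1.0-py3-none-any.whl")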

Then, in my perfect world, something like Sigstore could support code signing with domain validated identities. Domain validated signatures make a lot of sense. Domains are relatively inexpensive, inexhaustible, and globally unique.

For code signing, I recognize a lot of project names and developer handles while knowing zero real names for the companies / developers involved. If those were sitting under a recognizable organizational domain name (example.com/ryan29) I can do a significantly better job of judging something's trustworthiness than if it's attributed to 'Ryan Smith Inc.', right?

jacques_chester · 3 years ago
Maven Central requires validation of a domain name in order to use a reverse-domain package[0].

It's not without problems. One is that folks often don't control the domain (consider Go's charming habit of conflating version control with package namespacing). Another is what was noted below: resurrection attacks on domains can be quite trivial and already happen in other forms (eg registering lapsed domains for user accounts and performing a reset).

[0] https://central.sonatype.org/faq/how-to-set-txt-record/

simonw · 3 years ago
That's a really interesting idea, but I worry about what happens when a domain name expires and is re-registered (potentially even maliciously) by someone else.
ryan29 · 3 years ago
I think you'd probably need some buy in from the domain registries and ICANN to make it really solid. Ideally, domains would have something similar to public certificate transparency logs where domain expirations would be recorded. I even think it would be reasonable to log registrant changes (legal registrant, not contact info). In both cases, it wouldn't need to include any identifiable info, just a simple expired/ownership changed trigger so others would know they need to revalidate related identities.

I don't know if registries would play ball with something like that, but it would be useful and should probably exist anyway. I would even argue that once a domain rolls through grace, redemption, etc. and gets dropped / re-registered, that should invalidate it as an account recovery method everywhere it's in use.

There's a bit of complexity when it comes to the actual validation because of stuff like that. I think you'd need buy in from at least one large company that could do the actual verification and attest to interested parties via something like OAuth. Think along the lines of "verify your domain by logging in with GitHub" and at GitHub an organization owner that's validated their domain would be allowed to grant OAuth permission to read the verified domain name.

rhyselsmore · 3 years ago
Needs compensating controls to get it right.

* Dependencies are managed in a similar way to Go - where hashes of installed packages are stored and compared client side. This means that a hijacker could only serve up the valid versions of packages that I’ve already installed. (A minimal sketch of the client-side check is at the end of this comment.)

* This is still a “centralized” model where a certain level of trust is placed in PyPi - a mode of operation where the “fingerprint” of the TLS key is validated would assist here. However it comes with a few constraints.

Of course the above still comes with the caveat that you have to trust pypi. I’m not saying that this is an unreasonable ask. It’s just how it is.
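
For what it's worth, pip already has a limited form of the first point in its hash-checking mode (pin each requirement with --hash=sha256:... and install with --require-hashes); the client-side comparison itself is nothing more than this sketch:

  import hashlib

  def matches_pinned_hash(path, expected_sha256):
      """Refuse an artifact unless it matches the digest recorded at lock time."""
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(1 << 20), b""):
              h.update(chunk)
      return h.hexdigest() == expected_sha256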

westurner · 3 years ago
CT: Certificate Transparency logs log creation and revocation events.

The Google/trillian database which supports Google's CT logs uses Merkle trees but stores the records in a centralized data store - meaning there's at least one SPOF Single Point of Failure - which one party has root on and sole backup privileges for.

Keybase, for example, stores their root keys - at least - in a distributed, redundantly-backed-up blockchain that nobody has root on; and key creation and revocation events are publicly logged similarly to now-called "CT logs".

You can link your Keybase identity with your other online identities by proving control by posting a cryptographic proof; thus adding an edge to a WoT Web of Trust.

While you can add DNS record types like CERT, OPENPGPKEY, SSHFP, CAA, RRSIG, NSEC3; DNSSEC and DoH/DoT/DoQ cannot be considered to be universally deployed across all TLDs. Should/do e.g. ACME DNS challenges fail when a TLD doesn't support DNSSEC, or hasn't secured root nameservers to a sufficient baseline, or? DNS is not a trustless system.

ENS (the Ethereum Name Service) is a trustless system. Reading ENS records does not cost ENS clients any gas/particles/opcodes/ops/money.

Blockcerts is designed to issue any sort of credential, and allow for signing of any RDF graph like JSON-LD.

List_of_DNS_record_types: https://en.wikipedia.org/wiki/List_of_DNS_record_types

Blockcerts: https://www.blockcerts.org/ https://github.com/blockchain-certificates :

> Blockcerts is an open standard for creating, issuing, viewing, and verifying blockchain-based certificates

W3C VC-DATA-MODEL: https://w3c.github.io/vc-data-model/ :

> Credentials are a part of our daily lives; driver's licenses are used to assert that we are capable of operating a motor vehicle, university degrees can be used to assert our level of education, and government-issued passports enable us to travel between countries. This specification provides a mechanism to express these sorts of credentials on the Web in a way that is cryptographically secure, privacy respecting, and machine-verifiable

W3C VC-DATA-INTEGRITY: "Verifiable Credential Data Integrity 1.0" https://w3c.github.io/vc-data-integrity/#introduction :

> This specification describes mechanisms for ensuring the authenticity and integrity of Verifiable Credentials and similar types of constrained digital documents using cryptography, especially through the use of digital signatures and related mathematical proofs. Cryptographic proofs enable functionality that is useful to implementors of distributed systems. For example, proofs can be used to: Make statements that can be shared without loss of trust,

W3C TR DID (Decentralized Identifiers) https://www.w3.org/TR/did-core/ :

> Decentralized identifiers (DIDs) are a new type of identifier that enables verifiable, decentralized digital identity. A DID refers to any subject (e.g., a person, organization, thing, data model, abstract entity, etc.) as determined by the controller of the DID. In contrast to typical, federated identifiers, DIDs have been designed so that they may be decoupled from centralized registries, identity providers, and certificate authorities. Specifically, while other parties might be used to help enable the discovery of information related to a DID, the design enables the controller of a DID to prove control over it without requiring permission from any other party. DIDs are URIs that associate a DID subject with a DID document allowing trustable interactions associated with that subject.

> Each DID document can express cryptographic material, verification methods, or services, which provide a set of mechanisms enabling a DID controller to prove control of the DID. Services enable trusted interactions associated with the DID subject. A DID might provide the means to return the DID subject itself, if the DID subject is an information resource such as a data model.

mdmglr · 3 years ago
My wishlist:

We need a way to configure an ordered list of indexes pip searches for packages. --extra-index-url or using a proxy index is not the solution.

Also namespaces and not based on a domain. So for example: pip install apache:parquet

Also some logic, either in the pip client or the index server, to minimize typosquatting (a rough sketch of a client-side check is at the end of this list)

Also pip should adopt a lock file similar to npm/yarn's, instead of requirements.txt

And also “pip list” should output a dependency tree like “npm list”

I should not have to compile source when I install. Every package should have wheels available for the most common arch+OS combos.

Also we need a way to download only what you need. Why does installing scipy or numpy install more dependencies than the conda version? For example pywin and scipy.
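
On the typosquatting point above, a rough sketch of what a purely client-side warning could look like, using only the stdlib and a made-up allow-list (a real tool would pull PyPI download stats):

  import difflib

  POPULAR = ["requests", "numpy", "pandas", "scipy", "urllib3", "setuptools"]

  def typosquat_warning(name):
      """Warn if `name` is suspiciously close to, but not equal to, a popular package."""
      if name in POPULAR:
          return None
      close = difflib.get_close_matches(name, POPULAR, n=1, cutoff=0.85)
      if close:
          return f"'{name}' looks a lot like '{close[0]}' - did you mean that?"
      return None

  print(typosquat_warning("requets"))    # flags the likely typo
  print(typosquat_warning("mypackage"))  # None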

tempest_ · 3 years ago
If you are using poetry you can add something to the pyproject.toml to handle the indexes, though I am not sure if they are ordered or not

  [[tool.poetry.source]]
  name = "my-pypi"
  url = "https://my-pypi-index.wherever"
  secondary = true

mdmglr · 3 years ago
Thanks. I’ll look into this.
jbylund · 3 years ago
Typosquatting is a thing that has been looked at and is being looked at:

https://github.com/pypi/warehouse/pull/5001 - had to be reverted because it was too noisy

https://github.com/pypi/warehouse/issues/9527

ciupicri · 3 years ago
> apache:parquet

How are you going to name the file storing the wheel for that package? Using ":" on Windows is going to be problematic.

mdmglr · 3 years ago
That’s right. Then some other delimiter.
clintonb · 3 years ago
MonkeyMalarky · 3 years ago
Seems more oriented to (potential) contributors than end users of the packaging system. Who cares about mission statements and inclusivity? Secure funding and pay developers to make the tools.
woodruffw · 3 years ago
> Who cares about mission statements and inclusivity? Secure funding and pay developers to make the tools.

These are connected things.

I maintain a PyPA member project (and contribute to many others), and the latter is aided by the former: the mission statement keeps the community organized around shared goals (such as standardizing Python's packaging tooling), and inclusivity ensures a healthy and steady flow of new contributors (and potential corporate funding sources).

nomdep · 3 years ago
The PSF are not engineers looking for a better developer experience, but politicians looking for power. That's why we had the pipenv fiasco a few years ago.
crazytalk · 3 years ago
This survey is the literal definition of a leading question. I found about 2 boxes I could tick before being forced to order a list of the designer's preferences according to how much I agree with them. The only data that can be generated from a survey like this is the data you wanted to find (see also the Boston Consulting Group article earlier today). I cannot honestly respond to it.

The only question I have is, what grant application(s) is the survey data being used to support?

KingEllis · 3 years ago
The absence of the go binary as a tool (i.e. "go get ...", "go install ..." etc.) is odd, considering that is what has been eating Python's lunch lately.
zbentley · 3 years ago
I imagine many of you have feedback that could be useful to folks making decisions about the future of Python packaging, a common subject of complaint in many discussions here.

Remember not to just complain, but to offer specific problems/solutions--i.e. avoid statements like "virtualenvs suck, why can't it be like NPM?" and prefer instead feedback like "the difference between Python interpreter version and what virtualenv is being used causes confusion".

Kwpolska · 3 years ago
"virtualenvs suck, why can't it be like NPM?" is a specific problem and a specific solution. The problem being having to manage venvs (which have many gotchas and pitfalls, and no standarization), and the solution is to replace those with packages being installed into the project folder with standardized and well-known tools.
qbasic_forever · 3 years ago
Keep an eye on https://peps.python.org/pep-0582/ - it's a proposal to add local directory/node_modules-like behavior to package installs. It stalled out a few years ago, but I heard there is a lot more discussion and push to get it in now.

I think if this PEP makes it in then like 90% of people's pain with pip just completely goes away almost overnight. Love it or hate it the NPM/node_modules style of all dependencies dumped in a local directory solves a _ton_ of problems in the packaging world. It would go a long way towards making the experience much smoother for most python users.
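
Roughly, PEP 582 has the interpreter look for a local __pypackages__ directory next to your script. You can approximate the effect today with a small sys.path shim (just a sketch of the layout the PEP describes, not how it would actually be wired into CPython):

  import sys
  from pathlib import Path

  # PEP 582 layout: ./__pypackages__/<major.minor>/lib alongside the script.
  pkgs = (Path(__file__).parent / "__pypackages__"
          / f"{sys.version_info.major}.{sys.version_info.minor}" / "lib")
  if pkgs.is_dir():
      sys.path.insert(0, str(pkgs))

  # After this, imports resolve against the local directory, e.g. after
  #   pip install --target __pypackages__/3.11/lib requests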

jogjayr · 3 years ago
I did a proof-of-concept for this a few years ago: https://github.com/jogjayr/pykg

It literally uses npm, the NPM registry, and node_modules for Python dependency management.

erwincoumans · 3 years ago
I am pretty happy with PyPI/pip; it is an easy way to distribute Python and C++ code wrapped in a Python extension to others. For a C++ developer it is becoming harder to distribute native executables, since macOS and Windows require signing binaries. Python package version conflicts and backwards incompatibility can be an issue.