ratorx · a year ago
Slightly off-topic, but the fact that this script even needs a package manager in a language with a standard library as large as Python's is pretty shocking. Making an HTTP request is pretty basic stuff for a scripting language; you shouldn't need or want a library for it.

And I'm not blaming the author; the standard library docs themselves recommend using a third-party library (albeit not the one the author is using) on the page for the closest equivalent (urllib.request)!

> The Requests package is recommended for a higher-level HTTP client interface.

Especially for a language that has not cared too much about backwards compatibility historically, having an ergonomic HTTP client seems like table stakes.

diggan · a year ago
> Making an HTTP request is pretty basic stuff for a scripting language; you shouldn't need or want a library for it.

Sometimes languages/runtimes move slowly :) Speaking as a JS developer, this is how we made requests for a long time (before .fetch), inside the browser which is basically made for making requests:

    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://example.com', true);
    xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        console.log(this.responseText);
    };
    xhr.send('param=add_comment');
Of course, we quickly wanted a library for it; most of us ended up using jQuery.get() et al. until .fetch appeared (or various npm libraries, if you were an early nodejs adopter).

d4mi3n · a year ago
This takes me back. I'm glad `fetch` has become the canonical way to do this. XHR was a new capability at the time, but back then we were just starting to learn about all the nasty things people could do by maliciously issuing XHR requests and/or loading random executables onto the page. Clickjacking was all the rage and nothing equivalent to Content Security Policy existed at the time.
ratorx · a year ago
I don’t think it’s just slowness or stability. The original release of requests was in 2011 and the standard library module (urllib.request) was added in Python 3.3 in 2012.
gorgoiler · a year ago
It has two! — http.client and urllib.request — and they are really usable.

Lots of people just like requests though as an alternative, or for historical reasons, or because of some particular aspect of its ergonomics, or to have a feature they’d rather have implemented for them than have to write in their own calling code.
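For anyone who hasn't used them, a minimal sketch of what a stdlib-only GET looks like; the throwaway http.server here just stands in for a real endpoint so the snippet is self-contained:

```python
# Stdlib-only HTTP GET, no third-party packages. A local http.server
# plays the role of the remote API for illustration.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"ok": True}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/") as response:
    data = json.load(response)

server.shutdown()
print(data)  # {'ok': True}
```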

At this stage it’s like using jQuery just to find an element by css selector (instead of just using document.querySelector.)

WesolyKubeczek · a year ago
Requests used to have good PR back in the day and ended up entrenched as a transitive dependency for a lot of things. Because it’s for humans, right?

But recently I had to do something using raw urllib3, and you know what? It’s just as ergonomic.

ratorx · a year ago
Sure, historical popularity is a good reason for people who are already familiar with it to keep using it.

That's not really an excuse for the standard library docs for the clients you mentioned linking to requests, though. If they really were good, rather than just legacy, why would the standard library itself suggest something else?

imtringued · a year ago
They could have used a database driver for mysql, postgresql, or mongodb for a more realistic example (very common for sysadmin-type scripts that are used once and then thrown away), and your complaint would be invalid. But then you'd have to set up the database, and the example would no longer fit a quick blog post that lets you just copy-paste the code and run it yourself.
ratorx · a year ago
Well, the “this script needs a package manager” part. The rest of my comment about the state of the HTTP client in Python would still be valid (but I probably wouldn't have discovered it).
mkesper · a year ago
The standard library does not give you a way to do async HTTP requests; that's what httpx provides. Since Python now leans heavily on async, this is a real bummer.
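For completeness: the usual stdlib workaround is to push the blocking call onto worker threads with asyncio.to_thread, which buys you concurrency but not a real async client (each in-flight request still ties up a thread, unlike httpx's AsyncClient). A rough sketch:

```python
# Approximating async HTTP with the stdlib alone: run the blocking
# urlopen call in worker threads via asyncio.to_thread (Python 3.9+).
import asyncio
import json
from urllib.request import urlopen


def fetch(url: str) -> dict:
    # Blocking stdlib call; safe to run in a thread.
    with urlopen(url) as response:
        return json.load(response)


async def fetch_all(urls: list[str]) -> list[dict]:
    # The requests run concurrently, but each one occupies a thread --
    # fine for a handful of URLs, not for thousands of connections.
    return await asyncio.gather(*(asyncio.to_thread(fetch, u) for u in urls))
```

Usage would be `results = asyncio.run(fetch_all(urls))`.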
masklinn · a year ago
There’s absolutely no need for async http here. The script does one http request at the top of main. And a trivial one too (just a simple GET).

    import json
    from urllib.request import urlopen

    response = urlopen(url)
    return json.load(response)
is what they’re saving themselves from.

masklinn · a year ago
requests is really useful for non-trivial http requests (especially as urllib has terrible defaults around REST style interactions).
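To illustrate the "terrible defaults" point, here is a sketch of what a JSON POST takes with bare urllib.request (the endpoint URL is made up for illustration). With requests, this whole dance collapses to `requests.post(url, json=payload)`:

```python
# Building a JSON POST with bare urllib.request: you serialize the
# body, set the content type, and pick the method yourself.
import json
from urllib.request import Request

payload = {"param": "add_comment"}
req = Request(
    "https://example.com/api/comments",  # hypothetical endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urlopen(req) would actually send it; note that urllib raises
# HTTPError for any non-2xx status and leaves decoding the response
# body entirely to you.
print(req.method, req.get_header("Content-type"))
```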

But here all the script is doing is a trivial GET, that's

    urllib.request.urlopen(url)

gabrielsroka · a year ago
I talked about that in my readme https://github.com/gabrielsroka/r
rat87 · a year ago
Python has historically cared about backwards compatibility. Nowadays they're finally dropping some old libraries that probably shouldn't have been in the stdlib. They're not likely to add more. Especially now that you can add dependencies to scripts so easily

jgalt212 · a year ago
I agree. Requests is an embarrassment and indictment of the Python standard library. And so are dataclasses. They just should have subsumed attrs.
zahlman · a year ago
>And I’m not blaming the author, the standard library docs even recommend using a third party library (albeit not the one the author is using) on the closest equivalent (urllib.request)!

For perspective: urllib has existed since at least 1.4 (released in 1996), as long as python.org's archive goes back (https://docs.python.org/release/1.4/lib/node113.html#SECTION...). Requests dates to 2011. httpx (the author's choice) has a 0.0.1 release from 2015, but effectively didn't exist until 2019 and is still on ZeroVer after a failed 1.0.0 prerelease in 2021. Python can't be sanely compared to the modern package-manager-based upstarts because it's literally not from that generation. When Python came out, the idea of versioning the language (not referring to a year some standards document was published) was, as far as I can tell, kinda novel. Python is older than Java, AppleScript, and VB; over twice as old as Go; and over three times as old as Swift.

>Especially for a language that has not cared too much about backwards compatibility historically

It's always confused me that people actually see things this way. In my view, excessive concern for compatibility has severely inhibited Python (and especially packaging, if you want to include that despite being technically third-party) from fixing real problems. People switching over to 3.x should have been much faster; the breaking changes were unambiguously for the better and could not have been done in non-breaking ways.

There are tons of things the developers refuse to remove from the standard library that they would never even remotely consider adding today if they weren't already there - typically citing "maintenance burden" for even the simplest things. Trying to get anything added is a nightmare: even if you convince everyone it looks like a good idea, you'll invariably be asked to prove interest by implementing it yourself (who's to say all the good ideas come from programmers?) and putting it on PyPI. (I was once told this myself even though I was proposing a method on a builtin. Incidentally, I learned those can be patched in CPython, thanks to a hack involving the GC implementation.) Then, even if you somehow manage to get people to notice you, and everyone likes it, there is suddenly no reason to add it; after all, you're in a better position to maintain it externally, since it can be versioned separately.

If I were remaking Python today, the standard library would be quite minimal, although it would integrate bare necessities for packaging - APIs, not applications. (And the few things that really need to be in the standard library for a REPL to be functional and aware of the platform, would be in a namespace. They're a honking great idea. Let's do more of those.)

eesmith · a year ago
Lib/urllib.py was created "Tue Mar 22 15:37:06 1994", by renaming Lib/urlopen.py which was created "Mon Feb 21 17:07:07 1994".
ratorx · a year ago
I was referring to 3.x, but also to “minor” releases (not sure they use semver), where standard library functions and options are being removed occasionally.

So it is both “not conservative enough”, whilst as you say being overly conservative.

The main problem with “small libraries” is supply chain risk. This is why I try to use languages with a strong standard library (and first party external packages). Python would be a lot less useful without a strong standard library.

frfl · a year ago
Anyone use PEP 723 + uv with an LSP-based editor? What's your workflow? I looked into it briefly; the only thing I found after a lot of digging was to use `uv sync --script <script file>`, get the venv path from that command's output, and activate that venv or point your editor at it. Is there any other way? What I describe above seems a bit hacky, since `sync` isn't meant to report the venv path specifically; it just happens to display it.

Edit: I posted this comment before reading the article. Just read it now and I see that the author also kinda had a similar question. But I guess the author didn't happen to find the same workaround as I mention using the `sync` output. If the author sees this, maybe they can update the article if it's helpful to mention what I wrote above.

JimDabell · a year ago
uv v0.6.10 has just been released with a more convenient way of doing this:

    uv python find --script foo.py
https://github.com/astral-sh/uv/releases/tag/0.6.10

https://docs.astral.sh/uv/reference/cli/#uv-python-find--scr...

BoumTAC · a year ago
How does it work? How does it find the environment?

Let's say I have a project in `/home/boumtac/dev/myproject` with the venv inside.

If I run `uv python find --script /home/boumtac/dev/myproject/my_task.py`, will it find the venv?

lynx97 · a year ago
Thanks! Came here to ask how to make pyright work with uv scripts...

    pyright --pythonpath $(uv python find --script foo.py) foo.py

networked · a year ago
My general solution to project management problems with PEP 723 scripts is to develop the script as a regular Python application that has `pyproject.toml`. It lets you use all of your normal tooling. While I don't use an LSP-based editor, it makes things easy with Ruff and Pyright. I run my standard Poe the Poet (https://poethepoet.natn.io/) tasks for formatting, linting, and type checking as in any other project.

One drawback of this workflow is that by default, you duplicate the dependencies: you have them both in the PEP 723 script itself and `pyproject.toml`. I just switched a small server application from shiv (https://github.com/linkedin/shiv) to inline script metadata after a binary dependency broke the zipapp. I experimented with having `pyproject.toml` as the single source of truth for metadata in this project. I wrote the following code to embed the metadata in the script before it was deployed on the server. In a project that didn't already have a build and deploy step, you'd probably want to modify the PEP 723 script in place.

  #! /usr/bin/env python3
  # License: https://dbohdan.mit-license.org/@2025/license.txt
  
  import re
  import tomllib
  from pathlib import Path
  from string import Template
  
  import tomli_w
  
  DEPENDENCIES = "dependencies"
  PROJECT = "project"
  REQUIRES_PYTHON = "requires-python"
  
  DST = Path("bundle.py")
  PYPROJECT = Path("pyproject.toml")
  SRC = Path("main.py")
  
  BUNDLE = Template(
      """
  #! /usr/bin/env -S uv run --quiet --script
  # /// script
  $toml
  # ///
  
  $code
  """.strip()
  )
  
  
  def main() -> None:
      with PYPROJECT.open("rb") as f:
          pyproject = tomllib.load(f)
  
      toml = tomli_w.dumps(
          {
              DEPENDENCIES: pyproject[PROJECT][DEPENDENCIES],
              REQUIRES_PYTHON: pyproject[PROJECT][REQUIRES_PYTHON],
          },
          indent=2,
      )
  
      code = SRC.read_text()
      code = re.sub(r"^#![^\n]+\n", "", code)
  
      bundle = BUNDLE.substitute(
          toml="\n".join(f"# {line}" for line in toml.splitlines()),
          code=code,
      )
  
      DST.write_text(bundle)
  
  
  if __name__ == "__main__":
      main()

zahlman · a year ago
If you already have a pyproject.toml, and a "build and deploy step", why not just package normally? PEP 723 was developed for the part of the Python world that doesn't already live on PyPI (or a private package index).
skeledrew · a year ago
I'm generally not a fan of the incremental rustification of the Python ecosystem, but I started using uv a few weeks ago just for this particular case and have been liking it, to the point where I'm considering migrating my full projects as well from their current conda+poetry flow. Just a couple of days ago I also modified a script I've been using for a few years to patch pylsp so it can now see uv script envs using the "uv sync --dry-run --script <path>" hack.
ratorx · a year ago
Out of curiosity, what are some problems with rustification? Is it an aversion to Rust specifically or a dislike of the ecosystem tools not being written in Python?

The former is subjective, but the latter seems like not really much of an issue compared to the language itself being written in C.

zahlman · a year ago
Speaking for myself:

I have no aversion to Rust (I've read some of it, and while foreign, it comes across as much more pleasant than C or C++), but the way it's promoted often is grating. I'm getting really tired in particular of how the speed of Rust is universally described as "blazing", and how "written in Rust" has a sparkle emoji as mandatory punctuation. But maybe that's just because I'm, well, older than Python itself.

I don't really care that the reference implementation isn't self-hosting (although it's nice that PyPy exists). Using non-Python for support (other than IDEs - I don't care about those and don't see a need to make more of them at all) is a bit grating in that it suggests a lack of confidence in the language.

But much more importantly, when people praise uv, they seem to attribute everything they like about it to either a) the fact that it's written in Rust or b) the fact that it's not written in Python, and in a lot of cases it just doesn't stand up to scrutiny.

uv in particular is just being compared to a low bar. Consider: `pip install` without specifying a package to install (which will just report an error that you need to specify a package) on my machine takes almost half a second to complete. (And an additional .2 seconds with `--python`.) In the process, it imports more than 500 modules. Seriously. (On Linux you can test it yourself by hacking the wrapper script. You'll have to split the main() call onto a separate line to check in between that and sys.exit().)
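If you want to reproduce that kind of measurement, a rough sketch that counts how many modules importing pip's CLI entry point drags in (pip._internal is, as the name says, internal, so the module path is version-dependent; the count also varies by pip version):

```python
# Count modules pulled in by importing pip's CLI entry point in a
# fresh interpreter. This is only part of what `pip install` loads,
# but it illustrates the startup cost being described.
import importlib.util
import subprocess
import sys

if importlib.util.find_spec("pip") is None:
    count = -1  # pip isn't installed in this interpreter
else:
    out = subprocess.run(
        [
            sys.executable,
            "-c",
            "import sys; before = len(sys.modules); "
            "import pip._internal.cli.main; "
            "print(len(sys.modules) - before)",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    count = int(out.stdout.strip())

print("modules imported by pip's CLI entry point:", count)
```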

skeledrew · a year ago
It's more the latter, particularly when Rust is used in libraries (e.g. FastAPI) as opposed to tools, as it's destroying portability. For example, I use flet[0] in some of my projects, and I have to be increasingly careful about the other dependencies, as there is no support for the Rust toolchain within Dart/Flutter, and even if there were, it still sounds like it'd be a nightmare to maintain. The same applies to any other platforms/apps out there that support running Python for flexibility, where handling another language is just way out of scope (and I'm pretty sure there are quite a few). A key part of Python's existence is as glue between disparate system parts, and rustification is reducing its usefulness for an increasing number of niche cases where it once excelled.

[0] https://flet.dev

NeutralForest · a year ago
I can understand the sentiment somewhat. It's another layer of complexity and it makes working on projects more difficult. The fact that pip's or mypy's code is all Python makes it much easier to interact with and patch if needed.

You can also write Cython for more perf-oriented code, but I can totally understand the value Rust brings to the table; it's just now another language you'll need to know or learn, plus more layers like maturin or PyO3, while cffi is just there.

All the tooling coming from astral is amazing and I use it everyday but I can see the increasing complexity of our toolchains, not in ergonomics (it's much better now) but the tools themselves.

WesolyKubeczek · a year ago
A problem with rustification is that it puts a giant ecosystem on top of a giant ecosystem, with poorly matched tooling. C has a lot of home-ground advantage, and CPython is built on it.

Then you have PyPy which you’d have to accommodate somehow.

It doesn’t help that in a case where you have to build everything, Rust build toolchain currently needs Python. That sure would make bootstrapping a bitch if Python and Rust became a circular dependency of one another.

htunnicliff · a year ago

> I also modified a script I've been using for a few years to patch pylsp so it can now see uv script envs using the "uv sync --dry-run --script <path>" hack.
This sounds like a really useful modification to the LSP for Python. Would you be willing to share more about how you patched it and how you use it in an IDE?

skeledrew · a year ago
I have a somewhat particular setup where I use conda to manage my envs, and autoenv[0] to ensure the env for a given project is active once I'm in the folder structure. So there's a .env file containing "conda activate <env_name>" in each. I also use Emacs as my sole IDE, but there are quite a few instances where support falls short for modern workflows. I use the pylsp language server, and it's only able to provide completions, etc for native libraries, since by default it doesn't know how to find the envs containing extra 3p packages.

And so I wrote a patcher[1] that searches the project folder and parents until it finds an appropriate .env file, and uses it to resolve the path to the project's env. With the latest changes to the patcher it now uses the output from "uv sync", which is the path to a standalone script's env, as well as the traditional "source venv_path/bin/activate" pattern to resolve envs for uv-managed projects.

[0] https://github.com/hyperupcall/autoenv [1] https://gitlab.com/-/snippets/2279333

oulipo · a year ago
what's the --dry-run hack ?
skeledrew · a year ago
Using "--dry-run" makes the command a no-op, but still prints the env path.
stereo · a year ago
I used to have a virtual environment for all little scrappy scripts, which would contain libraries I use often like requests, rich, or pandas. I now exclusively use this type of shebang and dependency declaration. It also makes runnings throwaway chatgpt scripts a lot easier, especially if you put PEP-723 instructions in your custom prompt.
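For readers who haven't seen the pattern being described, a minimal example of such a script (the dependency list is purely illustrative): uv reads the `# /// script` metadata block, creates or reuses a matching environment, and runs the file.

```python
#! /usr/bin/env -S uv run --quiet --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests",
# ]
# ///

import requests

print(requests.get("https://example.com").status_code)
```

Make it executable with `chmod +x` and run it directly; no manually managed venv involved.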
alkh · a year ago
Bonus points for the "Bonus: where does uv install its virtual environments?" section! I'd been wondering the same thing for a long time but hadn't had a chance to dig in. It's great that the venv isn't recreated unless the dependencies or the Python version change.
thisdavej · a year ago
Thanks for the positive feedback! I was curious too and thought others would enjoy hearing what I learned.
sorenjan · a year ago
You can also run `uv cache dir` to show the location.
__float · a year ago
This was discussed somewhat recently in https://news.ycombinator.com/item?id=42855258
sireat · a year ago
This is a neat writeup on the use of uv, but it doesn't solve the "how to give a self-contained script to grandma" problem.

Now anyone you give your script to has to install uv first.

the_mitsuhiko · a year ago
> This is neat writeup on use of uv, but it doesn't solve the "how to give self contained script to grandma" problem.

Not at the moment, but will your grandma run a script? There is an interesting thing you can already do today for larger applications which is to install uv alongside your app. You can make a curl-to-bash thing or similar that first installs uv into a program specific location to then use that to bootstrap your program. Is it a good idea? I don't know, but you can do that.

alanfranz · a year ago
For simple scripts (I never succeeded in using it for anything really complex, but it's great when you don't want to use bash yet need something like Python), I've used this approach, which still works today and has no uv dependency (it only requires pip to be installed in the same Python interpreter you're using to run your script):

https://www.franzoni.eu/single-file-editable-python-scripts-...

renewiltord · a year ago
You can write a bash shebang that curls into shell. Unfortunately, when I did it and gave it to grandma, it failed because grandma had substituted Oil shell and linked it as sh, which is not best practice. I think the grandmother shell script is simply impossible. They have spent decades acquiring idiosyncratic Unix environments.
ElectricalUnion · a year ago
Good thing that these days init is systemd instead of a series of de jure POSIX, de facto bashism-ridden shell scripts that would fail to boot if you swapped /bin/sh away from bash.

At least ubuntu helped force the ecosystem to at least pretend to support /bin/dash too.

denzil · a year ago
For this case, it might be easier to package the script using pyinstaller. That way, she can just run it. Packaging it that way is more work on your side though.
oezi · a year ago
I think uv should become a default package for most operating systems.
globular-toast · a year ago
It automatically downloads interpreters from some internet source. It's a security nightmare. It can be configured not to do that but it's not the default.
oulipo · a year ago
Well, you can just give her the `./install_uv.sh && ./run_script.sh` command, e.g.

`( curl -LsSf https://astral.sh/uv/install.sh | sh ) && ./run_script.sh`

imtringued · a year ago
Next up: uv competitor compiled with cosmopolitan libc.
sorenjan · a year ago
You don't need to run the script as `py wordlookup.py` or make a batch file `wordlookup.cmd` in Windows.

The standard Python installation in Windows installs the py launcher and sets it as the default file handler for .py (and .pyw). So if you try to run `wordlookup.py` Windows will let the py launcher handle it. You can check this with `ftype | find "Python"` or look in the registry.

You can make it even easier than that though. If you add .py to the PATHEXT environment variable you can run .py files without typing the .py extension, just like .exe and .bat.