cglan · a year ago
Super true. One of the best tests of this is setting up a new laptop. Some of the best experiences are when you get a new laptop, and just clone the codebase and everything works as it did before, no special magic. Golang with vendored dependencies seems to be wonderful for this but I've had relatively decent experiences with newer java projects.
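The Go workflow being praised might look like this (a sketch; assumes a module already using `go mod`):

```shell
# Vendor all third-party sources into the repo so a fresh clone
# builds with no network access and no special setup.
go mod vendor                  # copies dependencies into ./vendor
git add go.mod go.sum vendor
# On the new laptop: clone, then build straight from the vendored copies.
go build -mod=vendor ./...
```

Since Go 1.14 the toolchain uses `./vendor` automatically when it exists, so even the `-mod=vendor` flag is optional.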

My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

Beyond just the code, I've had lots of mixed experiences with CI/CD being smooth. I unfortunately don't think I've been in a single shop where deployments or CI have been a good experience. They often feel fragile, undocumented, and hard for newcomers.

sethammons · a year ago
I have a couple decades of experience, have ridden a small startup through going public, and have worked intimately with 6 companies. I know about taking a product and ultra-scaling it, both technically and organizationally.

I will never recommend Python outside of a small team. It is organizational molasses. My current company has multiple teams striving to keep our Python tech stack serving our growing technical and organizational scale.

I have fixed this in two companies, in no small part by migrating to Go. I am on my third.

aaronblohowiak · a year ago
The hoops that people go through to solve this sometimes create something even more complex and not great, like forcing all development into a Docker container.

Ever try conda though? I’ve had moderate success with pipenv, but tbh I don’t love it as it hides too many things when installing a package fails.

senko · a year ago
> My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

I'm curious if you can spot a pattern in the platform (win/osx/linux) or the type of project, or is it all over the place?

My own experience with Python boils down to creating a virtualenv, installing the deps, setting up configuration (or just copying it from somewhere) and creating a database, and I'm off to the races. The only exception in recent memory was when a project had two dozen microservices, half of the codebase was on private package repository, and we used Poetry. The combo required somewhat more involved setup. That said, IIRC all the projects had fully pinned package versions (package==x.y.z).
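The happy path described above, as a minimal sketch (file names hypothetical; the commented steps need a real project):

```shell
# Per-project interpreter and site-packages, isolated from the system.
python3 -m venv .venv
. .venv/bin/activate
python -c 'import sys; print(sys.prefix)'   # confirms we're inside .venv
# Then, for a real project, typically:
#   pip install -r requirements.txt    # fully pinned, e.g. package==x.y.z
#   cp config.example.py config.py     # or copy config from somewhere
#   createdb myproject                 # create the database
```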

In contrast, every time I touch something in JS land I get the same experience you described for Python. On one project we literally copied node_modules across machines (including servers), because a full reinstall took an unbounded amount of time. Anecdotally, the amount of churn in JS is much higher, and the maintenance load increases proportionally.

Usually it's something like:

- have a project in JS with some dependency X that's no longer on the bleeding edge, but works nice

- want to depend on a new package Y for some new feature

- the new package Y depends on a library Z that's higher than what the other dependency (X) can work with

- try to update the original dependency (X)

- wailing, gnashing of teeth, and considering the switch to agriculture instead

In my experience, if you're not closely tracking the bleeding edge, upgrading packages and updating your code accordingly, your JS developer experience will be abysmal.
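For the X/Y/Z conflict above, npm (8.3+) has a top-level `overrides` field in package.json that can sometimes force a single version of the shared transitive dependency; the package names here are the hypothetical ones from the example:

```json
{
  "dependencies": {
    "x": "^1.4.0",
    "y": "^2.0.0"
  },
  "overrides": {
    "z": "^3.0.0"
  }
}
```

This only helps if X actually tolerates the forced version of Z at runtime; when it doesn't, you're back at the wailing-and-gnashing step.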

Agree on the CD part, especially the fragility, and the extra manual work when the deploy is some manually driven (semi-)automated process.

imp0cat · a year ago
You can get the same JS/node_modules experience with Python, just use pdm. ;)
AlienRobot · a year ago
I love Python but it always amazes me how hard it is for it to just... work.

So there is virtualenv, built in, but... if there is a venv directory, Python doesn't just use it.

Like, you have app.py, and you run python app.py; that doesn't run it with the venv's python. This leads to all sorts of problems with scripts that assume they're running under the venv. So you probably want to write a script that sources the venv just so you don't forget. But if you place it in the same directory, you may forget you need to call the script, so you probably want to add an extra directory to hide all the python code, leaving visible only the shell script that sets up the environment properly. Or just use an IDE.
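The wrapper idea above, as a self-contained sketch (the printf just fakes an app.py so the example runs):

```shell
printf 'import sys; print(sys.prefix)\n' > app.py   # stand-in for a real app
# run.sh-style wrapper: create the venv on first use, then run app.py
# with the venv's interpreter directly -- no need to remember to activate.
[ -d venv ] || python3 -m venv venv
venv/bin/python app.py
```

Calling `venv/bin/python` directly sidesteps activation entirely, which is the part people forget.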

Just "pip install." But pip isn't installed and ensurepip doesn't work? What do I even do then?
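For the record, the stdlib bootstrap is the ensurepip module, and the "doesn't work" case is usually a distro (Debian/Ubuntu) that strips it out of the system python; there the fix is the distro package (e.g. `apt install python3-pip`). A sketch of checking for it:

```shell
# Print which pip version ensurepip would install;
# `python3 -m ensurepip --upgrade` would actually install it.
python3 -c 'import ensurepip; print(ensurepip.version())'
```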

I recall downloading a project that required a library that wasn't available for the newest version of python, so when you tried to install the requirements pip wouldn't find it. I discovered this, naturally, because I updated my operating system so the python version changed which means the project that used to work stopped working! What is the solution for installing multiple python versions side by side? Hint: it's not an official project by the Python organization but something you can find on github.
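The unofficial-but-ubiquitous tool being alluded to is presumably pyenv (it comes up elsewhere in this thread); a sketch of side-by-side versions, assuming pyenv is installed:

```shell
pyenv install 3.9.18    # builds that CPython under ~/.pyenv/versions
pyenv local 3.9.18      # writes .python-version for this directory only
python --version        # now resolves through pyenv's shims
```

Different projects in different directories can then each pin their own interpreter.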

adammarples · a year ago
My recent workflow is to use a great program called mise. You have a config file in your directory and hey presto, python venvs work, they install themselves if they don't already exist, and it will install the exact version of python you specify in your config. On top of that it will set environment variables for you and unload them when you change directory. If you combine this with uv (just tell mise you want uv installed in the config) you can run uv pip sync and instantly reflect any changes in your requirements file directly into your venv very quickly.
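The config described might look something like this (a sketch of a `.mise.toml`; the values are hypothetical):

```toml
[tools]
python = "3.12"   # mise installs this exact interpreter if it's missing
uv = "latest"

[env]
DATABASE_URL = "postgres://localhost/myapp_dev"   # loaded on cd in, unloaded on cd out
```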
tmnvix · a year ago
For the past 4-5 years this is what has worked exceptionally well for me:

- pyenv for installing multiple versions of python on my machine

- direnv for managing environments (env variables, python version, and virtual environment)

- pip for installing dependencies (pinning versions and only referencing primary packages in requirements.txt - none of their dependencies)

This makes everything extremely easy to work with. When I cd into a project directory direnv loads everything necessary for that environment.

Each project directory has a .env and a .envrc file. The .envrc looks something like this:

    layout python ~/.pyenv/versions/3.11.0/bin/python3
    dotenv .env
Absolutely no headaches working on dozens of local python projects.

fragmede · a year ago
yeah I wrote python-wool and set it as my local alias for python so it does just look for a venv in the called program's path, and use that.

https://github.com/fragmede/python-wool

nzach · a year ago
> My worst experiences universally have always been python projects.

Do you mind sharing why you think this happens? Although I've never worked professionally with python, this sentiment matches my experience as a user, so I don't have a lot of context on why this is the case.

Some siblings in this thread provided explanations that mostly boil down to 'bad tooling' in one form or another. But that doesn't feel right.

In my opinion if it was just bad tooling this problem would be solved by now.

ronakjain90 · a year ago
Every time I set up a JS project which is older than a few years, it's:

1. Extremely difficult to set up the code base, because of dependency spaghetti

2. Lots of breaking changes across different libraries, making maintenance not so easy

The easiest projects to maintain were written in Go, Java, or Ruby.

humanfromearth9 · a year ago
You may want to consider using Nix, with nix flakes.
bryanlarsen · a year ago
How much of that just hides complexity? I remember back in the day hiding a large amount of complexity behind vagrant.

A new dev could get up and running quickly with "install vagrant; vagrant up", but that was hiding a lot of complexity behind a very leaky abstraction.

pphysch · a year ago
> My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

I got a new Chromebook from work, and had VSCode+Docker running an existing Postgres+Django+etc dev environment in literally 15 minutes. I was shocked. Devcontainers are magic, and poor Python DX is a skill issue.
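A sketch of the kind of `.devcontainer/devcontainer.json` that makes this work (service names and commands hypothetical):

```json
{
  "name": "django-app",
  "dockerComposeFile": "docker-compose.yml",
  "service": "web",
  "workspaceFolder": "/workspace",
  "postCreateCommand": "pip install -r requirements.txt"
}
```

VSCode (or any devcontainer-aware tool) spins up the compose stack and drops you into the `web` container with the editor attached, so "clone and open" is the whole setup.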

porridgeraisin · a year ago
> Poor Python DX is a skill issue

Oh yes, the language whose ecosystem only hears about backwards compatibility during its own death marches? Not its problem. It's the developers, it's _their_ problem.

Not the standard library which _removes_ packages, breaking code which I recently cloned. See "imp".

And not the next python version, which throws a syntax error on bare excepts, breaking old code for absolutely zero benefit beyond pretending to be a linter.
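The `imp` case is concrete: deprecated for years, then removed outright in Python 3.12, so old code has to be ported to `importlib` along these lines (module and file names hypothetical; the printf fakes the old module):

```shell
printf 'VALUE = 42\n' > legacy_mod.py      # stand-in for an old module
python3 - <<'EOF'
# before 3.12: import imp; mod = imp.load_source("legacy_mod", "legacy_mod.py")
import importlib.util
spec = importlib.util.spec_from_file_location("legacy_mod", "legacy_mod.py")
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
print(mod.VALUE)                           # prints 42
EOF
```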

uludag · a year ago
I was reflecting on this article, thinking about what software tools and languages I use that reflect this property, and a weird realization came to mind: Emacs lisp is by far the best language I use in this regard. I literally copy-paste 20+ year old code, eval it, and every time it just works. Then if I want to debug it: C-u C-M-x, and I'm instantly stepping through the code.

Something this old shouldn't have this property. Nothing "modern" even comes close. Look at the top languages, Python, JavaScript, and Java: you don't have to look hard to see how abysmal they are in this regard.

karthink · a year ago
> Something this old shouldn't have this property.

It's not an accident -- reading through the emacs-devel mailing list, it's easy to see how much effort the maintainers pour into backward compatibility. It's one of Emacs' unspoken guiding principles[1].

At the same time, it's not that surprising either. Emacs does not have other objectives that more modern languages/ecosystems do: no revenue or growth targets, corporations or VCs breathing down its neck, or a mandate to be "modern". Its most vocal and experienced users, who are also its volunteer maintainers, decide what its priorities should be. Since they've been using it for decades, backward compatibility is high on the list.

[1]: Its "spoken" guiding principle being to further the goals of the GNU project.

sethammons · a year ago
As a principal software engineer, my life is moving larger orgs closer to this model. I have lived it when it works. It is critical to so many things. I feel like I've stepped out of Plato's Cave when I couple this stuff with structured logs, metrics, dashboards, and alerts. So many shops don't understand that this stuff gives you wings.
aaronblohowiak · a year ago
I like this. Couple things to add:

Fast setup and revision are important, but they're an incomplete list of maintenance tasks; are metrics/logs predictably named and accessed? Can you perform manual experimentation without a hard-to-configure client (i.e. hit the server with a browser or run a CLI)?

Also, "cycle time" or "revision time" are so important, but I haven't found a good way to do that with AI model development :( any tips here?

anonyfox · a year ago
Nowadays I've come to the conclusion that "ease of maintenance" is the most important feature a project can have. The only thing more critical is that the project itself is valuable enough; so many engineers optimize things that shouldn't exist in the first place.

Easy to maintain is not only about keeping something alive with minimal effort over long periods of time. It also plays a pivotal role in scalability in any direction. Adding more engineers/teams, adding more unforeseeable features, iterating quickly in general, surviving more traffic/load, removing technical bottlenecks, ... everything is so much easier when the project is easy to work with and maintain.

chanux · a year ago
Cannot be overstated! I've spent countless hours trying to understand systems built by others (dozens of others of various skill levels) to try and bring the code to a more maintainable posture. Sometimes it feels like a thankless job but it's a rather selfish endeavor because first and foremost, I want to save my future self from suffering.
stevepike · a year ago
I think the kind of application here matters a lot, specifically whether you're trying to make a change to a web app or if you're hacking on library code.

In ruby, for example, I can pretty trivially clone any open source gem and run the specs in < 5 minutes. Patching something and opening a PR in under an hour is definitely standard.

On the other hand, getting a development environment running for a company's proprietary web app is often a hassle. Mostly though this isn't because of the language or dependencies, it's because of:

  - Getting all the dependent services up and running (postgres version X, redis Y, whatever else) with appropriate seed data. 
  - Getting access to development secrets
My company (infield.ai) upgrades legacy apps, so we deal with setting up a lot of these. We run them in individual siloed remote developer environments using devcontainers. It works OK once we've configured the service containers.
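Pinning those dependent services is most of the battle; a sketch of the compose file such a devcontainer points at (versions and credentials hypothetical):

```yaml
services:
  postgres:
    image: postgres:14        # match the app's expected major version
    environment:
      POSTGRES_PASSWORD: dev
  redis:
    image: redis:6
  web:
    build: .
    depends_on: [postgres, redis]
```

Seed data and development secrets still need their own story, which is usually where the remaining friction lives.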

afiodorov · a year ago
More software is written than kept. It's harder to write useful software than to configure CI/CD. The latter is a problem that has been solved before, whereas the chances of any codebase being useful enough to be worth maintaining are very low.