armchairhacker · a year ago
This is a behind-the-scenes sort of improvement which has been happening since the late 2010s while real applications have seemingly gotten worse.

What’s actually happening is that there are a lot of bloated, buggy, badly-designed programs today because they simply wouldn’t have existed prior to the improvements in developer tooling. In some cases because the software has a much broader scope made possible by the tooling (distributed “webscale” sites with massive data throughput and configurability). Alternatively (or additionally), sometimes the developer(s) are mistakenly using a tool (e.g. Kubernetes) that is way too webscale/generalized for their purpose and/or itself wouldn’t have existed prior to improvements in development tooling.

Good software still exists today. It’s just gold lying around in a forest of junk, whereas it used to be more like less-shiny gold lying around in a barren plain.

zer00eyz · a year ago
The web was better when there was a barrier to entry. You must be at least this smart to host a website turned into "post on social media platform xxxx".

Development was better when you must be at least this skilled to build software.

I read a post yesterday where someone was so proud of their bad permissions system... They were unaware of how Linux permissions and groups work, of what a group even is (and of how it might be represented in a directory to be managed).

As for "web scale": all of that autoscaling is just auto-spending. "Cattle, not pets" would be a great metaphor, but cattle are valuable; most people are more like contract chicken farmers and own nothing. (If that analogy is lost on you, do your homework and be appalled, but it is on point.)

Remember kids, it's called a container for a reason: it's how we isolate your shitty application from bringing down the rest of the system, or the hardware that you don't understand!

conor- · a year ago
> its called a container for a reason, its how we isolate your shitty application from bringing down the rest of the system or the hardware that you dont understand

This reads as extremely cynical and also uncharitable towards the benefits containers provide around managing different versions of dependencies or run-times for the people who do understand the hardware.

It's really nice to not have to wrangle multiple versions of Python or set up venvs or whatever else on a target server that may be hosting separate apps. The alternative of having a couple of self-contained containers with nginx in front of them is better in many ways.
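
As a minimal illustration of that point (the base image, version, and entrypoint here are placeholders, not a recommendation), pinning the runtime inside the image means the target server never needs a matching Python install at all:

```dockerfile
# Each app ships its own interpreter version; the host only needs Docker.
FROM python:3.12-slim

WORKDIR /app

# Dependencies resolve inside the image, not in a shared venv on the host.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# nginx on the host (or in a sibling container) proxies to this process.
CMD ["python", "-m", "app"]
```

Two apps needing Python 3.9 and 3.12 can then coexist on one server without either touching the system interpreter.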

jjackson5324 · a year ago
> while real applications have seemingly gotten worse

Can you elaborate on this? I'm not sure I agree.

Modern day applications are doing far, far more than they were 10-15 years ago. Much more traffic, much more data, much more tracking (unfortunately), etc.

QuadmasterXLII · a year ago
Teams is a gigabyte and takes literal minutes to open. FaceTime existed 14 years ago and did not take literal minutes to open.

(I hate to be the sort of person who exaggerates, so I decided to go through the effort of actually timing Teams for this comment so that I could give it a fair number. I return bearing no such measurement, as today it decided to instantly segfault on open.

    Exception Type:        EXC_BAD_ACCESS (SIGSEGV)
    Exception Codes:       KERN_INVALID_ADDRESS at 0x0000000000000008
    Exception Codes:       0x0000000000000001, 0x0000000000000008

    Termination Reason:    Namespace SIGNAL, Code 11 Segmentation fault: 11
    Terminating Process:   exc handler [40516]
)

whstl · a year ago
Performance of software has gotten significantly worse pretty much everywhere. Older computers can't run the things we run today, which is obvious. "Handling more traffic and data" is purely down to the amount of hardware used.

I would also argue that quality has gone down significantly (unless you're talking about enterprise apps, where quality was always abysmal) and that the cost of making software has also increased, even adjusting for developer salaries. But those are just personal opinions.

nottorp · a year ago
> Modern day applications are doing far, far more than they were 10-15 years ago. Much more traffic, much more data, much more tracking (unfortunately), etc.

Yes but are you sure the 'more' is in the user's interest?

I'd say 90% isn't. Out of which maybe half is less development cost because you can throw layers upon layers of bloat upon the user, and half is spyware.


1123581321 · a year ago
This isn’t what he meant, but my optimistic reading is that the average quality of software has gone down because so many more people can make it. People who would have come close but given up can now publish. It does mean a pile of rubble to sift through in the marketplace, but so has every advance in creative tools. And many people are scratching their own itches with software without bothering others with it, which has no downsides.
calrain · a year ago
> With luck, the options for doing so without paying a monthly tithe to dozens of vendors will improve over the next decade!

This is why I've started digging deep into setting up a homelab environment, with the goal of implementing local source control, automation pipelines, AI inference models, and hosting.

I feel it's important to keep skills fresh rather than paying others to do things for you.

It is important to focus on what you're good at, but it's also important to know what is really going on at the infrastructure layer.

BTW, on-prem compute for homelab environments is incredibly cheap right now! (Looking at you, second-hand Dell OptiPlex gear.)

roland35 · a year ago
It's cheap as long as you don't look too closely at your electric bill ;)
calrain · a year ago
This is where picking the right model of hardware shines.

There are low-power machines that idle at 8 watts (Dell OptiPlex Micro Form Factor), and my whole lab costs around USD $100 a year to run at local power prices.
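
The arithmetic behind that kind of figure is easy to sanity-check. A quick sketch (the electricity rate here is an assumed placeholder; substitute your local tariff):

```python
# Rough yearly running cost for a machine idling at a constant draw.
IDLE_WATTS = 8            # e.g. an OptiPlex micro at idle
PRICE_PER_KWH = 0.15      # assumed rate in USD; yours will differ

kwh_per_year = IDLE_WATTS * 24 * 365 / 1000   # watts -> kWh over a year
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
```

An always-on 8 W box works out to roughly 70 kWh a year, i.e. around ten dollars at that assumed rate, so a small fleet staying near $100/year is plausible.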

You're right, it's really important to pick components that are low power.

With Home Assistant automation, you can even start and stop machines so they only draw power when you need, e.g. for a testing pipeline.

I can't recommend the "Jim's Garage" channel on YouTube enough for getting started in this space. Fantastic content!

quickslowdown · a year ago
I have a rack-mount server, 3 "gaming" desktops, a mini Dell, a ton of Raspberry Pis, and a few miscellaneous devices running. My bill is $10 more than my brother's, who's a nurse and has 2 computers, a Switch, and a TV.

I know in theory the electric bill could/should be way higher, but I just haven't found this to be the case.

closewith · a year ago
Depending on where you live, that home lab is also a handy resistance heater.
slily · a year ago
Visual Studio 5-10 years ago with C#/.NET IntelliSense was a great, high-performance experience. I have yet to replicate that in today's bloated Electron IDEs. There are still constant issues with autocompletion and navigation being slow or randomly not working, even on my $3000 desktop PC from last year. I haven't found debugging (properly, with breakpoints and variable tracking) to be viable compared to "printf debugging" since my Visual Studio days. And the UX of VSCode has not caught up to Visual Studio for bigger projects: file navigation is particularly awful, and between the floating toolbars that disappear on a whim and the ridiculously narrow scrollbars, you'd think they're making it difficult on purpose.

It's an improvement from a certain baseline, but Visual Studio set a higher bar for me personally.

berkut · a year ago
Yeah, as someone who's been attempting to use VSCode for Python and Rust dev over the past three years, I'm continually surprised by people who say VSCode plus the rust-analyzer plugin is a good/great IDE environment.

It so often seems non-functional to me: auto-completion just doesn't work consistently, even on really simple things. You normally have to save your file first, and even then, often if I restart VSCode it will work again on something it didn't a minute ago, while other times I can never get it to complete things. And this is happening on two different Linux machines and an MBPro M2 across multiple projects, so I don't think it's just a one-off bad configuration I've somehow got.

Its auto-indenting when writing code is insane as well; it seems to assume everyone is running rustfmt all the time, even on code as they're writing it. It never gets the indents right on new lines for me: I'm forever having to add or remove them.

At the end of the day it's about what you get used to, I think. But Visual Studio 15-20 years ago was pretty good (other than the bloody pause for "Updating intellisense"), I've yet to find anything as good for Python as PyCharm, and QtCreator (CLion was pretty good as well) is still the best Linux/macOS C/C++ dev environment I've found (though recent versions are getting worse IMO, what with all the complicated "Kit" build config stuff).

skydhash · a year ago
My first code editor was Notepad++ and my first IDE was Code::Blocks. Both ran really fast on my Pentium 4 computer with Windows XP, all while providing a much better experience than VSCode.
dweekly · a year ago
Delphi, Turbo Pascal, Visual Basic, and HyperCard were all really fast ways to build GUIs some 20+ years ago.
bluefirebrand · a year ago
I'm always surprised to hear that people like Visual Studio. Personally I've always felt like it was bloated and slow

But I think part of what I dislike about it the most is how opinionated it is about how I structure my projects

Maybe good for huge projects with massive teams, but I've never been able to get myself to conform to it

bigyikes · a year ago
> Both have evolved gradual type systems that might make it easier to hold a large program in an individual’s head productively.

This… is not the primary benefit of type systems. (Not sure I’d agree that this is even an actual benefit - my mental model of a system is unaffected by the presence of types)

In my view, the primary benefit of type systems is that it makes changing code significantly easier, much in the same way that test cases do. They give you confidence that a code change doesn’t cause a regression. Refactoring a code base without types is nightmare fuel. This is not hype.
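
A tiny sketch of what that confidence looks like in practice (Python with type hints; the function and its callers are made up for illustration):

```python
def total_cents(prices: list[int]) -> int:
    """Sum a list of prices expressed in integer cents."""
    return sum(prices)

# If a refactor later changes the representation to float dollars, a
# static checker such as mypy flags every caller still passing
# cents-as-ints -- the mismatch surfaces before the code ever runs,
# which is the "confidence a change doesn't regress" described above.
print(total_cents([199, 250, 51]))
```

Without the annotations, the same refactor fails silently at whichever call site you forgot, usually in production.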

galdosdi · a year ago
You can see type signatures as free documentation, that unlike other documentation, is not allowed by the compiler to fall out of date. Perhaps that's the sort of thing they meant.
packetlost · a year ago
Have IDEs gotten faster, or have processors gotten faster to the point that it's now feasible to use a heavy IDE without wanting to tear your hair out? I personally think it's the latter.

This isn't really an assessment on the utility of IDEs. I think they're a net positive for the people that get benefit from them.

Keirmot · a year ago
Some IDEs have gotten slightly less frustrating with better processors: e.g. Xcode, which in my opinion deserves the star rating it has in the Mac App Store.
doctor_eval · a year ago
> Notably, I have not used Kubernetes but the anecdotal data does not lead me to think I’m missing out on much.

My experience as a single developer on a new project is that the usability of K8s has improved dramatically in the last decade.

I use Vultr VKS; setting up a small cluster takes a few minutes, it’s super cheap, and now I have a deployment environment with automatic redundancy, zero downtime upgrades, and much more.

It’s not perfect, but I absolutely love it, and it’s made my life so much easier. I do hate yaml with a passion, and there’s a lot of boilerplate, but IDEs and templates make it bearable, and in any case it’s a small price to pay for all the automation I get.
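
For a sense of the boilerplate being described, a minimal Deployment manifest looks something like this (the name, image, and port are placeholders):

```yaml
# Minimal Kubernetes Deployment: the replica count gives redundancy,
# and the default RollingUpdate strategy gives zero-downtime upgrades.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app                 # placeholder name
spec:
  replicas: 2                  # automatic redundancy
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: registry.example.com/my-app:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Even this small example shows the repetition (the `app: my-app` label appears three times), which is exactly what templates and IDE snippets paper over.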

redeux · a year ago
> usability of K8s has improved dramatically in the last decade.

I sure hope so. It’s only been around for 9 years!

doctor_eval · a year ago
well to be clear I was specifically referring to the blog post:

> Tools for working on software in the large have improved a lot over since I last considered them ten years ago.

irrational · a year ago
> AI assistants/copilots can wear the hat of “better autocomplete” today and may wear the “help me understand this code” or “help me write a good PR/commit message” hat later. I’m skeptical about the wisdom of handing off the latter to a program, but we’ll see how it goes.

The vast majority of PR/commit messages I see (approaching 100%) are basically a JIRA ticket number and the name of the ticket, and that message is used over and over again for multiple commits under that same ticket. I'm not sure Copilot could do any worse.

namaria · a year ago
No amount of good tooling can compensate for bad reasoning. Good commit messages are ones that fit how the team/organization handles versioning, so work can be parceled out and context recovered with minimal cognitive load for all parties involved.
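
For contrast with ticket-number-only messages, a sketch of a message that actually recovers context (the ticket ID and details are invented for illustration):

```text
PROJ-142: Reject expired tokens on session refresh

The refresh endpoint accepted tokens past their expiry because the
check compared seconds against milliseconds. Fixing it changes
behaviour for clients that relied on the lenient check; the rollout
plan is in the ticket.
```

The subject says what changed, the body says why and what it affects — the parts a future reader cannot reconstruct from the diff alone.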

Shoving another vendor into the chain because the demo looks great is a sign the reasoning is not so great. This kind of layering of poorly thought-out fixes is how most mudballs I've encountered have grown.

xnx · a year ago
> Notably, deploying software doesn’t seem to have improved much at all for the individual developer. Heroku, in its prime, is still the golden standard.

I feel this. There are a lot of PaaS options, but very little portability between them.

noop_joe · a year ago
Is the problem portability or completeness?