stefanos82 · 10 months ago
As soon as I finished reading the article, the very first thing that came to mind was Dieter Rams' "10 Principles of Good Design"; I have been following his principles as much as I can, as they match, more or less, those of the UNIX philosophy:

     1. Good design is innovative
     2. Good design makes a product useful
     3. Good design is aesthetic
     4. Good design makes a product understandable
     5. Good design is unobtrusive
     6. Good design is honest
     7. Good design is long-lasting
     8. Good design is thorough down to the last detail
     9. Good design is environmentally-friendly
    10. Good design is as little design as possible

musicale · 10 months ago
> they match, more or less, those of UNIX's philosophy

     1. Good design is innovative
        UNIX innovated by simplifying Multics -
        throwing away ring security and PL/I's memory safety features.
        Linux innovated by cloning UNIX, giving it away for free,
        and avoiding the lawsuit that sidelined BSD.
     2. Good design makes a product useful
        Yet somehow people managed to use UNIX anyway.
     3. Good design is aesthetic
        UNIX threw away clear, long-form command forms and kept
        short, cryptic abbreviations like "cat" (short for "felis cattus") 
        and "wc" (short for "toilet").
        Its C library helpfully abbreviates "create" as "creat",
        because vowels are expensive.
     4. Good design makes a product understandable
        See #3
     5. Good design is unobtrusive
        That's why UNIX/Linux enthusiasts spend so much time
        configuring their systems rather than using them.
     6. Good design is honest
        The UNIX name indicates it is missing something 
        present in Multics. Similarly, "Linux" is the
        gender-neutralized form of "Linus".
     7. Good design is long-lasting
        Like many stubborn diseases, UNIX has proven hard to eradicate.
     8. Good design is thorough down to the last detail
        UNIX/Linux enthusiasts love using those details
        to try to figure out how to get Wi-Fi, Bluetooth,
        and GPU support partially working on their laptops.
     9. Good design is environmentally-friendly
        Linux recycles most of UNIX's bad ideas, and many
        of its users/apologists.
    10. Good design is as little design as possible
        Linux beats UNIX because it wasn't designed at all.

grose · 10 months ago
Reads like it came straight out of the UNIX-HATERS Handbook, nice. (For those unfamiliar: https://web.mit.edu/~simsong/www/ugh.pdf)
cjfd · 10 months ago
Abbreviating create as 'creat' is a bit stupid, but it is the kind of quirk that makes me feel at home. The opposite can be found here: https://devblogs.microsoft.com/scripting/weekend-scripter-us... . That is a world where, well... I have switched jobs once specifically to get out of that world.
ripped_britches · 10 months ago
I don’t know enough about kernel development to agree or disagree but I am thoroughly entertained
avidiax · 10 months ago
These don't sound like the UNIX philosophy. My impression is that UNIX is more like what's outlined here:

https://web.stanford.edu/class/archive/cs/cs240/cs240.1236/o...

unscaled · 10 months ago
And the "Worse is Better" approach follows some good design principles, but in a very twisted way: the program is designed to minimize the effort the programmer needs to spend writing it.

Implementation simplicity meant one important thing: Unix could be quickly developed and iterated on. When Unix was still new, this was a boon and Unix grew rapidly, but at some point backward compatibility had to be maintained and we were left with a lot of cruft.

Unfortunately, since implementation simplicity and development speed nearly always took precedence over everything else, this cruft could be quite painful. If you look at the C standard library and traditional Unix tools, they are generally quite user hostile. The simple tools like "cat" and "wc" are simple enough to remain useful, but most of the tools have severe shortcomings, either in their interface, a lack of critical features, or their entire design. For example:

1. ls was never really designed to output directory data in a way that can be parsed by other programs. It is so bad that "Don't parse ls" became a famous warning for shell script writers[1].
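The failure mode is easy to reproduce. Here is a minimal sketch (in a throwaway temp directory; the filename is deliberately contrived) of why line-counting ls output goes wrong while plain shell globbing does not:

```shell
#!/bin/sh
# Sketch of why parsing ls breaks: a filename containing a newline
# looks like two entries to any line-oriented consumer of ls output.
dir=$(mktemp -d)
: > "$dir/one
two"                          # ONE file whose name contains a newline

naive=$(ls "$dir" | wc -l)    # counting lines of ls output: sees 2 "files"

set -- "$dir"/*               # shell globbing: no parsing involved
robust=$#                     # sees the 1 file that actually exists

echo "naive=$naive robust=$robust"
rm -rf "$dir"
```

When ls writes to a pipe it emits the raw name, newline and all, so the naive count is off by one for this single file.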

2. find has a very weird expression language that is hard to use or remember. It also never really heard about the "do one thing well" part of Unix philosophy and decided that "be mediocre at multiple things" is a better approach. Of course, finding files with complex queries and executing complex actions as a result is not an easy task. But find makes even the simplest things harder than they should be.

A good counterexample is "fd"[2]. You want to find a file that has "foo" somewhere in its name in the current directory and display the path in a friendly manner? fd foo vs. find . -name '*foo*' -printf "%P\n". Want to find all .py files and run "wc -l" on each of them? fd --extension py --exec wc -l (or "fd -e py -x wc -l" if you like it short), while find requires you to write find . -name '*.py' -exec wc -l {} \;. I keep forgetting that and have to search the manual every time.

Oh, and as a bonus, if you forget to quote your wildcards for find they may (or may not!) be expanded by the shell, and end up giving you completely unexpected results. Great foolproof design.
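A small sketch of that pitfall (again in a throwaway temp directory; the file names are contrived):

```shell
#!/bin/sh
# Sketch of the wildcard-quoting pitfall with find: an unquoted
# pattern is expanded by the shell before find ever sees it.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
mkdir sub
: > a.py
: > sub/b.py

quoted=$(find . -name '*.py' | wc -l)   # find gets the pattern: 2 matches

# The shell expands *.py to "a.py" (its only match in the current
# directory), so find looks for that literal name: sub/b.py is missed.
unquoted=$(find . -name *.py | wc -l)

echo "quoted=$quoted unquoted=$unquoted"
```

Worse, if the pattern matched nothing in the current directory it would be passed through unexpanded and "work", which is exactly the "may (or may not!)" part.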

3. sed is yet another utility that is just too hard to learn. Most people mostly use it as a regex find-and-replace tool in pipes nowadays, but its regex syntax is quite lacking. This is not entirely sed's fault, since it predates Perl and PCRE, which set the modern standard for regular expressions that we expect to work more or less the same everywhere. But it is another example of a tool that badly violates the principles of good design.
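One concrete trip-up: sed's default Basic Regular Expressions treat "+" as a literal character, so a Perl-style pattern silently matches nothing. A minimal sketch (the -E extended-regex flag is supported by current GNU and BSD sed):

```shell
#!/bin/sh
# Sketch of the regex-dialect gap: in sed's default BRE syntax
# "+" is a literal character, not a quantifier.
bre=$(echo "aaa" | sed 's/a+/X/')     # no "a+" substring, nothing replaced
ere=$(echo "aaa" | sed -E 's/a+/X/')  # ERE: "a+" matches the whole run

echo "bre=$bre ere=$ere"
```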

The Unix Haters Handbook is full of many more examples, but the reality is that Unix won because other OSes could not deliver what their users needed fast enough. Unix even brought some good ideas to the mass market (like pipes) even if the implementation was messy. We now live under the shadow of its legacy, for better or worse.

But I don't think we should idolize the Unix philosophy. It is mostly a set of principles ("everything is a file", "everything is text", "each tool should do one job", "write programs to work together") that was never strictly followed (many things in UNIX are not files, many commands do multiple jobs, and most commands don't interact nicely with each other unless all you care about is opaque lines). But most importantly, the Unix philosophy doesn't extend much beyond designing composable command line tools that handle line-oriented text for power users.

[1] https://mywiki.wooledge.org/ParsingLs

[2] https://github.com/sharkdp/fd

blueflow · 10 months ago
The rules are too generic to be useful, because mankind still can't agree on what is "innovative", "useful", "aesthetic", ... and what isn't.

Only rules 7 and 9 are measurable and not purely subjective.

makeitdouble · 10 months ago
And then rule 7. is debatable.

If you design for an ephemeral state, it doesn't make sense to be long lasting.

3D printing a door handle that perfectly fits my hand, my door, the space it moves in and only lasts until I move to another house can be the ultimate perfect design _for me_.

I'd say the same for prosthetic limbs that could evolve as the wearer evolves (e.g. grows up or ages) or as what they expect from it changes.

epolanski · 10 months ago
> Good design is innovative

Why?

Maybe it's meant in an artistic sense, but under an engineering one I just don't see it.

docmars · 10 months ago
I read "innovative" as coming up with a novel solution to a known problem.

Sometimes the things we consider already solved can be solved better with nuances that maybe weren't considered before.

idle_zealot · 10 months ago
If it's not innovative, then you're reinventing the wheel and would be better off using someone else's good design.
scarface_74 · 10 months ago
Artistic design leads you to the unnecessary skeuomorphism of iOS 1-6
rapnie · 10 months ago
Also based on the UNIX philosophy is Dan North's idea of Joyful Coding, which does away with the formal SOLID principles in favor of the more playful CUPID ones: https://cupid.dev
MadWombat · 10 months ago
It seems to be one of those "pick any two" jokes, but those usually only have three items on the list. And yet pretty much everything on this list feels mutually exclusive.
hobs · 10 months ago
Everything has tradeoffs, but let's just list out the things you called mutually exclusive:

innovative vs useful, understandable vs honest, long lasting vs thorough, aesthetic vs unobtrusive,

What?

0x1ceb00da · 10 months ago
> Good design is honest

> Everything is a file

hannasm · 10 months ago
Is your interpretation that these two statements are at odds? What even is the intended meaning of "a file"?

To me it could be:

Something accessible via a file descriptor that can be read from and/or written to. Feel free to add some other details like permissions, as needed.

Perhaps they should allow for folders as well, since a complex piece of hardware undoubtedly needs to expose multiple files to be robust, but the underlying intention was to create a standardized way of interacting with hardware.

Sectors on disk, switches, analog devices like a speaker, i2c, and other hardware all need to be read from or written to in some form to be useful.
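That interpretation can be sketched in a couple of lines: the same read calls work on regular files, devices, and (on Linux) kernel state exposed under /proc:

```shell
#!/bin/sh
# Sketch of the file-descriptor reading of "everything is a file":
# devices and kernel state answer to the same read operations as
# ordinary files. /dev/urandom exists on Linux/BSD/macOS;
# /proc/uptime is Linux-only, so it is guarded.
bytes=$(head -c 8 /dev/urandom | wc -c)   # a device, read like a file
echo "read $bytes bytes from /dev/urandom"

if [ -r /proc/uptime ]; then
  read -r uptime _ < /proc/uptime         # kernel state, read like a file
  echo "uptime: ${uptime}s"
fi
```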

pinoy420 · 10 months ago
> Good design is aesthetic

> Xorg

I guess it didn’t say pleasing?

gyomu · 10 months ago
Rams’ principles were perhaps noteworthy when he first vocalized them, as the state of design discourse was much more primitive back then (not even sure of that, actually), but today they ring quite simplistic and hollow, and are kind of useless as actual decision-making tools.

“Good design is as little design as possible” ok cool but I have 30 different feature requests coming in every week from actual users, that doesn’t really help me make concrete design decisions

“Good design is aesthetic” whose aesthetic? Aesthetic is extremely cultural and arbitrary, my designers are fighting over whether a photorealistic leather texture is more aesthetic than a gradient texture background, how does that help?

“Good design makes a product useful” uh yeah okay I’ve never had a design problem that was solved by someone saying “you know what, we should just make this product useful” “oooh right how did we not think of that”.

I mean these principles all sound good and high falutin’ as abstract statements, but I’ve never found them useful or actionable in my 15+ years working as a designer.

skydhash · 10 months ago
My takes:

“Good design is as little design as possible”

What you create should be a natural flow of what your clients need to do. Don't go and add a lot of options like a plane cockpit. Which usually means trying to find the common theme and adding on top, and also clamping down on fantasy wishes.

"Good design is aesthetic"

I'd take the definition of pleasing rather than beautiful for the term. When learning to draw, a common piece of advice is to focus on and detail only a single part of the whole picture; everything else can be left out. So arguing over a single element is usually meaningless. If it's not going to be the focal point of interaction, then as long as it meshes into the whole, no one cares about the exact details.

“Good design makes a product useful”

Usability is a whole field, and you can find the whole corpus under the HCI (Human Computer Interaction) keyword. Focus on meeting this baseline, then add your creativity on top.

> I mean these principles all sound good and high falutin’ as abstract statements, but I’ve never found them useful or actionable

It's kinda like philosophy: you have to understand what it means for yourself. It's not a cake recipe to follow, but more of a framework from which to derive your own methodology.

brailsafe · 10 months ago
Ya, but how many of the results of what you're describing as obvious are evaluated critically afterward, based on their intention?

If you're working on a piece of software, how likely is it that people are regularly comparing it to the most effective alternative means to accomplish the same task, ready to reverse course if it turns out you've actually created a more convoluted and time-consuming path to the same outcome? Oftentimes, software just gets in the way and makes life less easy than it would have been otherwise.

The opposite of these principles is often easier to reason about. For example, people attempting to make "better" versions of Hacker News seem to rarely be aware of these, and when they post to Show HN, hopefully realize that the way it is is hard to beat because it follows at least some of the principles. To make something better, you'd absolutely need to follow similar principles more effectively.

lloeki · 10 months ago
This page has the 10 principles, along with a small text and a visual illustration for each one.

https://www.vitsoe.com/us/about/good-design#good-design-is-i...

scarface_74 · 10 months ago
And most of that misses the goal of why you write software for a business.

You write software for a company so someone will give them money for it, or so the company can save money.

Everything else takes a backseat to that core mission. The first goal when writing software is to efficiently get to a point where one of those goals can be met.

It makes no sense to polish software if you are going to run out of money before it gets released, management cuts funding or you can’t find product market fit to convince investors to give you money depending on what your goal is.

Code doesn’t always need to be long lasting, you can’t predict how the market will change, how the underlying platform will change, when a better funded competitor will come in and eat your lunch, etc.

Good design doesn’t need to be “innovative”. It needs to fit within the norms of the platform or ecosystem it is part of.

Doches · 10 months ago
Good thing not all of us write software for a business.

I write little utilities for my parents, games for my son, a web shop for my wife. I write social spaces for myself and my friends. I write tools for myself.

I write software for strangers on the internet. I write software when I’m drunk, to test myself. Sometimes I write software simply for the joy of writing it, and it never runs again after that first “ah, finally!” moment. Aah, time well spent.

Equating “writing software” with “fulfilling business goals” is…quite depressing. Get outside! Feel the sunshine on your face! Write your own image-processing DSL, just for the sheer unbridled hell of it! Learn Nim! Why not?

(Ok, maybe skip the last one)

jwr · 10 months ago
> And most of that misses the goal of why you write software for a business. You write software for a company so someone will give them money for it or so the company can save money

Hmm. I run a solo-founder SaaS business. I write software for my customers so that they can improve their workflows: essentially, work faster, with fewer mistakes, and make work easier. My customers pay me money if my software improves their lives and workflows. If it doesn't live up to the promise, they stop paying me.

Most of Dieter Rams's design rules very much do apply to software that I write. I can't always afford to follow all of these rules, but I'm aiming to.

And while I don't always agree with antirez, his post resonated with me. Many good points there.

Incidentally, many of the aberrations he mentioned are side-effects of work-for-hire: if you work for a company and get a monthly salary, you are not directly connected to customers, do not know where the money comes from, and you are not constrained by time and money. In contrast to that, if you are the sole person in a business, you really care about what is the priority. You don't spend time on useless rewrites, you are super careful with dependencies (because they end up costing so much maintenance in the future), you comment your code because you do so much that you forget everything quickly, and you minimize complexity, because simpler things are easier to build, debug and maintain.

ozim · 10 months ago
No one forces anyone to write software for the business or for profit.

Everybody can still write software however they like; just don't expect to earn money on it.

begueradj · 10 months ago
What is "good design" ? That's the question.


frontalier · 10 months ago
why is it 10? why not 7 or 15?
scarface_74 · 10 months ago
Opposite anecdote, in the 2000s, I worked at a company that had dozens of computers running jobs, built out a server room with raised floors and built out a SAN to store a whopping 3 TB of data and we had a home grown VB6 job scheduler that orchestrated jobs across the computers running Object Rexx scripts.

We had our own internal load balancer, web servers, mail servers, ftp servers to receive and send files, and home grown software.

Now I could reproduce the entire setup within a week at most with some yaml files and hosted cloud services. All of the server architecture is "abstracted", one of the things he complains about.

As far as backwards compatibility goes, worshipping at the throne of backwards compatibility is one reason that Windows is the shit show it is. Even back in the mid 2000s there were over a dozen ways to represent a string when programming, and you had to convert back and forth between them.

Apple has been able to migrate between 5 processors during its existence by breaking backwards compatibility and even remove entire processing subsystems from ARM chips by removing 32 bit code compatibility.

mihaaly · 10 months ago
About compatibility.

Windows is a shitshow because the leadership is chaotic, dragged all around all the time, never finishing anything well. They only survived because of backward compatibility! Building on the unlikely success of the 90s.

Also, why do I have to install new software every couple of months to access my bank account, secure chat, flight booking system, etc., without any noticeable difference in operation and functionality? A lot of things unreasonably become incompatible with 'old' (we are talking about months, for f's sake!!) versions. That's a nuisance and an erosion of trust.

BirAdam · 10 months ago
I wouldn’t call Microsoft’s success in the 90s unlikely. They had a decent product at a low price for commodity hardware when nothing else was as good for as cheap. They also had decent marketing. The company worked hard and delivered. That’s not unlikely, it’s good execution. The unlikely part was something more like OSX taking Microsoft developer market monopoly away.
ripped_britches · 10 months ago
The app updates you mention are most likely due to the problem of not being able to hot update client side code easily / at all in the google/apple ecosystems.

Web actually excels here because you can use service workers to manage versioning and caching so that backwards compatibility is never a concern.

chikere232 · 10 months ago
> Also, why do I have to install new software every couple of months to access my bank account, secure chat, flight booking system, etc., without any noticeable difference in operation and functionality? A lot of things unreasonably become incompatible with 'old' (we are talking about months, for f's sake!!) versions. That's a nuisance and an erosion of trust.

Are you talking about security updates?

amrocha · 10 months ago
I’m not sure where you live, but I’ve never had to install anything to access the features you described, going back over a decade.

All of that has been solved by the web at this point.

scarface_74 · 10 months ago
Apple's leadership hasn't always been a shining light on a hill, especially during the 90s, and they still managed the 68K to PPC transition.
Arelius · 10 months ago
> worshipping at the throne of backwards compatibility is one reason that Windows is the shit show it is

You say Windows is a shit show, but as someone who has developed a lot on both Windows and Linux, Linux is just as much a shit show just in different ways.

And it's really nice being able to trust that binaries I built a decade ago just run on Windows.

inglor_cz · 10 months ago
"And it's really nice being able to trust that binaries I built a decade ago just run on Windows."

Couldn't this need be met by an emulator for older architectures?

There would be a performance cost, but maybe the newer processors would more than make up for it.

scarface_74 · 10 months ago
Linux is a shit show because there is no driver standard, among other reasons.
dpkonofa · 10 months ago
>Apple has been able to migrate between 5 processors during its existence by breaking backwards compatibility and even remove entire processing subsystems from ARM chips by removing 32 bit code compatibility.

I would consider myself an Apple evangelist, for the most part, and even I can recognize what's been lost by Apple breaking backwards compatibility every time they need to shift direction. While the philosophy is great for making sure that things are modern and maintained, there is definitely a non-insignificant amount of value that is lost, not just historically but also in general, by the paradigm of constantly moving forward without regard for maintaining compatibility with the past.

scarface_74 · 10 months ago
What was the alternative? Sticking with 65x02, 68K, or PPC?

They could have stuck with x86 I guess. But was moving to ARM really a bad idea?

They were able to remove entire sections of the processor by getting rid of 32 bit support, saving memory and storage by not having 32 bit and 64 bit code running at the same time. When 32 bit code ran, it had to load 32 bit versions of the shared libraries, and 64 bit code had to have its own versions.

sgarland · 10 months ago
> All of the server architected is “abstracted”. One of the things he complains about.

This is my personal bugbear, so I’ll admit I’m biased.

Infrastructure abstractions are both great and terrible. The great part is you can often produce your desired end product much more quickly. The terrible part is you’re no longer required to have the faintest idea of how it all works.

Hardware isn’t fun if it’s not working, I get that. One of my home servers hard-locked yesterday to the point that IPMI power commands didn’t work, and also somehow, the CPUs were overheating (fans stopped spinning is all I can assume). System logs following a hard reset via power cables yielded zero information, and it seems fine now. This is not enjoyable; I much rather would’ve spent that hour of my life finishing the game of Wingspan.

But at the same time, I know a fair amount about hardware and Linux administration, and that knowledge has been borne of breaking things (or having them break on me), then fixing them; of wondering, “can I automate X?”; etc.

I’m not saying that everyone needs to run their own servers, but at the very least, I think it’s an extremely valuable skill to know how to manage a service on a Linux server. Perhaps then the meaning of abstractions like CPU requests vs. limits will become clear, and disk-full messages will cause one not to spam logs with everything under the sun.

3ple_alpha · 10 months ago
You can also reproduce it within a week without hosted cloud services. What matters is that you don't have to develop custom software and can instead spend that week writing config files and orchestration scripts, be it cloud stuff, docker containers or whatever.
scarface_74 · 10 months ago
I can reproduce it without cloud services, sure. But then I have to maintain it. Make it fault tolerant. Make sure it stays patched and backed up, buy enough hardware to handle peak capacity instead of having elasticity, etc.

I have done all of this myself with on-prem servers that I could walk to. I know exactly what's involved, and it would be silly to do that these days.

forrestthewoods · 10 months ago
Windows is such a shit show that the number one Linux gaming platform works because it turns out the best API for Linux is… Win32.

Linux is a far bigger shit show. At least at the platform level. Windows is a lesser shitshow at the presentation layer

scarface_74 · 10 months ago
On second thought, as a user, Windows itself is…fine. Even as a developer with VSCode + WSL, it’s…fine

It’s more about x86. Using an x86 laptop feels archaic in 2025. Compared to my M2 MacBook Air or my work M3 MacBook Pro.

The only thing that makes MacOS much better for me is the ecosystem integration

worthless-trash · 10 months ago
Incorrect. Android has more games than Linux installations.


worik · 10 months ago
> As far as backwards compatibility, worshipping at the throne of backwards compatibility is one reason that Windows is the shit show it is.

Not entirely, there are other reasons too

But we should respect semantic versioning. Python is a dreadful sinner in that respect.

foobiekr · 10 months ago
Semantic versioning is an illusion. It's a human-managed attempt to convey things about the surfaces and behaviors of software systems. Best case, it isn't completely misleading and a waste of everyone's time.

There is no perfection here, but the correct way to reason about this is to have schema-based systems where the surfaces and state machines are in high level representations and changes can be analyzed automatically without some human bumping the numbers.

eviks · 10 months ago
So what great benefit did removing 32 bits bring? Are you able to use a single string type without conversions with Swift, or has that part never disappeared in those 5 migrations?
scarface_74 · 10 months ago
Apple was able to physically remove hardware support for 32 bit code and shrink the processor/use the die space for other purposes.

Also, when you have 32 bit and 64 bit code, you have to have 32 bit and 64 bit versions of the framework both in memory and on the disk.

This is especially problematic with iOS devices that don’t have swap.

aqueueaqueue · 10 months ago
If it was 3TB then, to be fair, you should compare it to 3PB now.
et1337 · 10 months ago
A lot of these points are opposites of each other, so much so that it seems intentional. We’re to maintain backward compatibility, yet eschew complexity.

We’re to reinvent the wheel for ourselves instead of relying on huge dependency trees, yet new languages and frameworks (the most common form of wheel reinventing) are verboten.

The only way I can think to meet all these demands is for everyone (other than you, of course) to stop writing code.

And I gotta say, a part of me wishes for that at least once a day, but it’s not a part of me I’m proud of.

afro88 · 10 months ago
It's a vent rather than a thesis. There isn't really any logic to it, it's just a list of frustrations said in a poetic way.

In my opinion, vents are fine amongst friends. Catharsis and all that. But in public they stir up a lot of negative energy without anything productive to channel it to (any solutions, calls to action etc). That sucks.

pdimitar · 10 months ago
Sucks or not, it's pretty logical. Venting is not interesting to me if I am reading something on HN. Or if it is, it's rare.

Also what you call "negative energy" I would often call "rightful criticism on non-distilled thoughts that have internal controversies".

derangedHorse · 10 months ago
One can eschew complexity without breaking changes. If the initial abstractions are done well enough, a lot of things can be swapped out with minimal breakage. There are also ways to keep existing apis while adding new ones. Outdated apis also don't need to change every time an underlying dependency changes if the abstraction is good enough.
enriquto · 10 months ago
> A lot of these points are opposites of each other, so much so that it seems intentional.

The writing style is reminiscent of this famous text (not written in jest; you have to understand that most of these statements depend on a context that is for you to provide):

"Property is theft." -- P.J. Proudhon

"Property is liberty." -- P.J. Proudhon

"Property is impossible." -- P.J. Proudhon

"A foolish consistency is the hobgoblin of little minds." -- R.W. Emerson

siev · 10 months ago
It also reads to me like the contradictions are intentional. Something about software today feeling samey and churned-out.
abathur · 10 months ago
I agree (and I think the ~point is that weighing and being thoughtful about how tradeoffs affect a given project is part of the process).
sigbottle · 10 months ago
It's something I've been thinking about for a while.

Most logic and language is worst case. You go all in or not on a concept.

Purity culture probably arose from this (not just sexual, but in so many areas of life, we view the binary not as "bad vs good", but "at least once bad, e.g. polluted" versus "always good").

I mean, I'm sure we've all heard a bunch of conflicting messages in many different areas of our lives, and we have rationalizations for why that is (oh, those guys belong to a different political group, different viewpoints, etc.), often just to save on computation power, because how tf are you gonna answer these grand philosophical questions in an afternoon, much less your lifetime? It's very halting-problem-esque. Nevertheless, we still make heuristics, live in the muddy waters, and live.

austin-cheney · 10 months ago
The most direct resolution to the conflict you mention, and this works for just about everything in life, is:

Just ask yourself: Why would I want to do that?

When somebody suggests nonsense to you, just ask yourself that one simple question. The answer is always, and I mean ALWAYS, one of these three things:

* evidence

* ethos, like: laws, security protocols, or religious beliefs

* narcissism

At that point it's just a matter of extremely direct facts/proofs, or a process of elimination thereof. In the case of bad programmers the answer is almost universally some unspoken notion of comfort/anxiety, which falls under narcissism. That makes sense, because if a person cannot perform their job without, fill in the blank here, they will defend their position using bias, logical fallacies they take to heart, social manipulation, shifting emotion, and other nonstarters.

As to your point about reinventing wheels, you are making a self-serving mistake. It's not about you. It's about some artifact or product. In that regard, reinventing wheels is acceptable. Frameworks and languages are not the product, at least not your product. Frameworks and languages are merely some enabling factor.

pdimitar · 10 months ago
How idealistic of you to assume people will even be able to agree on what's a sensical and nonsensical suggestion.

OP indeed has mutually exclusive phrases. If we ever get to the "extremely direct facts/proofs" stage, then things get super easy, of course.

99% of the problem when working with people is even arriving at that stage.

I predict a "well, work somewhere else then" response which I'll just roll my eyes at. You should know that this comes at a big cost for many. And even more cannot afford it. Literally.

tptacek · 10 months ago
At my first professional software job, where we wrote C, because that was all you could realistically write commercial software in, there was one person on our floor who could do software builds. He used some commercial build tool the company had licensed and he was the only one who knew how to use it or would be allowed to learn how to use it. His customers were every product team in our division --- about 12 of them. We had to petition to get beta releases done. The builds took hours.

I think we're doing fine.

YZF · 10 months ago
What year are we talking about here?

Circa 1982 or so (and some years before) IBM was shipping mainframe software written in Assembly that anyone could build or modify. They had a bug database that customers could access. Around the same era Unix was shipping with source code and all the tooling you needed to build the software and the OS itself.

So maybe compared to some of that we're doing worse.

tptacek · 10 months ago
1998.
dgb23 · 10 months ago
This cracked me up. I love hearing those stories, also from my father, where he would talk about "we already did that in the 80's and it was called...", and then he would tell me about some obscure assembler tricks to patch binaries, in order to keep them compatible with some system I never heard about.

But still, I think we can do better. That story you shared highlights a gross inefficiency and diminishing of agency that comes from dependencies.

phendrenad2 · 10 months ago
I kind of agree. I think software has a certain amount of "badness" that will always exist, it's a mathematical equilibrium. There are a myriad of things that can make your process bad, and if you fix all of them, you'll never ship. The list Pope gives here are the most common issues, but not all teams will check the whole list.
skydhash · 10 months ago
> I think software has a certain amount of "badness" that will always exist

It's fitting a messy, real-world process to the virtual, reduced (because of formalism) computing world, backed by fallible, messy hardware components, through an error-prone, misunderstanding-prone practice of programming.

So many failure points, and no one is willing to bear the costs of reducing them.

aqueueaqueue · 10 months ago
We are doing badly. I bet he was paid enough to buy a good house and it was a low-stress job!
dilyevsky · 10 months ago
Same here except Motorola circa 2007. The build took hours and compiler was licensed from Windriver. We had to schedule time to do builds and test runs using some Rational pos software and we had a dedicated eng in charge of doing merges on release branch
mewpmewp2 · 10 months ago
All the statements in that post are trade offs. In all cases you are sacrificing something to gain something else. Therefore in a way you are always "destroying" something.

Sometimes it is valid to not reinvent the wheel. Sometimes the wheel needs to be reinvented in order to learn. Both happen. Sometimes the decision was right. Sometimes not.

Overall as a whole we are creating things, more than we are destroying. I don't see the need to take a negative stance.

gmueckl · 10 months ago
"Destroying software" is broader than the creation of new, working software artifacts in the moment. The phrase refers to changes in software engineering culture and their long-term effects, not the immediate successes.

Writing a new greenfield project using 10,000 npm dependencies for an Electron-based front end is shockingly easy. But how do you keep that running for the next 15 years? Did the project need npm? Or a web browser? How do all the layers between the language of choice and the bare metal actually behave, and can you reason about that aggregate accurately?

The field has come to a point where a lot of projects are set up with too many complexities that are expedient in the short term and liabilities in the long term.

The current generation of junior devs grows up in this environment. They learn these mistakes as "the right thing to do" when they are questionable and require constant self-reflection and reevaluation. We do not do enough to propagate a hacking culture that values efficiency and simplicity in a way that leads to simple, efficient, stable, reliable and maintainable software. On a spectrum from high-quality craftsmanship to mass-produced single-use crap, software is trending too much toward the latter. It's always a spectrum, not a binary choice. But as a profession, we aren't keeping the right balance overall.

tremendoussss · 10 months ago
I've been a backend engineer for about 10 years, with my last job doing an aws lambda stack.

I started a job in manufacturing a few months ago, and having to think that this has to work for the next 20 years has been a completely different challenge. I don't even trust npm to be able to survive that, so web stuff has been an extra challenge. I landed on Lit web components and just bringing them in via a local CDN.

mewpmewp2 · 10 months ago
The world is full of abstractions on many different levels. Something being on a lower level doesn't inherently make it superior; you can go in either direction on the scale or spectrum. Do you know exactly how the atoms that computers are made of behave? There are plenty of people working on all different sorts of abstractions; new abstractions appear, and demand for lower levels increases when it is needed.

You could say that as more abstractions are built on top of the lower levels, the field's average abstraction level rises, but that is the natural way to evolve. Abstractions allow you to build faster, and the abstractions are possible because of lower-level elements. In the end, if you are measuring what the industry's average level of abstraction is, you can draw the line arbitrarily. You could include the people who use website builders and calculate the average to be even higher. We need people working at all different levels of abstraction. We could divide the groups with two different naming conventions, for lower-level and higher-level engineers, and then technically you could go back to calculating that the average is still where it used to be.

I definitely use npm (or rather pnpm) because I know it will allow me to build whatever I want much faster.

dahart · 10 months ago
Agreed and well said. Furthermore, a lot of the statements in the post are making opposing tradeoffs when you put them together. A bunch of them value experimenting and breaking things, and a bunch of others value using what we already have and not breaking things.

A few of them aren’t decisions any individuals have control over. Most coders aren’t jumping onto new languages and frameworks all the time; that’s an emergent group behavior, a result of there being a very large and growing number of programmers. There’s no reason to think it will ever change, nor that it’s a bad thing. And regardless, there’s no way to control it.

There are multiple reasons people write software fast rather than high quality. Because it’s a time/utility tradeoff, and time is valuable. It’s just a fact that software quality sometimes does not matter. It may not matter when learning or doing research, it may not matter for solo projects, it may not matter for one-off results, and it may not matter when software errors have low or no consequence. Often it’s a business decision, not an engineering decision; to a business, time really is money and the business wants engineering to maximize the utility/time ratio and not rabbit hole on the minutiae of craftsmanship that will not affect customers or sales.

Sometimes quality matters and time is well spent. Sometimes individuals and businesses get it wrong. But not always.

ryandrake · 10 months ago
I guess the rant should be renamed "business is destroying software" because several of the tradeoffs he mentions can be root caused to a commercial entity cutting corners and sacrificing everything on the altar of "developer time" in order to save money. Only a business would come up with the madness of "Move Fast And Break Things."
antirez · 10 months ago
> Overall as a whole we are creating things, more than we are destroying. I don't see the need to take a negative stance.

Fair point: each one of us can think about the balance and decide whether it's positive or negative. But an important exercise must be done here: totally removing AI from the complexity side.

Most of the results that neural networks gave us, given the hardware, could be recreated with a handful of lines of code. It is evident every day that small teams can rewrite training/inference engines from scratch, and so forth. So AI must be removed from the positive (if you believe it's positive; I do) output of the complexities of recent software.

So if you remove AI, since it belongs to the other side, what exactly has the "complicated software world" given us in recent times?

mewpmewp2 · 10 months ago
If we discard the AI, which I don't think we should, but if we do: my life has been enriched a lot in terms of doing things I want to do vs. things I don't. Very quick deliveries; never having to go to a physical store; digital government services online; never having to wait in any queue; the ability to search and find answers without having to go to libraries or know specific people. Online films and TV shows on demand, without ads. There are tons of these things that I feel have made my life so much easier.
ahofmann · 10 months ago
While I highly respect antirez, I think this post is full of good-sounding short statements that wouldn't hold up in a discussion.

One example: Newbies shouldn't reinvent the wheel. I think they should use the tools that are available and common in the given context. When they want to tinker, they should write their own compiler. But they shouldn't use that in production.

Another: Backward API compatibility is a business decision in most cases.

Also, I think it doesn't help to start every sentence with "We are destroying software". This sounds much more gloomy than it really is.

bayindirh · 10 months ago
> Newbies shouldn't reinvent the wheel.

I strongly disagree. They should, and fail, and try again, and fail. The aim is not to reinvent the wheel per se, but to understand why the wheel they're trying to reinvent is so complex and why it is the way it is. This is how I learnt to understand and appreciate the machine, and it gave me great insight.

Maybe not in production at first, but newbies today don't reinvent the wheel in their spare time either. They cobble up 200-package dependency chains to make something simple, because that's what they see and are taught. I can write what many people write with 10 libraries by just using the standard library. My code will become a bit longer, but not much. It'll be faster, more robust, easier to build, smaller, and overall better.

I can do this because I know how to invent the wheel when necessary. They should, too.

> Another: Backward API compatibility is a business decision in most cases.

Yes, business decision of time and money. When everybody says that you're losing money and time by providing better service, and lower quality is OK, management will jump on it, because, monies.

> Also, I think it doesn't help to start every sentence with "We are destroying software". This sounds much more gloomy than it really is.

I think Antirez is spot on. We're destroying software: converting it into something muddy, something that exists for the ends of business and for that alone.

I'm all with Antirez here. Software got here because we developed software just for the sake of it, and evolved it to production-ready where needed. Not the other way around (case in point: Linux).

ryandrake · 10 months ago
> Yes, business decision of time and money. When everybody says that you're losing money and time by providing better service, and lower quality is OK, management will jump on it, because, monies.

Often that "saving money" is just externalizing the cost onto your users. Especially in mobile development. Instead of putting in the tiny amount of effort it takes to continue support for older devices, developers just increase the minimum required OS version, telling users with older hardware to fuck off or buy a new phone.

Another example is when you don't take the time to properly optimize your code, you're offloading that cost onto the user in the form of unnecessarily higher system requirements.

dasil003 · 10 months ago
> I'm all with Antirez here. Software got here because we developed software just for the sake of it, and evolved it to production-ready where needed. Not the other way around (case in point: Linux).

Growing up in the 80s and 90s, I understand viscerally how you feel, but this take strikes me as willfully ignorant of the history of computers and the capitalist incentives that were necessary for their creation. The first computers and the internet itself were funded by the military. The PC wouldn't have existed if mainframes hadn't proved the business value needed to drive costs down to the point where the PC was viable. Even the foundational ideas that led to computers couldn't have existed without funding: Charles Babbage's father was a London banker.

I think a lot of what you are reacting to is the failed promise of free software and the rise of the internet, when the culture was still heavily rooted in 60s counter-culture, but it hadn't crossed the chasm to being mainstream, so it was still possible to envision a utopian future based on the best hopes of a young, humanitarian core of early software pioneers operating largely from the sheltered space of academia.

Of course no such utopian visions ever survive contact with reality. Once the internet was a thing everyone had in their pocket, it was inevitable that software would bend to capitalist forces in ways that directly oppose the vision of the early creators. As evil as we thought Microsoft was in the early 90s, in retrospect this was the calm before the storm for the worst effects of tech. I hear Oppenheimer also had some regrets about his work. On the plus side though, I am happy that I can earn enough of a living working with computers that I have time to ponder these larger questions, and perhaps use a bit of my spare time to contribute something of worth back to the world. Complaining about the big picture of software is a fruitless and frustrating endeavour, instead I am interested in how we can use our expertise and experience to support those ideals that we still believe in.

JKCalhoun · 10 months ago
> Another: Backward API compatibility is a business decision in most cases.

Agree. That statement/sentiment though doesn't refute the point that it's destroying software.

layer8 · 10 months ago
I actually don’t agree. Maintaining or not maintaining backwards compatibility is often a decision made on the technical level, e.g. by a tech lead, or at least heavily based on the advice from technical people, who tend to prefer not being restricted by backwards compatibility over not breaking things for relying parties.
caseyy · 10 months ago
> Newbies shouldn't reinvent the wheel. I think they should use the tools, that are available and common in the given context. When they want to tinker, they should write their own compiler. But they shouldn't use that in production.

So basically they shouldn’t learn the prod systems beyond a shallow understanding?

epolanski · 10 months ago
> newbies shouldn't reinvent the wheel

They absolutely should, or they will never even get to understand why they are using these wheels.

Fun fact: try asking modern web developers to write a form, a simple form, without a library.

They can barely use HTML and the DOM, they have no clue about built-in validation, they have no clue about accessibility, but they can make arguments about useMemo or useWhatever in some ridiculous library they use to build... e-commerce sites and idiotic CRUD apps.
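For what it's worth, the built-in validation being referred to covers the common cases with nothing but attributes. A minimal sketch (the field names and endpoint are made up for illustration):

```html
<!-- A dependency-free form: `required`, `type="email"`, `pattern`,
     and `minlength` are enforced by the browser before submit.
     Labels are associated via for/id, which is the accessibility
     baseline the comment is talking about. -->
<form action="/subscribe" method="post">
  <label for="email">Email</label>
  <input id="email" name="email" type="email" required>

  <label for="user">Username (3 to 20 word characters)</label>
  <input id="user" name="user" pattern="\w{3,20}" required>

  <button type="submit">Subscribe</button>
</form>
```

The browser blocks submission and shows its own error messages until the constraints pass; when you do need script, the Constraint Validation API (`checkValidity()`, `setCustomValidity()`) hooks into the same machinery.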

antirez · 10 months ago
> When they want to tinker, they should write their own compiler. But they shouldn't use that in production.

Why? We should stop telling others how to write/use their code ASAP.

Many established technologies are a total shitstorm. If it is ok to use them, it is ok if somebody wants to use their own compiler.

drawkbox · 10 months ago
These systems also came from tinkering. Most programming languages even are really the investment of one person for a long time, doing apparently what you aren't supposed to do.

When it comes down to it, whatever worked best, usually the simplest, non-breaking option, used to win out. That decision has been taken away from the value creators and handed to the value extractors. It is impossible to extract value before value is created.

Additionally, programming is a creative skill no matter how hard they try to make it not one. Creativity means trying new things and new takes on things. People not doing that will harm us long term.

dahart · 10 months ago
>> But they shouldn’t use that in production.

> Why?

Generally speaking, because that’s very likely to end up being “pushing for rewrites of things that work”, and also a case of not “taking complexity into account when adding features”, and perhaps in some cases “jumping on a new language”, too.

This is an imagined scenario, but the likelihood of someone easily replacing a working compiler in production with something better is pretty low, especially if they’re not a compiler engineer. I’ve watched compiler engineers replace compilers in production and it takes years to get the new one to parity. One person tinkering and learning how to write a compiler almost for sure does not belong in production.

Ygg2 · 10 months ago
> Why?

Those who do not know history are doomed to repeat it. Or re-re-reinvent Lisp.

There was this anecdote about a storm lamp or something. A new recruit comes to a camp and sees the old guard lighting lamps by turning them upside down and lighting them sideways with a long stick. But he knows better, he tells them, and they smirk. The first day, he lights them the optimal way, with a lighter. He's feeling super smug.

But the next day he finds the wick is too short to reach, so he takes the long stick...

A few months later, he's a veteran: he's turning lamps upside down and lighting them sideways with a long stick.

And a fresh recruit says he can do it better. And the now-old guard smirks.

I'm sure I'm misremembering parts, but can't find the original for the life of me.

ahofmann · 10 months ago
> Why?

If someone handed me a project that is full of self-invented stuff, for example a PHP project that invented its own templating or has its own ORM, I would run. There are Laravel, Slim, or Symfony; those are well established and it makes sense to use them. There are so many resources around those frameworks, people who have posted about useful things, and packages that add functionality to them. It just doesn't make sense to reinvent the wheel when those web frameworks and thousands of packages around them exist.

Writing software is standing on the shoulders of giants. We should embrace that, and yes, one should learn the basics, the underlying mechanisms. But one should distinguish between tinkering around and writing software that will be in production for years and therefore worked on by different developers.

The JavaScript world shows how not to do things. Every two years I have to learn the new way of building my stuff. It is annoying and a massive waste of resources. Everyone is always reinventing the wheel and it is exhausting. I understand why it is like this, but we as developers could have made it less painful if we embraced existing code instead of wanting to write our own.

mukunda_johnson · 10 months ago
The shitstorms usually have a community behind them. Even if it sucks, it's supported and will be maintained to the point of "working." If someone writes their own thing, chances are they won't go the extra mile and build a community for it. Then, when it comes to maintaining it later, it might grow untenable, especially if the original "tinkerer" has moved on.
nthingtohide · 10 months ago
This reminds me of Jonathan Blow's talk. Software decays just like everything else if we don't tend to it.

Preventing the Collapse of Civilization / Jonathan Blow (Thekla, Inc)

https://www.youtube.com/watch?v=ZSRHeXYDLko

Software technology is in decline despite appearances of progress. While hardware improvements and machine learning create an illusion of advancement, software's fundamental robustness and reliability are deteriorating. Modern software development has become unnecessarily complex, with excessive abstraction layers making simple tasks difficult. This complexity reduces programmer productivity and hinders knowledge transfer between generations. Society has grown to accept buggy, unreliable software as normal. Unless active steps are taken to simplify software systems across all levels, from operating systems to development tools, civilization faces the risk of significant technological regression similar to historical collapses.

_hao · 10 months ago
I think that talk might be Jonathan Blow's most important work to date, actually. I love Braid and The Witness, but "Preventing the Collapse of Civilization" managed to articulate what at least my circle of devs and I have talked about and discussed for a long time but were never quite able to put into words. I'm very grateful for people like him and others like Casey Muratori, Mike Acton, etc. who have continued to point out this very real danger over (at least) the last decade.

Unfortunately my stance is that fundamentally things won't change until we get hit with some actual hardware limitations again. Most devs and people in general prefer a semblance of a working solution quickly for short-term gains rather than spending the actual time that's needed to create something of high quality that performs well and will work for the next 30 years. It's quite a sad state of affairs.

With that said I'm generally optimistic. There is a small niche community of people that does actually care about these things. Probably won't take over the world, but the light of wisdom won't be lost!

bathtub365 · 10 months ago
Oftentimes there isn’t a need for something to work for the next 30 years, as the business will change on a much shorter timeframe, and software exists to serve the business. While I agree that software quality varies wildly, if the business can’t get off the ground because the software isn’t ready, the software team will quickly stop existing along with the rest of the business.
cageface · 10 months ago
You can get all the software quality you want if you're willing to pay for it.

Users have now been taught that $10 is a lot to pay for an app and the result is a lot of buggy, slow software.

dinkumthinkum · 10 months ago
It’s interesting that all three of the people you mention are very concerned with performance, something most programmers don’t even think about anymore or think they aren’t supposed to.
ChrisMarshallNY · 10 months ago
> Society has grown to accept buggy, unreliable software as normal.

That’s really the entirety of the issue, right there.

People aren’t just accepting bad software, they’re paying for it, which incentivizes a race to the bottom.

It’s like “The War on Drugs,” that looks only at the suppliers, without ever considering the “pull,” created by the consumers.

As long as there are people willing to pay for crap, there will be companies that will make and sell crap.

Not only that, companies that try to make “not-crap,” will be driven out of business.

Animats · 10 months ago
> Society has grown to accept buggy, unreliable software as normal.

That's easy to stop. Disallow warranty disclaimers.

turdprincess · 10 months ago
As an opposite view point I work on a 10 year old legacy iOS app which has almost no abstractions and it’s a huge mess. Most classes are many thousands of lines long and mix every concern together.

We are fixing it up by refactoring - many through adding abstractions.

I’m sure code with bad abstractions can scale poorly, but I’m not clear how code without abstractions can scale at all.

epolanski · 10 months ago
> Most classes are many thousands of lines long and mix every concern together.

That's quite unrelated to abstractions. It's just poorly written code, for whatever reasons may have led there.

nradov · 10 months ago
The notion that software defects could destroy civilization is just so silly. I think some people have the impression that the quality of tools and technologies was somehow better before. But since the beginning of the industrial age, most of our stuff has been kind of crap. You might see good quality stuff preserved by enthusiasts or museums but that's survivorship bias and what most people used was worse. People muddle through somehow.
bdangubic · 10 months ago
except everything fucking runs on software now mate…
washadjeffmad · 10 months ago
That's a feature. Permanent systemization of short term worldviews and lenses is pretty horrifying. That's how those techno-theocratic civilizations in pop culture happen.

Imagine if a company had been able to systemize the whims of a paranoid regime, allowing them to track and spy on their citizens with impunity in secret, and the population became so inured to the idea that it became an accepted facet of their culture.

Or what if a political party dominated a state and systematized a way to identify and detect oppositional thought, stamping it out before a counterculture could ever arise. Maybe that thought was tied to a particular religious, ethnic, and/or cultural group.

What if these companies are here today, selling those products to the highest (nation-state) bidders, and the methods they're employing to keep the conceptual wheels turning at scale rely on filtering that selects for candidates who will gladly jump through abstract hoops set before them, concerned only with the how and never the why of what they're doing.

Deleted Comment

danparsonson · 10 months ago
I think this is a straw man personally.

Think about what happens when two people video call each other from opposite sides of the world. How many layers of hardware and software abstraction are they passing through, to have a reliable encrypted video conversation on a handheld computer where the majority of the lag is due to the speed of light? How much of that would you like to remove?

I would venture an alternative bogeyman - "move fast and break things" AKA the drive for profits. It's perfectly possible (as illustrated above) to reliably extract great performance from a tower of abstractions, while building those abstractions in a way that empowers developers; what's usually missing is the will to spend time and money doing it.

chrz · 10 months ago
> How many layers of hardware and software abstraction are they passing through, to have a reliable encrypted video conversation on a handheld computer where the majority of the lag is due to the speed of light?

And then the video player crashes trying to play an ad

williamcotton · 10 months ago
There are maintenance costs and then there is depreciation/amortization.
jack_h · 10 months ago
> Software technology is in decline despite appearances of progress. While hardware improvements and machine learning create an illusion of advancement, software's fundamental robustness and reliability are deteriorating. Modern software development has become unnecessarily complex, with excessive abstraction layers making simple tasks difficult. This complexity reduces programmer productivity and hinders knowledge transfer between generations. Society has grown to accept buggy, unreliable software as normal. Unless active steps are taken to simplify software systems across all levels, from operating systems to development tools, civilization faces the risk of significant technological regression similar to historical collapses.

I haven't watched that talk by Blow yet so maybe he covers my concern.

I think you have to be mindful of incentives structures and constraints. There's a reason the industry went down the path that it did and if you don't address that directly you are doomed to failure. Consumers want more features, the business demands more stuff to increase its customer base, and the software developers are stuck attempting to meet demand.

On one hand you can invent everything yourself and do away with abstractions. Since I'm in the embedded space I know what this looks like. It is very "productive" in the sense that developers are slinging a lot of code. It isn't maintainable, though, and eventually it becomes a huge problem. First, no one has enough knowledge to really implement everything to the point of it being robust and bug-free. This goes against specialization. How many mechanical engineers are designing their own custom molding machine in order to make parts? Basically none; they all use mold houses. How many electrical engineers are designing their own custom PCB(A) processing machines or ICs/components? Again, basically none. It is financially impossible. Only in software do I regularly see this sentiment. Granted, these aren't perfect 1-to-1 analogies, but hopefully it gets the idea across.

On the other hand you can go down the route of abstractions. This is really what market forces have incentivized. This also has plenty of issues, which are being discussed here.

One thought that I've had, admittedly not fully considered, is that perhaps F/OSS is acting negatively on software in general. When it comes to other engineering disciplines there is a cost associated with what they do. You pay someone to make the molds, the parts from the molds, etc... It's also generally quite expensive. With software, the upfront cost of adopting yet another open source library is zero to the business. That is, there is no effective feedback mechanism of "if we adopt X we need to pay $Y." Like I said, I haven't fully thought this through, but if the cost of software is artificially low, that would seem to indicate the business, and by extension customers, don't see the true cost and are themselves incentivized to ask for more at an artificially low price, thus leading to the issues we are currently seeing. Now don't misread me, I love open source software and have nothing but respect for its developers; I've even committed my fair share to open source projects. As I've learned more about economics I've been trying to view this through the lens of resource allocation, though, and it has led me to this thought.

caseyy · 10 months ago
Interesting. My experience is that bulky abstraction layers are harder to maintain than software of your own.

In game development, whenever we go with highly abstract middleware, it always ends up limiting us in what we can do, at what level of performance, how much we can steer it towards our hardware targets, and similar. Moreover, when teams become so lean that they can only do high level programming and something breaks close to the metal, I’ve seen even “senior” programmers in the AAA industry flail around with no idea how to debug it, and no skills to understand the low level code.

I’ve seen gameplay programmers who don’t understand RAII and graphics programmers who don’t know how to draw a shape with OpenGL. Those are examples of core discipline knowledge lost in the games industry. In other words, we might no longer know how to build from scratch what we already have. Or at least most software engineers in the industry wouldn’t. It cannot end well.

Building your own, in my experience, is a better idea: then you can always steer it, improve and evolve it, and fix it. And you don’t accidentally build companies with knowledge a mile wide and an inch deep, which genuinely cannot ship innovative products (technically, it is impossible).

FridgeSeal · 10 months ago
> Consumers want more features

I’m not even sure if this is true anymore. We got new features foisted on us most of the time.

MonkeyClub · 10 months ago
> Consumers want more features

I no longer think that's true. Instead, I think consumers want reliability, but more features is a way to justify subscription pricing segregation and increases.

nradov · 10 months ago
Most large enterprise IT departments are fully aware that the cost of adopting yet another open source library is very high even if the price is zero. This cost comes in the form of developer training, dependency management, security breaches, patent troll lawsuits, etc. There is a whole niche industry of tools to help those organizations manage their open source bill of materials.
paulryanrogers · 10 months ago
Adopting libraries and 3P solutions is like jumping in the pool: easy to do. Getting out of the pool is much harder. Or in some cases it's like jumping into quicksand. Sometimes it can be hard to tell which before you're in it.
thefz · 10 months ago
I can't take Blow seriously after his meltdown on how difficult it is to test games on different OSes while some developers already released on multiple platforms... all in a one man's band (i.e. https://www.executionunit.com/blog/2019/01/02/how-i-support-...).

Aside, he's treated like a celebrity in the game developer niche and I can't understand why.

JasserInicide · 10 months ago
He has some good takes (like the talk he gave in the OP) but also has some questionable ones. How he feels on work-life balance comes to mind. He seems to legitimately hate having to employ other people to help create his games.
the_mitsuhiko · 10 months ago
It can be difficult and at the same time be possible. Clearly he also released his games on multiple platforms. Braid, I think, was everywhere: Xbox 360, Windows, Mac, Linux, Switch, and mobile phones.
cableshaft · 10 months ago
Just because one person is willing to do everything needed to test on all platforms doesn't mean everyone should therefore be willing to put the time and effort into it.

Depending on what tech you use it can be easier or harder to do as well. I'm making a game with Love2D now, which has made supporting Mac and Linux rather trivial so far, although I've run into challenges with mobile platforms and the web even though it supports them (it does work, but it takes more low-level glue code to support native phone features, and the web export doesn't seem to be well maintained — my game is currently throwing WebAssembly errors when I try to run it).

And my previous game (which is on the backburner for now) was made with Monogame, and while that technically has support for Mac and Linux (as well as mobile), I've had quite a few issues even just getting the Mac version working well: issues with resolution, rendering 3D properly, getting shaders not to break, etc. The maintainers haven't kept up with the latest Mac updates the past few years and have had to make a special push to try to get it caught back up. I've probably sunk a good 20+ hours into getting it working well before putting that aside to work on the actual game again, and I still might have to rearchitect things a bunch to get it there.

Meanwhile Unity would probably be pretty dirt simple to port, for the most part, but it comes with other tradeoffs, like not being open source, and trying to pull a stunt a couple years ago where they pulled the rug out from under developers with changing licensing to something aggressive (that convinced other developers to port their games away from the platform), etc.

And there's Godot, which seems to be getting more support again (which is great, I even considered it for my current game, I just like coding in Love2D a bit better), but if you ever want your game on consoles you have to pay a third party to port your games to consoles for you.

The guy you linked makes their own engine (and to be fair, so does Jonathan Blow, who you're critiquing), which is great, but not everyone wants to get that low level. I would rather spend more time focusing on building the games themselves, which is already hard enough, rather than spending all that time building an engine.

It was for that reason that I spent several years focused on board game design instead (as I can make a working game with drawing on index cards and some colored plastic cubes in less than an hour), although that has its own frustrations with significant hurdles to get your game signed by publishers as an unknown designer (I did get one signed four years ago, and it's still not released yet), and large financial risks being made for manufacturing and distribution.

Edit: Also the person you linked to isn't even sure it was financially worth it to support all of those platforms, they just do it for other reasons:

"Do you make your money back? It’s hard to say definitely which could mean no. Linux and macOS sales are low so the direct rewards are also low. ...For Smith and Winston [their current game] right now I don’t think it’s been worth it financially (but it will probably make it’s money back on Launch)"

ryandrake · 10 months ago
Link to the meltdown? I could use a good chuckle this morning.
jhatemyjob · 10 months ago
Look up Indie Game: The Movie
alabhyajindal · 10 months ago
All Jonathan Blow talks, or rants more accurately, are the same. He personifies old man yells at cloud.
chrz · 10 months ago
An opinionated take from a veteran is the best content, mate
0xbadcafebee · 10 months ago
All of those things have been happening for over a decade. There is no actual discipline of software design today. It's just people googling shit, copy-and-pasting, and praying.

I often work with people who refuse to use software unless it's so well known that they can google for stackoverflow answers or blog walkthroughs. Not because well-known software is stable or feature-filled; no, they literally are just afraid of having to solve a problem themselves. I'm not sure they could do their jobs without canned answers and walkthroughs. (and I'm not even talking about AI)

This is why I keep saying we need a professional standards body. Somebody needs to draw a line in the sand and at least pretend to be real adults. There needs to be an authoritative body that determines the acceptable way to do this job. Not just reading random people's blogs, or skimming forums for sycophants making popular opinions look like the correct ones. Not just doing whatever a person feels like, with their own personal philosophy and justification. There needs to be a minimum standard, at the very least. Ideally also design criteria, standard parts, and a baseline of tolerances to at least have the tiniest inkling if something is going to fall over as soon as someone touches it. Even more should be required for actual safety systems, or things that impact the lives of millions. And the safety-critical stuff should have to be inspected, just like buildings and cars and food and electric devices are.

The lunatics are running the asylum, so that's not happening anytime soon. It will take a long series of disasters for the government to threaten to regulate our jobs, and then we will finally get off our asses and do what we should have long ago.

caseyy · 10 months ago
I agree with the frustration, but I think heavily regulated professions often copy+paste even more. See: modern Western medicine, where much of a general physician's job involves following a flow chart. You get really bad outcomes from it too.

I’d like to have standard professional certification because I could use it as proof of the effort I put into understanding software engineering that many ICs have not. But I think that many people have “that’ll do it” values and whatever professional context you put them in, they will do the worst possible acceptable job. The best you can do is not hire them and we try to do that already — with LeetCode, engineering interviews, and so on. That effort does work when companies make it.

rcxdude · 10 months ago
The design work in fields which are heavily regulated like that is even more copy-and-pasted. Not only will the average engineer be afraid of solving problems themselves, anyone who is willing to do it will be actively discouraged by the processes from doing so, even if the copy-and-paste answers have severe, known flaws. The grass is not greener on the other side here.

(Safety critical work is, in fact, inspected and accredited like you would wish, and I have seen the ugly, ugly, terrifying results inside the sausage factory. It is not a solution for people who don't care or don't have a clue, in fact it empowers them)

0xbadcafebee · 10 months ago
Oh for sure there's some bullshit out there. The self-attestations of the DoD alone are laughable. But I have also seen a number of critical systems with no inspection. Water, power, financial, health care, emergency services, etc. The kind of shit that from a National Security perspective we should have some eyes on, but don't.
anon-3988 · 10 months ago
A standards body standardized C++; clearly, having a standard doesn't help.