20 years since this manifesto lamented the previous 20 years, and it's pretty clear this ship has no captain.
In terms of resource consumption, machine capacity has grown exponentially, but who can claim the same for user utility?
This is by now a problem so pervasive that even economists puzzle over it (the missing information-technology productivity gain).
But this text is really part of the problem, as it is too simplistic and doesn't feel like it identifies fundamental recipes for building better software. Maybe I over-value coherence.
This is just a guess, but I think there are two problems here, not one: 1) inefficiency stricto sensu (more operations required for exactly the same task), and 2) diminishing returns (the kitchen sink being included weighs far more than the rest of the project, but maybe you still shouldn't remove it, because it still provides a small return of user utility).
I'm a proponent of minimalist software design (I wish Plan 9 had won!), but I frankly can't figure out what the author is trying to say. I'm not knowledgeable about the Futurist movement, but I can't see what it might have to say about software design. And I completely lost interest when I got to the bullet point “Structured programming = slow”.
My Linux laptop boots in about 20 seconds. CRT TVs took longer than that to warm up.
> Television - 3 seconds
> Automobile - 2 seconds
> Microwave oven - less than 1 second
> Video game - less than 1 second
> Unix workstation - 120 seconds or more
Yeah, those times are bullshit. Please tell me what platform boots a video game in less than a second, because it hasn't been true for any generation of console I've played on (which dates back to the Sega Genesis).
This feels like it was written around 1995 (the newest reference is 1994, and the "hot new technology" they're shitting on is C++, which suggests that Java hasn't come out yet). There's a general smug air of "real programmers use assembly." For bonus points, they also criticize computer scientists on the basis of "most NEW ideas are feared and rejected", which is honestly a pretty good summary of the diatribe.
Back when video games came on ROM cartridges (before CD-ROMs), there was no 'loading process' at all, since all the data was already mapped into memory. There might be a couple of logos at startup, but that was just wasting the user's time, not hiding any 'loading'.
The fact that game consoles have become more and more like PCs in terms of user experience over the decades is actually pretty sad.
I mean, isn't that kind of the problem? The technology of yesteryear was orders of magnitude slower and had far fewer resources, yet today many tasks take orders of magnitude more time and resources for not even close to an order of magnitude more utility.
Ha. Thanks for mentioning video games in 1 second. I was wondering what universe had video games booting in 1 second, because I'd like to get in on that.
From the user's perspective, a PS5 appears to launch the most recently played game in about 1 second. "Resuming" would probably be a better word than "booting", though.
The Futurists were over-the-top (and sometimes more typographically out-there than this web page, in a pre-DTP time!) so I highly doubt the page is meant to be taken entirely seriously.
The original Futurist Manifesto (1909) [1] is indeed over-the-top and out-there. Much more about them in Wikipedia [2]. I don't see the connection between those original Futurists and the linked Futurist Programming article; maybe none was intended.
[1] https://www.societyforasianart.org/sites/default/files/manif...
[2] https://en.wikipedia.org/wiki/Futurism
Yep. There is (in my opinion) a genealogical line that connects Futurism to Dada to Punk. Certainly their typography presents very similarly. In spirit, they all spent a lot of energy rejecting the status quo and not much in developing alternatives.
What kind of TVs did you have? The CRT TVs I remember in the 1990s and 2000s all "booted" in 1-3 seconds from no power to displaying proper image. The flat-ish and flat TVs that came later also booted comparably quickly. It's only when Smart TVs appeared that I first saw a TV taking more than 5 seconds to start.
(I'm counting the time it took a TV to show broadcast programming or "normal" cable programming. IIRC cable decoder boxes would introduce some extra delay here for "cold start", but those boxes were all shitty garbage from the same kind of companies that now sell smart TVs, and we didn't use them anyway.)
Yep. I remember analog cable/satellite decoder boxes and even the first-generation digital ones being almost instantaneous too.
It was only when they became more like computers that things started going downhill. When they started getting remote updates it became a nightmare.
The set-top box I had before cord-cutting was "always on" and consumed quite a lot of power even when "turned off". It was extremely hot all day. Pulling the plug from the wall when not in use was enough to save money, but then it would take a full minute to power on, and then it had to wait for something from the satellite. I eventually bought a timer so it would turn on at 8am.
> ...about 20 seconds. CRT TVs took longer than that to warm up
You must have had a very crappy CRT TV :) A good TV (or CRT computer monitor) was available near-instantly, and also switched channels much faster than a modern 'digital' TV.
Not quite sure just how old, but archive.org has it since 2006 (earliest HN submission 2011, no comments, though). And here [0] are the delightful viewing notes for the articles:
> These documents were designed to be viewed in a window that is 564 pixels wide when the scroll bar is visible. If you want to make these pages look the way I designed them, adjust the width of your Web browser until the arrows below are fully visible and centered. If you want to use some other width that's fine too.
The "about me" [1] page says some (?) documents on the site were published in HTML in 1994, so I guess the actual dates are even older.
[0] http://www.graficaobscura.com/setup/index.html
[1] http://www.graficaobscura.com/paul/index.html
The idea of rejecting waste has itself been largely rejected in the era of desktop dominance. Who cares if a desktop program runs 100x slower than it should, as long as it 1) runs and 2) is paid for anyway?
Now, in the cloud, YOU care if your program runs 100x slower than it could, because you pay the AWS bill. All the waste is now your expense.
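As a very rough back-of-the-envelope illustration of that point (the hourly rate and job size below are made-up placeholders, not real AWS prices), the bill scales roughly linearly with how long the job runs:

    # Sketch: how a slowdown translates into a cloud bill.
    # HOURLY_RATE_USD is an assumed placeholder, not an actual AWS price.
    HOURLY_RATE_USD = 0.10

    def monthly_bill(job_hours_per_day: float, slowdown: float = 1.0) -> float:
        """Cost of a daily job over 30 days, given a slowdown factor."""
        return job_hours_per_day * slowdown * 30 * HOURLY_RATE_USD

    print(f"${monthly_bill(1.0):.2f} per month")                # $3.00 per month (1 h/day, well optimized)
    print(f"${monthly_bill(1.0, slowdown=100):.2f} per month")  # $300.00 per month (same job, 100x slower)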
The focus on superoptimization is laughable, and puts me off even considering the other opinions presented here. Superoptimization is such a joke in so many cases that it's basically undeployable: if it takes a day of a server's compute to save a microsecond, you'd better be planning to run that program 8.64e+10 times...
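A quick sanity check of that 8.64e+10 figure, taking the comment's own assumptions (one server-day spent optimizing, one microsecond saved per run):

    # Break-even point: runs needed before saving 1 microsecond per run
    # pays back one full day of a server's compute spent on superoptimization.
    SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 s
    SAVING_PER_RUN = 1e-6            # one microsecond, in seconds

    break_even_runs = SECONDS_PER_DAY / SAVING_PER_RUN
    print(f"{break_even_runs:.3e} runs")   # 8.640e+10, matching the figure above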
Took me a bit, but my take here is -- satire by someone who's actually completely given up on the idea of making software better? Where you make fun of a goal because, deep down, you believe that you'll never be good enough to attain it? That's the only read that makes this make sense to me.
Is it really about being "good enough"? With the crazy amount of interdependency between applications and system software, is it even possible for a single person to solve the issues enumerated by the author? Sure, this seems to have been written in the 90s, but even back then that seemed impossible.
Even luminaries like Alan Kay have been trying (with projects like Viewpoints Research Institute) to make computer software "simpler" (for lack of a better word), but is it really feasible when the whole industry is moving in the complete opposite direction?
A lot of people can make a fast and simple enough operating system. A lot did back in the day at https://osdev.org/, and some probably still do. The problem is "drawing the rest of the owl": the apps.
Honestly, the rest of the owl is drivers, not applications. The favorite sports of programmers are rewriting things from scratch and porting DOOM. Not to mention that we have virtualization concepts built into our modern architectures anyway. If your OS is good and works on their hardware, people will get software working on it.
The problem is that it won't work on their hardware without an army of people writing drivers. Even Linux has driver issues despite having huge numbers of people working on it. Arguably that's a bit self inflicted because they insist on having everything in-tree, but still.
instant on to command prompt
computers get a floppy disk drive: boot from floppy disk, wait for command prompt
computers get a hard disk and can store more stuff like startup scripts: boot hard disk, startup script runs, wait longer for command prompt
computers get graphical operating systems: startup screen, wait for desktop to load, wait even longer to open a command prompt program
computers can be multi-user: startup screen, wait for desktop to load, log in, wait even a bit longer to open a command prompt program
computers and the internet are ubiquitous: startup screen, wait for desktop to load, log in, get distracted by the web browser, forget to open the command prompt program
tiny devices are everywhere: the battery in my phone is dead.
cf https://en.wikipedia.org/wiki/Futurist_cooking or http://www.designhistory.org/Avant_Garde_pages/Futurism.html
Some of the stuff about SW design is true. Some design criteria really are dogmatic, for instance, losing connection with the end value of the product.
Also, the top futurist programming priorities look quite OK and like an ideal to achieve.