> The word modern has somehow become associated with “something better”…
Well, that's because modern typically is better.
Humans don't like problems, issues, inconveniences, accidents, or flaws. So many billions of us are constantly engaged in the enterprise of improving things. Or trying to, at least.
Of course, this requires trying new approaches to create "modern" things. Trying new approaches is the definition of experimentation. And experimentation produces failures, usually more often than successes. So many modern things are in fact worse than the old things. But some new things are actually good, and these good things tend to rise in popularity and last.
What's tricky is that even these modern+good things often seem bad at first because, well, they are kind of bad at first. They're too new to have had years of refinement and innovation. As the saying goes, when you invent the ship, you invent the shipwreck. But this doesn't mean you throw away ships. It means you keep iterating on ships to make them better than previous generations.
This is the process of modernization: Experimenting with a smorgasbord of new stuff, most of which sucks, but some of which is good, and yet still requires years of improvement to become great.
The good things from the past that the author is praising were themselves modern at some point. Today they're just further along in the process of being refined and improved.
This is a helpful analysis, but it's also worth mentioning that new things can be created and adopted for reasons other than our long-term benefit.
My assumptions whenever I see a program that advertises itself as modern:
1. It will have more dependencies than lines of code. Opening a source file to see how it works will reveal nothing, because there's nothing there.
2. It will have at least one annoying feature that can't be turned off, like autosaving the file every 10ms, which will completely tank performance on a network filesystem or any other storage slightly slower than the developer's top-end workstation.
3. It will have at least one hard-coded safeguard to save you from yourself, like completely refusing to save files to /tmp, because then you might lose work.
You will run into limitation #3 while trying to work around problem #2, at which point you will discover development philosophy #1.
And when a UI is described as "modern", you can be pretty sure that you're going to be cursing it if you actually use the thing.
I'm cracking up reading this article because in Architecture (buildings, not software) the word "modern" is basically used incorrectly as a rule. The "Modern" period started like 100 years ago, and yet it has somehow come to mean (tacky) high-contrast, open-layout homes, only suitable for Mediterranean climates.
The author is totally right though, software people are subject to fads probably worse than the general public and second only to teenagers. This trendiness is rarely useful, since we are constantly re-inventing the wheel and more effective solutions tend to take longer to win out.
A-FRICKING-MEN. It's exhausting.
To be fair, a lot of Modernist buildings are not particularly climate-appropriate either. Most of the Frank Lloyd Wright buildings have major leaking issues, for example.
I always subconsciously interpret the word 'modern' as 'in fashion,' probably because the word 'mode' means fashion in my language. I don't associate it with something good or advanced, but rather with something vain and ephemeral.
"Modern" as in "not repeating old mistakes" is good. It does not guarantee against making new mistakes, of course.
"Wow, modern" as in "very new, shiny, and surrounded by hype" is usually bad, because the emotions overshadow a rational view. This leads to incorrect assumptions, excessive optimism, and often to enthusiastic but incorrect application of the new and somehow worthy thing.
Improvement is good; fashion-driven hype, bad.
In European history, the early modern period is roughly 1500 to 1800, and "modern" without qualifiers usually refers to the period starting from ~1800.
In other fields, "modern" may have a specific meaning due to historical baggage, or it can simply refer to the present day. The latter is basically the literal meaning of the word.
The "Modern" Age instead ended at around the 17th or the 18th century.
Naming anything "modern" is pure hubris.
When people say they hate "modern" art they usually mean contemporary art.
I heard a story from the 1990s. Oracle was going around to encourage people to write their domain-specific components as a Data Cartridge. Their evangelist pissed off the owner of a local software company by casually referring to how customers port 'legacy' software to Oracle.
> For example, I have encountered situations where others would look at the tools that I am using, such as Neovim [...] and say things like “Why not use a modern editor like VS Code?” or “Why not use a modern web interface for your email?”.
VS Code will one day be the old legacy editor. Everyone on the modern train will be obliged to change their editor to the new modern one, multiple times. Some people reading this are probably thinking "but VS Code has so much development behind it that it will stay relevant"... this is exactly what users of "premodern" tools are saying now.
Funny that you should say that. Just met with a group of very senior engineers today and two of them said they weren't using VS Code any more. Instead they used:
1. https://www.trycursor.com/
2. https://zed.dev/
I'm still on VS Code myself. Cursor at least is just a fork of VS Code with AI features, so you can still use VS Code extensions, so it's something I might try at some point.
Zed sounds cool (it's fast; the guy who used it said it made him feel like he was programming in the 90s again, and I know exactly what he means), but I love having all the extensions (a quick search finds no Tailwind extension for Zed, for instance, and I'm really loving the Tailwind autocomplete). Might give it a try at some point, but I doubt I'll change to it.
Thing is, I was one of those people who kept jumping to new editors. Every single time it was because the current editor had a serious problem of one flavor or another--and another editor solved that problem.
VS Code is likely to be Good Enough for a long time. Zed might gain some adherents like the guy I met today, but I didn't keep trying new editors because I was "on the modern train," but because every other editor sucked in one way or another. All the vims and emacs variants suck. This shouldn't even be a debate, to be honest, but they do suck.
If VS Code keeps on its current trajectory (i.e., Microsoft doesn't abandon it), it will likely be the Good Enough editor perpetually. It's already 9 years old and it's still the "modern editor" of choice. And when I suggest people try a modern editor, I don't even really mean VS Code, but any editor that's modern. Sublime, Atom, Visual Studio, any of the JetBrains editors...anything modern is better than vim or emacs. But some folks trained their fingers and don't want to change. Nothing you can say will convince them.
As another comment points out, Shakespeare is written in "modern English." It's not about chasing the "modern train" as much as not continuing to use Atom or Neovim when everyone else is using VS Code, and VS Code has become the common IDE of the work environment, is already set up to debug the app everyone is working on, and has all of the extensions everyone needs to be on the same page.
It's still being updated.
"Modern" has several wibbly wobbly definitions in various contexts, but with regards to languages it is actually quite specific:
"denoting the form of a language that is currently used, as opposed to any earlier form", e.g. "modern German".
Yes, Shakespeare is modern English. Deal with it.
With respect to computer applications, I'd support a similar use of the word. There are plenty of languages and applications that are old but still actively used by many people. That means they're modern even if, like Shakespeare, they are outdated in some respects.
It's arguable that computing has evolved so quickly that the line between what is modern and what is not might be set more rigidly, but what year would that be? Many people are still using Fortran because it happens to be really good for certain things. Meanwhile, there are languages and tools that are much more recent that nobody is using, because they were completely superseded by something better. Can something made just a decade ago not be modern, just because it's not currently in use?