I remember being shocked when I saw someone (on HN) use the phrase “running on the bare iron” to mean “in user space of a computer running a multitasking operating system”. They in turn were incredulous that it could mean anything else — apparently an OS is needed, right?
Still, in some ways it’s a good thing. People can write good code without knowing what an ALU is, much less a flip flop.
The bigger problem is that there is little incentive these days to write good code at all; those who do of necessity know more than just their tiny domain.
But I haven’t tried to design a (extremely primitive) chip since I was in college, so there’s plenty for others to laugh at me for too.
In my experience, even just writing and running a C++ program on a modern computer is now considered "extremely low level". Deciding to write a native program at all is seen as almost an extreme thing to do.
Eh, that's just an issue of term reuse being rampant in this industry.
In the context of servers, it's accepted usage to use "baremetal" to mean "running on the physical host's operating system" as opposed to in a VM or container.
But I recall the other meaning from OS class.
It's a domain thing. And one domain is expanding: everyone in the world is becoming a web developer. Actually, it already happened. Throw a dart at a group of software developers and most likely that dart will hit a web developer.
Web developers can write shitty code because most of the time the bottleneck is in the database. You just need to write code that's faster than the database, and you don't need to optimize beyond that.
Now all the optimizations center around pointless refactorings to keep up with the newest technology trend.
IME it's quite rare for the database to be the bottleneck. People just don't know how to use it right. It's a pretty simple optimization to gather web requests into batches, and that can easily 10x your database throughput. At my last job we had architects pushing to do the opposite and turn batch files into single requests! Argh!
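To make the batching point concrete, here is a minimal sketch using the stdlib sqlite3 module as a stand-in for a real server (the table and row counts are made up for illustration). An in-memory SQLite database understates the effect; a networked RDBMS pays a round trip and an fsync per statement, so gathering requests into one transaction helps far more there.

    import sqlite3, time

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id INTEGER, payload TEXT)")
    rows = [(i, "payload-%d" % i) for i in range(100_000)]

    # Naive: one statement and one commit per incoming web request.
    t0 = time.time()
    for row in rows:
        conn.execute("INSERT INTO events VALUES (?, ?)", row)
        conn.commit()
    print("per-request commits:     %.2fs" % (time.time() - t0))

    conn.execute("DELETE FROM events")
    conn.commit()

    # Batched: gather requests briefly, then write them in one transaction,
    # so round trips and commit overhead are amortized across the batch.
    t0 = time.time()
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()
    print("one batched transaction: %.2fs" % (time.time() - t0))

The exact numbers don't matter; the point is that the naive path pays the fixed per-statement costs once per row instead of once per batch.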
Also, if people are using RDS with EBS, their baseline for storage IO is ~500x slower than a modern NVMe disk really is, which warps their perception of how well an RDBMS should scale. Their gp3 SSD storage comes with a 3k IOPS baseline, up to 16k[0]. lol. "SSD storage". The volume size "scales" up to 16TB. You can get a new-condition P5410 for $420 retail: 8 TB/drive with 100x the performance of a maxed-out EBS volume.
Similar misconceptions must exist with application performance. Lambda advertises being able to scale up to "tens of thousands" of concurrent requests[1]. My 6th gen i5 can do that with a few GB of RAM allocated to a single JVM process...
[0] https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/general-...
[1] https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-...
> Web developers can write shitty code because most of the time the bottleneck is in the database.
That is because most of the time web developers are bad at database things; it isn't an intrinsic property of database engines.
A modern database engine on a single AWS VM can readily support 10M writes/sec through disk storage, not even in memory. And run operational queries against that data at the same time. Most web apps don't have requirements anywhere close to that. In practice, you are much more likely to run out of network bandwidth to the database than database throughput if your systems are properly designed. Even the database engines themselves are usually bandwidth bound these days.
Optimization has taken a back-seat now, I'd say. We focus more on developer productivity, and that extra mile optimization isn't deemed necessary if that effort and time can be put toward another project, squeezing out more value from the developer.
I wouldn't call syscall (and strace) knowledge "hardware knowledge". It's mostly OS knowledge. You can go a long way with strace without knowing how a PCI bus works or other hardware details.
Which brings me to another point.
Virtualization ensured you could decouple (to an extent) the behavior of an operating system from the actual hardware used underneath. Therefore, some hardware knowledge is often ignored, but you still need to know about the OS.
With containers, you are decoupling the application from the host operating system! Which means some OS knowledge is often ignored.
That said, abstractions can be leaky. Most of the time, and in the most common scenarios, you can ignore the lower-level details. No one develops a simple website thinking about NUMA or C-state transitions. But if you want to really squeeze every ounce of performance, or if you run very complicated systems, then you still need to peek at the lower levels from time to time. Which means looking at the hypervisor for virtual machines (assuming you can... on public cloud you cannot), or looking at the operating system for containers.
> With containers, you are decoupling the application from the host operating system!
Yes and no. The host’s kernel is still running the show, which catches people by surprise. See, for example, Node (or more accurately, libuv) incorrectly reporting the CPU count because it relies on /proc/cpuinfo, etc.
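A minimal sketch of that particular surprise, for anyone who wants to see it from inside a CPU-limited container (paths assume cgroup v2 with a v1 fallback; exact mounts depend on your runtime):

    import os

    def cgroup_cpu_limit():
        try:
            # cgroup v2: cpu.max holds "<quota> <period>" in microseconds, or "max".
            quota, period = open("/sys/fs/cgroup/cpu.max").read().split()
            return None if quota == "max" else int(quota) / int(period)
        except FileNotFoundError:
            try:
                # cgroup v1 fallback: a quota of -1 means unlimited.
                q = int(open("/sys/fs/cgroup/cpu/cpu.cfs_quota_us").read())
                p = int(open("/sys/fs/cgroup/cpu/cpu.cfs_period_us").read())
                return None if q < 0 else q / p
            except FileNotFoundError:
                return None

    print("os.cpu_count() says:", os.cpu_count())      # the host's CPUs
    print("cgroup quota says:  ", cgroup_cpu_limit())  # the container's allowance (None = no limit)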
> But if you want to really squeeze every ounce of performance
Or if your C-suite orders a massive reduction in cloud cost. My company’s devs were recently told to cut vCPU allocation by 50%. Then, when they did so, they were told to do it again. My team was told to right-size DB instances. In fairness, some of them were grossly overprovisioned. ZIRP was quite a thing.
Unsurprisingly, this has resulted in a lot of problems that were being papered over with massive headroom.
So you downsized databases? I get that you can, to an extent, scale databases horizontally (especially if you don't need JOINs or you just use them for read operations), but autoscaling databases sounds a bit risky.
> With containers, you are decoupling the application from the host operating system! Which means some OS knowledge is often ignored.
Containers are just a fancy "tar" package with a fancy "chroot". They shouldn't ever really have been viewed as decoupled from the host operating system. You're still just installing software on the host.
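To make that claim concrete, a deliberately minimal sketch (needs root; rootfs.tar is a hypothetical filesystem export, e.g. from "docker export"). Everything a real runtime adds on top, namespaces, cgroups, seccomp, overlay filesystems, is about isolation and packaging rather than about escaping the host kernel:

    import os, tarfile

    ROOTFS_TAR = "rootfs.tar"   # assumption: an exported container filesystem
    TARGET = "/tmp/rootfs"

    # The "fancy tar" part: unpack the image's root filesystem somewhere.
    os.makedirs(TARGET, exist_ok=True)
    with tarfile.open(ROOTFS_TAR) as tar:
        tar.extractall(TARGET)

    # The "fancy chroot" part: pivot into it and run its userland.
    # The kernel you end up talking to is still the host's.
    os.chroot(TARGET)
    os.chdir("/")
    os.execv("/bin/sh", ["/bin/sh"])   # assumes the rootfs ships /bin/sh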
Believe it or not, there are these things called books which contain all sorts of knowledge about things like low level C, system calls, assembly, even digital logic. I know because I have several of them sitting on my office shelf thanks to an EE oriented education.
I could have picked up the same set of books and obtained the same knowledge with a more CS focused background. The hardware stuff is harder to pick up compared to software when self learning (for most - I know counter examples). But it’s by no means impossible.
Point being, I don’t agree with the author. If you’re a cloud native engineer and you want to learn the lower levels of abstraction, you can. But the point of cloud native is that you don’t need hardware knowledge and you can focus on your product/users/hobbyist itch instead.
If I want to play with hardware, I have to buy it. I also risk having to replace it when I let the magic smoke out.
Or at least that's always been my intuition about getting started.
One can fund the other though. Besides, if you’re tinkering with hardware, it really is super cheap already. ATmegas are less than a meal. Variety packs of capacitors and resistors on Amazon for a couple of twenties. A soldering iron station for $50. A power supply for $40. It’s the same cost of entry as an entry-level laptop…
It has nothing to do with 8-bit. Deep OS-level knowledge can and should be taught starting from 64-bit (where most OSs run), with maybe a tiny speck of 16-bit, because most BIOS code still runs in 16-bit mode on Intel.
It's about exposure to syscalls, C, assembly, lower level debugging and just making sure people aren't afraid to touch that layer. Same goes with other foundational knowledge like packets and networking protocols
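In that spirit, even a few lines are enough to take the mystery out of that layer. A small sketch, assuming Linux on x86-64 (the syscall number is architecture-specific); running it under strace shows the same call going by:

    import ctypes, os

    libc = ctypes.CDLL(None, use_errno=True)   # the C library the interpreter already uses
    SYS_getpid = 39    # x86-64 Linux only; check asm/unistd_64.h for your architecture

    raw = libc.syscall(SYS_getpid)             # raw syscall, no wrapper
    print("raw syscall(39):", raw)
    print("os.getpid():    ", os.getpid())
    assert raw == os.getpid()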
Are you suggesting that 'Deep OS knowledge', syscalls, C, assembly, packets and networking protocols should be taught in schools?
What I think the author is getting at with '8-bit' is that presenting children with a complete machine that is simple enough to understand from top to bottom gives a foundation that they can build on when they encounter more complex systems later on.
The 32-bit ARMv7-M micro-controllers with Cortex-M4 or Cortex-M7 cores that are available for instance on $10 development boards from ST or from many other companies are conceptually simpler and easier to understand when programmed at the bare-metal level than the 8-bit computers of 40 to 50 years ago.
They are a much better target for teaching children about hardware.
The problem is that there are so many layers between someone's code and the bare metal that optimization often gets completely lost.
The mentality is that computing is cheap and developer time is expensive. Writing an inefficient function that wastes several thousand compute cycles but saves an hour of developer time is given priority in most cases.
This mentality works great when the function is only run a few million times (or much less) over its lifetime. But when it gets added to some popular library, distributed across millions of machines, and used thousands of times each hour on each machine, those wasted cycles can really add up.
What percentage of worldwide compute (with its associated wasted electricity and heat) can be attributed to inefficient code? Spending a few hours optimizing some popular code might actually do more to help the environment than driving an EV for an entire year.
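A rough back-of-the-envelope version of that argument, where every figure is an assumption picked only to show how the multiplication works:

    wasted_cycles  = 5_000       # per call of the inefficient function
    calls_per_hour = 5_000       # per machine
    machines       = 5_000_000   # a popular library gets around
    clock_hz       = 3e9         # ~3 GHz core
    watts_per_core = 10          # very rough figure for a loaded core

    cpu_seconds = wasted_cycles * calls_per_hour * machines * 24 * 365 / clock_hz
    kwh = cpu_seconds * watts_per_core / 3_600_000
    # ~11.6 CPU-years and roughly 1 MWh per year with these made-up numbers.
    print(f"{cpu_seconds:,.0f} CPU-seconds/year, ~{kwh:,.0f} kWh/year")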
I agree. And it's common to every engineering discipline: DO build a prototype that achieves what it is designed for, and IF you're going to deploy it at scale, optimize it. ELSE you don't need to.
Honestly, maybe a majority of modern engineers not being familiar with anything but the pointy tip of the stack is a good thing.
I come very much from the old world - I learned to code on graph paper, as that was how you saved your work, and being able to wield a scope and iron was kinda mandatory for being able to meaningfully use a computer.
As tech grew up, so did I - and while it’s useful to be able to understand what is happening from clicking a button in a gui down to electrons tunnelling through a bandgap, particularly when it comes to debugging the truly arcane, I actually find that the level of abstraction I carry around in my head sometimes gets in the way.
I look at newer techies bouncing around with their lofty pyramids of containerised cloud based abstracted infrastructures, and I almost envy the superficiality of their engagement - I can’t help but look at it and see immense complexity, because I see behind the simple and intuitive userland, and that makes me run for vim and some nice bare metal where I know what my hardware is doing.
Maybe I’m just getting old.
You're probably getting old, but the truth is that modern stacks are stupidly complex.
Layer after layer of abstraction, along with immense bloat. Twenty years ago I ran immensely complex web properties, ones that dwarfed what the average dev deploys today, on the hardware of the time and without all that cruft.
Well, let's take PHP and Node, and use Laravel as an example. Easily 10000x the codebase, for the same result.
It's not efficient, or lean, or performant at all.
But it does do one thing.
Allow people without extensive security, database, and coding experience to push safe code quickly.
You can throw a new grad at Laravel, and they're off to the races.
I liken it to C replacing assembler. A way to abstract.
It is sad though.
> I can’t help but look at it and see immense complexity, because I see behind the simple and intuitive userland, and that makes me run for vim and some nice bare metal where I know what my hardware is doing.
But is it humanly possible to know the details at every level of abstraction? There should be a balance between abstractions and knowing some details beneath some layers of abstraction. What do you think?
Maybe not to an expert level (for some, undoubtedly it is), but it’s eminently possible to have a working knowledge of the stack, from hardware to the nth abstraction.
I’ve recently become interested in how data makes it from {INSERT,SELECT} to the disk, and so set about understanding that. You can read Postgres or MySQL’s source code and use a debugger to understand how it goes through that layer, then strace to follow the base syscalls, then blkparse to watch it go to the device driver. From there, you can experiment with forcing OS page cache drops to see when a “disk read” isn’t really a disk read, and also observe read-ahead behavior. I’ve skipped the device driver layer for now, but beyond that you can also use dd to observe changes to the raw block device. The latter is easier on a simplistic level with a few bytes at a time.
You have to be genuinely interested in this stuff though, because there are precious few companies who are going to pay you to play around like this all day.
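The page-cache part of that experiment fits in a few lines, if anyone wants to try it (Linux only, root needed to drop caches; the path is just a placeholder, any large file will do):

    import os, time

    PATH = "/tmp/bigfile"   # hypothetical; point it at any large file

    def timed_read(path):
        t0 = time.time()
        with open(path, "rb") as f:
            while f.read(1 << 20):   # read in 1 MiB chunks
                pass
        return time.time() - t0

    print("first read:       %.4fs" % timed_read(PATH))
    print("cached re-read:   %.4fs" % timed_read(PATH))

    os.sync()                        # flush dirty pages first
    with open("/proc/sys/vm/drop_caches", "w") as f:
        f.write("3\n")               # drop page cache, dentries and inodes

    print("after cache drop: %.4fs" % timed_read(PATH))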
Why not? It's really not that many principles. To me, it is entities that add complexity by deviating from standards or introducing esoteric nomenclature on top of existing concepts.
Nah, but I hear you - sometimes it's just knowledge bias. Much of the techie scene is dogmatic and bounces, as you say, from one trend to the next. To me, this is why we are in such a low period of innovation.
Yeah, it might look nice and easy. But that's when it works. When it doesn't, they are unable to understand many kinds of problems and have to resort to trial and error.
> I can’t help but look at it and see immense complexity, because I see behind the simple and intuitive userland
An idiot admires complexity, a genius admires simplicity, a physicist tries to make it simple, for an idiot anything the more complicated it is the more he will admire it. If you make something so clusterfucked he can't understand it he's gonna think you're a god cause you made it so complicated nobody can understand it.
Douglas Adams said this thing about how you should never consider how utterly improbable it is that miraculously complex things keep working, because suddenly they won't. I mean, if you woke up each day and pondered the intricacies of cellular mitochondria and your own endocrine system... the anxiety would be crippling.
After 50-odd years of computing, which began with soldering my own from TTL chips and transistors in the 1980s, I find the only way to navigate current levels of abstraction is to join the kids in their kind of optimistic, faith-based wishful thinking, just let go and float above it all.
But every now and again something goes wrong and absolutely nobody but people like you and I can even imagine what it might be.
More terrifying is meeting people with very inflated job titles, in positions of enormous wealth, power and responsibility, whose knowledge of even the most elementary physics and logic is absent. They're actually kind of proud of that, and positively celebrate ignorance of all the "things they don't need to know". Part of me finds that insulting and disrespectful.
I find myself thinking about the respiratory complexes more than I probably should. It’s unbelievable that we function, at the most basic level. Molecular motors? Complexes that can only work in conjunction? Good grief.
When I first started university I had already been tinkering with computers and programming for about 10 years. I started in Computer Science, where I took a Java class from Urs Hölzle, who later went on to be Google's first VP of engineering. It was an amazing class, and I almost scored a 100% on every test, except for the one where I accidentally wrote "<=" as a less-than sign with a line under it and got dinged a point. However at the end of my first year I felt profoundly unsatisfied, like I was just learning superficial tricks to get computers to do what you wanted, but I didn't feel like I really understood what was actually happening.
I switched schools and majors to a Computer Engineering course taught primarily by EE professors, hoping to learn about the lower-level stuff so I didn't feel so ignorant. I learned about logic gates, K-maps, BJT transistor characteristics, N/P-well doping, VLSI, and so forth. All was going swimmingly for me until it came time to take a physics course on crystal structures. At that point I realized that the rabbit hole goes very, very deep -- much deeper than I could ever hope to really fully "understand" from the quantum level all the way up to Java code.
Recognizing that I only had so much time and mental capacity to learn about the whole system, I had to make peace with knowing that I would have to choose a layer in which to increase my expertise and then largely stick to that. For me that started out being architecture-level operating system coding in C and some assembly, but I popped up the stack a smidge to get to C and applied cryptography.
I'm now one of the few people at my company who (IMO) understand operating systems at all. The organization is deathly afraid to do anything with the kernel that's being deployed in their Cloud instances, and most view it as some mystical "black magic" that just works if you don't touch it. Hardly anybody fully understands what a so-called "container" actually is, but they sure know all the API calls to "make the container work" and push container images around repositories. They're often surprised when issues I could see coming with their containers ahead of time actually happen to them. Whenever they run into problems, they'll often open two dozen Stack Overflow tabs and try copying and pasting stuff until something seems to work. People approach me as some sort of mystical oracle who can bestow arcane knowledge of syscalls and whatnot when their Stack Overflow searches turn up dry.
I feel like the pendulum perhaps has swung too far into the layers of abstraction in our universities, but I'm not sure what to do about that. I wonder what will happen as people like me ride off into the sunset.
Your self reflection to go deeper and then resurface to the level you are comfortable expanding your expertise in parallels some of mine. Hailing from a university in my country which didn’t go deep enough in any of the subjects, I was left with this insatiable desire to do something about it. It led me to my Master’s in US where I got (or at least I thought I did) once in a lifetime opportunity to take one of the hardest and most fulfilling classes: Operating Systems. It opened my eyes to the magic I couldn’t understand and gave me the confidence of “I can do anything if I put my mind to it” by building a kernel. I learned enough to realise that it is not something I can do as a day job but still venture into should I need to while working on something. To this day, I thank my past self for making this decision of taking on a task that I felt so insurmountable. Whenever I come across a hard problem now which seems insurmountable, the confidence from that experience is what keeps me going.
Simon Wardley would like a word… in his model this is the natural order of things. As technology matures and standardizes, a new generation of tools is built on top of new abstractions, and the details of that tech no longer need to be understood in order to use it.
Subjects and skills that were requisite basics a generation* ago become advanced, under-the-hood topics for specialists. The next generation of people need different skills in the day to day.
This post is a great account of what that feels like from the inside, from the perspective of the newer generation learning these (now) ‘advanced’ topics.
(Funnily enough, I don’t (yet) see anyone commenting "real men write assembler" - a skill that has long ago moved from required by all developers to super-specialized and not particularly useful to most people.)
*I am using the word generation in the broadest sense as it relates to cycles of technology
Whether or not this state of affairs is "natural", I do not think it is "good".
Civil engineers still need to understand calculus and how to analyze structural integrity even though they can rely on modern computer modeling to do the heavy lifting.
All engineers are expected to have some requisite level of knowledge and skill. Only in software do we accept engineers having the absolute bare minimum knowledge and skill to complete their specific job.
Not that we shouldn't use modern tools, but having a generation of developers unable to do anything outside their chosen layer of abstraction is a sad state of affairs.
> Only in software do we accept engineers having the absolute bare minimum knowledge and skill to complete their specific job.
You can require that your frontend engineer absolutely must have good assembly knowledge but you'll pay more for them and fall behind your competitors. You can require that your DBA knows how to centre text with CSS, but you'll pay more for them and fall behind your competitors. You can require that the people managing the data centre understand the internals of the transformer architecture or that the data scientists fine tuning it understand the power requirements and layout of the nodes and how that applies to the specific data centre, you'll just pay more for someone who understands both.
Everyone requires the bare minimum knowledge to accomplish their job; that's pretty much the definition of "require" and "minimum", limited by your definition of someone's job.
"software" is such a ludicrously broad topic that you may as well bemoan that the person who specifies the complex mix of your concrete doesn't understand how the HVAC system works because it's all "physical stuff".
> but having a generation of developers unable to do anything outside their chosen layer of abstraction is a sad state of affairs.
Whether it's sad depends on whether they're better in their narrower field, surely. It's great if we can have a system where the genius at mixing concrete to the required specs doesn't need to know the airflow requirements of the highrise because someone else does, compared to requiring a large group of people who all know everything.
> but having a generation of developers unable to do anything outside their chosen layer of abstraction is a sad state of affairs.
This is the normal state of affairs, and is really the only reason we can build meaningful software systems. Software is much too complicated; understanding even one layer of abstraction can be a multi-decade journey. The important thing, though, is that when the abstractions are leaky (which they always are), the leakiness follows a good learning curve. This is not true for cloud, though.
Likewise, how much must a software engineer understand about how hardware works before they are able to do a good job?
At some point there are diminishing returns on someone being truly "full stack".
> All engineers are expected to have some requisite level of knowledge and skill. Only in software do we accept engineers having the absolute bare minimum knowledge and skill to complete their specific job.
Most software engineers just produce websites and nothing that impacts the safety of other humans. Other types of engineers have to ensure people do not die.
> All engineers are expected to have some requisite level of knowledge and skill. Only in software do we accept engineers having the absolute bare minimum knowledge and skill to complete their specific job.
If that was true, then there would be opportunities for entry into professional software engineering careers. Because the only opportunities there are for software engineering jobs are opportunities for "senior" software engineers. Which entails much more than the absolute bare minimum knowledge and skill.
So there's some inconsistency going on within the mindset of people who measure competence and fitness in engineering, in the broadest sense of the concept of engineering.
Maybe engineering itself, then, isn't even remotely the noble profession it is widely believed to be? Maybe engineers and even scientists aren't that really intelligent? Or intelligent at all? Maybe science and mathematics should be abandoned in favor of more promising pursuits?
> Not that we shouldn't use modern tools, but having a generation of developers unable to do anything outside their chosen layer of abstraction is a sad state of affairs.
Funnily enough my day job is writing software for structural engineers (and I am a licensed engineer). Your comments are absolutely on point. One of the most important discussions I have with senior engineers is "how will we train tomorrow’s engineers, now that the computer does so much work?"
40 years ago, the junior engineers were the calculators, using methods like moment distribution, portal frame, etc… today the computer does the calculation using the finite element method. Engineers coming straight out of school are plunged right into higher level work right away - the type of work that junior engineers a couple of generations ago might not have seen for 5-10 years.
My first career development discussion with a senior engineer was "Just work for 10-15 years, then you'll know what you need to be a good engineer."
I have discussed this under the theme of Generation Gap (https://www.youtube.com/watch?v=5gqz2AeqkaQ&t=147s, 2:27 - 8:58), and have a similar conclusion to you: what at first appear to be different generational approaches are actually different facets of a well-rounded, senior technical skill set. Maybe the kids are just learning things in a different order than we did?
Pat Gelsinger et al's discussion of the demise of the tall, thin designer is another interesting perspective (https://www.researchgate.net/profile/Avinoam-Kolodny/publica...)
Lots of HN commenters are younger generation folks, and lots of them have poor fundamentals. They will certainly deny the need for wider scope of knowledge, as they do not have it themselves.
While I mostly agree, I think one thing to keep in mind is that we still need people somewhere who know how to do that. e.g. FAANG might have data center people and sysadmins that know the hardware... we (they? not sure) just need to ensure that in the future, we still have _some_ people that possess that knowledge.
I do not think it is requisite that _all_ developers have that knowledge.
(And sometimes these mainstream, practical, everyday skills stick around in funny ways: https://www.hillelwayne.com/post/linked-lists/)
Yes, absolutely - skills move from mainstream to niche, but are still required! For example, a much smaller proportion of the population knows how to farm today than 100 years ago, but it's still important :)