Like many other commenters (of a certain age?), I too have this unsatisfied feeling about a particular kind of modern software development. The kind where you never really dig down and design anything, you just plumb a bunch of stuff together with best practices you find on stack overflow.
Many commenters are attributing this problem to the modern high-level tools we now have access to. But I don't think this is the crux of the issue. You can face the same issue (you're plumbing things, not designing a system) whether you are working with low level or high level components.
Heck, you could be working on a hardware circuit, but if the only thing you had to do was make sure the right wires, resistors, capacitors, etc. were in place between the chips, you're still just doing plumbing work.
To me, one of the most satisfying things about programming is when you can build something great by starting with a concept for your lower-level primitives, your tools, and then work up through the higher levels of design, ultimately having the pieces you designed fit together to form something useful to the world.
This building-things-to-build-things idea is even satisfying in other areas. Just gluing a bunch of wood together to make a piece of furniture is fine, but building your own jigs and tools to be able to do the kind of cuts that enable the end design you envision is way more satisfying, and opens up the design space considerably.
If I had to lament anything (and perhaps this is what's most in alignment with the post) it's that most of the high-level primitives you touch these days tend to be sprawling, buggy, not focused, and just generally not of high quality or performance. It's possible for high-level primitives to avoid these pitfalls (e.g. Sqlite, the canonical example) but it does tend to be the exception.
I think there is still plenty of interesting and satisfying software engineering work to be done when starting with high-level libraries and tools. You just need to think about how to use their properties and guarantees (along with maybe some stuff you build yourself!) to enable the design of something more than just the (naively-plumbed) sum of the parts.
Challenge yourself to use stdlib only, for a while or for the foreseeable future.
It sounds unrealistic and I'm going to get flamed, but hear me out. It works. Most of my development these days is in reasonably complete languages like Go, Rust, Zig, and various scripting languages, so your mileage may vary if you're writing in something like Rune, Hare, or Carbon that is still taking shape.
If you think I'm crazy but have a lingering skepticism, challenge yourself to spend one day, one week, or one month using the stdlib only. If that is unrealistic in your setting, a compromise could be to only use libraries that you created yourself. You won't come out of it as an enlightened samurai monk with a celebrity HN presence, but you will gain an immense sense of scrutiny that didn't exist before when you take a look at the libraries you want to use after that.
For those who think this is just 100% bull, consider how we survived before Google, YouTube, and Stack Exchange existed.
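To give a taste before you commit: here's roughly what the stdlib-only style looks like for a tiny ETL task in Python. The function and field names are invented for illustration, but note there's nothing outside the standard library, no pandas, no third-party anything:

```python
import csv
import io
import json
from collections import defaultdict

def totals_by_region(csv_text: str) -> str:
    """Aggregate a 'region,amount' CSV into a JSON object, stdlib only."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] += float(row["amount"])
    # Sort keys for deterministic output.
    return json.dumps(dict(sorted(totals.items())))

data = "region,amount\nwest,10.5\neast,3.0\nwest,2.5\n"
print(totals_by_region(data))  # {"east": 3.0, "west": 13.0}
```

Not glamorous, but you understand every line, and there's no dependency tree to audit.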
I'm not gonna flame you, but I will note that, as someone who gets paid to use my judgement to decide on the optimal trade-off between quality, time spent on the project, and its future maintainability... I feel like all three will suffer quite a bit with this self-imposed "handicap".
I am thinking very hard about the CAP theorem while working on a billing system for a cloud API right now, and it is an absolute joy. No, it won't deploy in version 1, 2, or 3, but it might in version 4, and if it does, it will be glorious.
You can find cool technical problems anywhere as long as you are willing to take the path less traveled.
Challenge yourself to not even use stdlib once in awhile. There are some interesting insights to glean about how much room for improvement we have even at the very bottom.
https://youtu.be/BrBb0mqoIAc
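To make that concrete: even something as "solved" as integer parsing hides real design decisions (sign handling, validation, error policy). A toy Python sketch that deliberately avoids int(), purely as an exercise:

```python
def parse_int(s: str) -> int:
    """Parse a decimal integer without calling int().
    A toy reimplementation to show how much machinery hides in the stdlib."""
    if not s:
        raise ValueError("empty string")
    sign, digits = 1, s
    if s[0] in "+-":
        sign, digits = (-1 if s[0] == "-" else 1), s[1:]
    if not digits:
        raise ValueError("no digits")
    value = 0
    for ch in digits:
        d = ord(ch) - ord("0")  # digit value from its code point
        if not 0 <= d <= 9:
            raise ValueError(f"bad digit: {ch!r}")
        value = value * 10 + d
    return sign * value

print(parse_int("-1024"))  # -1024
```

Once you've written it, questions like "what should happen on overflow in a fixed-width language?" stop being abstract.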
I wrote entire systems with Turbo Pascal, and then Delphi, out of the box. Many others did the same with Visual Basic 6, or the Microsoft Office 2000 suite with VBA, before the .NET infection took hold and Microsoft lost its mind.
A side effect of this is that almost all job position advertisements are disgusting to look at. They are all about this kind of mindless glue code programming, but wrapped in marketing speak to make it look like "you get to use awesome bleeding edge latest technologies" when in reality it is "you have to figure out how to configure 10 different things to work together to sort of kind of produce the intended behavior".
In the last 3 years I don't think I saw even once a job description on any popular job board advertising that you will do some actually interesting programming. The only ones I've seen have been on Twitter, but from companies doing things in areas I have no experience in (e.g. game engine programming).
I suspect that this is why leetcode tests are so prevalent.
They basically test for distance from school, and not much else, as the algorithms aren’t really reflective of real-world work, which, as the article states, is really fairly simple “glue,” binding together prefab sections.
If someone is good at, and energized by, writing “from scratch,” and "learning the whole system," then they are actually not what you want. You want people that are good at rote, can learn fairly shallow APIs quickly, and are incurious as to why things work.
I have exactly the same problem... got sucked into "the cloud" 4-5 years ago at my current employer. Now I desperately want to get another job, something with preferably no or minimal cloud involved. The trouble is that the jobs that sound interesting don't reflect my expertise...
Now should I try to start from 0 with a junior salary? Does not make sense with a family.
I don't really have an idea yet... but I urgently need to change something, because my current work is killing everything I ever felt for software development.
Or, just don't work in webapps. Get into embedded programming. Or join a games studio.
I have a friend who is writing code to run on a sort of exoskeleton meant to benefit disabled people and help them walk. He has never in his life "deployed to the cloud" and wouldn't have the foggiest idea of how to do it.
That sounds nice, but how do you get into embedded if your experience is 5 years, 10 years, or more in, say, distributed systems/cloud/web programming/etc.?
Yes. Whenever I work on a "serverless" app, I spend more time messing around with IaC tools like Terraform than I do writing actual application code. It's sad.
>Heck, you could be working on a hardware circuit, but if the only thing you had to do was make sure the right wires, resistors, capacitors, etc. were in place between the chips, you're still just doing plumbing work.
A lot of modern hardware design feels like that: take a microcontroller and some peripheral chips, connect them together, and copy the datasheets for whatever support passives they need.
I came to this same conclusion last week when I started writing my own webgpu renderer. I went into it with no knowledge of graphics and without using libraries. Having to create my own generic abstractions for pipelines, passes and buffers has been a massive creative and educational experience.
I haven't felt this satisfaction from programming in years from my day job.
As some other commenters have said, this is really a self-inflicted problem. The author has chosen to do an EDA task which is very manageable on a laptop via a convoluted stack of cloud services - possibly just to illustrate a point. But even if this all worked smoothly, the fact that it is far removed from "software engineering" has more to do with the fact that it's a data analysis project. If it were about writing firmware for audio hardware it would look very different.
Nitpick: ! for shell commands is an IPython feature, not Jupyter. It doesn't work with other kernels.
It's also well-known that pandas is memory-inefficient, and Dask would probably do the memory estimation they described for them (after the author dismissed it). They're really just showing they don't understand these tools that well.
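To be fair, a ballpark memory estimate doesn't need Dask or pandas at all. A crude stdlib-only sketch; it undercounts shared internals and is not what pandas' memory_usage reports, just an illustration of how cheap a rough estimate is:

```python
import sys

def deep_sizeof(obj, _seen=None) -> int:
    """Rough recursive memory estimate in bytes: walks containers,
    counting each distinct object once via its id."""
    if _seen is None:
        _seen = set()
    if id(obj) in _seen:
        return 0
    _seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_sizeof(k, _seen) + deep_sizeof(v, _seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_sizeof(x, _seen) for x in obj)
    return size

# Invented sample data standing in for "rows of a dataset".
rows = [{"user": f"u{i}", "amount": i * 1.5} for i in range(1000)]
print(f"~{deep_sizeof(rows) // 1024} KiB")
```

Multiply a per-row figure like this by your row count and you know whether the job fits on your laptop before reaching for a cluster.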
Good article, great points. The Knuth quote on lack of creativity is right on the money. It's why I've been drifting away from hands-on programming even though I'd rather not. I still love programming as much as I ever did as a teenager. Late at night when everyone is asleep and I can work on my own code, it's as much joy as it ever was. What's different is that at $DAYJOB back in the 90s and 00s it used to be just as fun all day.
Nowadays when it's just gluing frameworks together and configuring AWS services... it doesn't really feel any different intellectually than cleaning toilets. Sure the pay is better but as a creative challenge they're pretty much on par.
> Nowadays when it's just gluing frameworks together and configuring AWS services... it doesn't really feel any different intellectually than cleaning toilets.
Comparing your six figure white collar job to basic janitorial work is pretty damn cringe and pretty objectively untrue.
> Comparing your six figure white collar job to basic janitorial work is pretty damn cringe and pretty objectively untrue.
I grew up doing hard farm/ranch labor outside in 95-105 degree TX heat and humidity. I agree with the ancestor comment that manual labor is a hell of a lot more satisfying and stimulating than gluing together AWS services with IAM/RAM snippets from stack overflow and updating some design doc about it. If it paid adequately I might do manual labor in the day and solve actually challenging and fulfilling technical problems at night. Programmers don't get to program much anymore :(
You're right. Cleaning toilets is at least a laborious task.
Your standard CRUD applications and web services are largely just a rigamarole of reciting the right incantation and duct taping bits together. It's immensely non-stimulating work when done properly.
This isn't an insult by any means. It's a testament to the triumphs of decades of engineering efforts to turn the process of orchestrating extremely complex electronic systems spanning continents into a largely trivial task for most projects.
I find it strange to see this sentiment... how far down the stack do you go before it stops being "configuration" to you?
Let's imagine building a video chat:
Are you going to build an OS from scratch?
Network protocols?
Encryption?
The rendering engine for it?
Compression algos?
Just how much of this do you consider "not configuration"? I think this all sounds very much like "old man yells at cloud" pining for a past that never actually existed.
What scares me about ChatGPT is less so that I’ll lose my job (though it’s possible), but more so that I’ll be using language models to work at higher and higher levels of abstraction doing mainly configuration tweaking. Some of the particular pain points expressed in the article should be removed with AI in the loop development, but it’s another step away from “real programming”, which is what attracted most of us to this field. Yes, creating things is the end goal, but I’m terrified that I won’t be able to extract nearly as much joy when my job becomes largely prompting GPTn to swap out frameworks and UI paradigms for me like magic.
How much joy do we really get from day-to-day work if we are really honest? I am using ChatGPT to help me get things done (Node.js programming) for my startup. It's getting me closer to having this client project done so I will have more time for my own internal AI thing I am adding to the main service.
No one is stopping you from writing 6502 assembly code in your spare time. That's actually still a somewhat popular hobby.
I already started using ChatGPT to solve problems because I can’t be arsed to read through ten vendors’ worth of documentation. It wrote me a fairly complete and accurate chunk of code the other day to solve a problem.
I've found it's often like pairing with a junior developer who is familiar with whatever problem you're describing and types insanely fast, and that's without learning too much how best to prompt it. A recent discovery was that you can ask it what problems may exist in its code, then ask it to fix them.
It's already like this. My first job in the 90s I wrote our own linked list classes, a logging framework and a persistence layer. Now it feels like I write CSS and YAML all day.
You still get to do something as close to the metal as writing raw CSS? I'm reliably told that's cavalier and you should be writing something that compiles to CSS.
By the way, the HiSOFT that created DevPac 4 (assembler/debugger) for the ZX Spectrum is still around. According to their about page they now build websites?!? https://www.hisoft.co.uk/
using a software framework like django for "rapid application development" gives me a feeling closer to writing configuration files rather than actually "programming" (where "programming" is writing hardcore algorithms).
but don't get me wrong, I liked doing that, I got paid to do it. But let's call it for what it is: that python code (django app) was really django framework config.
this is a similar phenomenon but worse, at least I could look around all of the actual code of the program for which I was writing configuration as code.
then, the thing about hardcore algorithms is that they need to be written once, and then everybody can use them. this is a giant problem. as I think about this, such hardcore algorithms are digital artifacts, so they are subject to the same problems all other digital artifacts (media, videos) are. a problem also known as software piracy. but software has been about composing proven algorithms together since day 1. the problem by this point is socio-economic. not technical.
I see the whole debate around "who will pay for critical infrastructure software" as another instance of "how should artists make money in the digital era".
But this comment (which I'm editing) is already at 0 points. somebody doesn't like what I'm trying to say, but I knew this already.
Well, if you were writing abstract pseudo-code, then maybe. In practice, the same algorithms are often reimplemented multiple times.
> and then everybody can use them
ah, if only that were true... implementations are often not that flexible, nor are programmers that keen on utilizing existing implementations.
> this is a giant problem
Problem? To the extent it's true - it's a boon, not a problem. Imagine if whenever you made a chair, suddenly everyone could get a chair without taking your own chair or spending any time and resources on chair construction. It's a miracle!
> a problem also known as software piracy.
1. Piracy is when people on ships with guns rob other ships. Arrgh, matey!
2. Sharing and copying software or other media is a good thing, not a problem.
PS - I haven't downvoted you. Point taken about django "config-programming".
well sure, I completely agree that it's indeed a boon. a HUGE boon.
the problem I describe is not a technical one, but a social one. and it's only a problem due to the current way society works. It's only a problem for some people (e.g. me); those who are well satisfied by this "status quo" don't see any issue beyond lacking enforcement of IP 'rights' and the need for better DRM, copy protection, and other things like that.
you're focusing on the technical aspects (plus you seem to be deliberately using the wrong definition of piracy).
I'm talking about the social economic aspects: I'm saying that this boon brought about by digital technology is only benefiting a select few. I tend to think about this 'boon' as potential that we collectively seem to be choosing to forego; I think it's my life's mission to do everything I can to avoid this "foregoing" of the great potential unleashed by digital technology; I feel like I'm swimming against the current most of the time.
I think you nailed it with configuration vs programming. modern software development is configuration. notice that "development" is not programming either. so this trend has been going on for a long time
If you feel like "modern programming is a boring configuration mess,"
then ask yourself - what have you done in order to have an interesting job, projects, challenges, etc.?
I mean, if you decided at some point that $big_salary for throwing JSONs via REST from a CRUD app is what you want to do until you pay your loans (e.g. a decade), then it's fine, but don't be shocked that you aren't doing bleeding-edge R&D at some fancy place.
Like, what prevents you from putting in the effort for a year? Two? And switching from X to Y?
Capability trap. Not everyone is capable of effectively working 2 jobs at once, and generally speaking, no matter how big that $big_salary is, you aren't going to have more money or time.
Also there are lots of ways for a company to say they are doing interesting/meaningful things or whatever and effectively not be. And no matter how great the work you may be doing, you don't want to be doing it if it doesn't pay enough.
Every project I worked on was JS/Node hell. Then I decided to switch to Elixir/Erlang and now I’m much happier. Of course, there are uninteresting or badly engineered projects but overall it’s much better.
Do architects reinvent the I-beam every time they design a new building? No, of course not.
The reason society works is because you can reuse abstractions that other people have already invented. It allows you to scale. Without economies of scale, you end up with Baumol's cost disease, which is extremely obvious in the US in industries like child and elderly care.
We don't want most software devs to be doing anything besides gluing stuff together. If they weren't, we really screwed up somewhere.
I like this allegory because it puts the complexity into a visual mindset and brings to light an interesting question about the nature of our abstractions. For example:
Are our abstractions I-beams or Pre-Fabs[0]?
We all know that the rise of pre-fabs is, at its heart, the story of cheap developments all lazily (and hastily) thrown together in arrangements that are of low quality, mid-to-low beauty and do not last for very long.
Skyscrapers of the early 20th century stand today (with I-beams) and are considered by many to be beautiful, maintainable, etc.
People want pre-fabs; since they're cheap, the economy will always be in the pre-fab.
But pre-fabs have a limited shelf life, and reconstruction is more expensive than spending a little extra up front.
I beams are both an individual abstraction and part of the greater abstraction.
Your catalog of beams, bolts, brackets, weld patterns, rebar, piles, and concrete pour standards is a set of abstractions over extremely difficult subfields of materials engineering and structural engineering.
They exist so that your engineer can focus on building the structure using parts and resources with known, standardised behavior under the conditions the building will be put in.
Of course you'll see engineers break away from these abstractions when they need to for a given structure, but those abstractions do exist and are commonly used so that a given structural engineer doesn't also need a PhD in materials engineering and countless other specialized fields.
Would you rather do some library gluing, or reinvent a thousand wheels with every project? The latter is neat the first few times. Whatever your preference, your value as an engineer is much higher if you can glue. Imagine if a carpentry workshop gives a carpenter a fully-fledged set of industrial power tools, but the carpenter insists she can recreate all the other tools with just her whittling knife because it's in the 'true spirit of carpentry'.
I'd rather re-invent some wheels. The problem with VendorOps as I see it is that quite often the vendored "solutions" aren't solutions: they do not meet the requirements! Yet … they sort of get like 20% of the way there, so they get adopted nonetheless, and the devs toil away on trying to get glue code to push it the remaining 80% of the way.
But if we had a system that we owned, then it could be adjusted to fit the requirements, elegantly. But we don't, so we can't.
The other problem is the "Ops" part: vendor owned systems are opaque AF, and when something goes wrong, impossible to debug. Then you become a support ticket monkey, praying you can convince the powers that be on the other end that a. it is truly their stuff that's broken, not yours and b. we pay for it, so yes, you should support it.
When a. or b. fails, then you end up writing yet more glue code to try to work around the bugs and outages that your vendor just doesn't give a shit about.
We got random failures on our API gateway to lambda connections, and the answer we got back from the support agent was something like “automatically retrying on failure is industry best practice”.
I just wanted to shout at them to fix their damn system, but of course we ended up implementing retries instead…
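For anyone who gets the same answer from their vendor: the retry we grudgingly wrote looks roughly like this, a generic sketch with exponential backoff and jitter. The flaky function below is a stand-in for the real API call, not anything from our actual client:

```python
import random
import time

def with_retries(fn, attempts=4, base_delay=0.5,
                 retryable=(TimeoutError, ConnectionError)):
    """Call fn(), retrying transient failures with exponential backoff
    plus jitter. Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except retryable:
            if attempt == attempts - 1:
                raise
            # 0.5x-1.5x jitter avoids synchronized retry stampedes.
            time.sleep(base_delay * 2 ** attempt * random.uniform(0.5, 1.5))

# Simulated flaky endpoint: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient")
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # prints "ok"
```

"Industry best practice," sure, but it's still us papering over their gateway.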
> Would you rather do some library gluing, or reinvent a thousand wheels with every project?
Wheels. Absolutely wheels. Library gluing is what gets us garbage like Electron that needs to die in fire.
Imagine if a carpentry workshop gives a carpenter a bunch of IKEA kits and tries to conflate wanting to make furniture that isn't cost-cut prefabricated crap with insisting she can recreate all the other tools with just her whittling knife because it's in the 'true spirit of carpentry'. (Honestly, if anything, comparing Electron to IKEA is an insult to IKEA - there are cases where using IKEA is actually reasonable, they just aren't actually carpentry.)
(You use libraries when it makes sense to use libraries, just like you use 2x4s when it makes sense to use 2x4s. Sometimes you can make the whole thing out of 2x4s, just like you can make a whole program out of:
Ideally a healthy blend of both. Wheels where it relates to your company's core competencies or where there's a gap in the market. Glue for everything else (you don't need to invent an infrastructure provisioning solution unless you're an infrastructure provisioning company - there are plenty of mature solutions). Other places, like application libraries, it might make sense.
Need a hashtable? Write one.
You won't come out with a celebrity HN presence, but you may gain enlightened samurai monk status.
All before Google, YouTube, and Stack Exchange.
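And it's a genuinely instructive exercise. A minimal open-addressing hash table in Python, as a sketch only: linear probing, no deletion, naive growth policy:

```python
class HashTable:
    """Tiny open-addressing hash table with linear probing.
    Educational sketch: no deletion, grows at 2/3 load."""

    def __init__(self):
        self._slots = [None] * 8  # each slot: None or (key, value)
        self._count = 0

    def _probe(self, slots, key):
        # Scan from the hash position until we find the key or a hole.
        i = hash(key) % len(slots)
        while slots[i] is not None and slots[i][0] != key:
            i = (i + 1) % len(slots)
        return i

    def _grow(self):
        old = self._slots
        self._slots = [None] * (len(old) * 2)
        for entry in old:
            if entry is not None:
                self._slots[self._probe(self._slots, entry[0])] = entry

    def __setitem__(self, key, value):
        if self._count * 3 >= len(self._slots) * 2:
            self._grow()
        i = self._probe(self._slots, key)
        if self._slots[i] is None:
            self._count += 1
        self._slots[i] = (key, value)

    def __getitem__(self, key):
        entry = self._slots[self._probe(self._slots, key)]
        if entry is None:
            raise KeyError(key)
        return entry[1]

t = HashTable()
for word in ["need", "a", "hashtable", "write", "one"]:
    t[word] = len(word)
print(t["hashtable"])  # 9
```

You won't beat dict, but you'll never again wonder what load factors and probe sequences are for.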
> I have a friend who is writing code to run on a sort of exoskeleton meant to benefit disabled people and help them walk. He has never in his life "deployed to the cloud" and wouldn't have the foggiest idea of how to do it.
Now you have a cloud component.
Lisp can enable fantastic development speed, allowing you to build your own primitives. Racket's ecosystem is all high quality, too.
I believe Julia has a less buggy ecosystem, allowing you to pipe together natively written ML things, as opposed to Python's dumpster-fire ecosystem.
Love the Julia language. But its ecosystem is definitely less complete, less documented, and arguably more buggy than the Python equivalents.
This is not a failing on Julia's part; the Python ecosystem is 50x bigger and backed by several of the largest IT/ML firms on the planet.
People who are really serious about software should make their own hardware. - Alan Kay
... via https://github.com/globalcitizen/taoup
https://wesmckinney.com/blog/apache-arrow-pandas-internals/
There is a big difference: one is cleaning shit, the other is creating more shit :)
using a software framework like django for "rapid application development" gives me a feeling closer to writing configuration files rather than actually "programming" (where "programming" is writing hardcore algorithms).
but don't get me wrong, I liked doing that, I got paid to do it. But let's call it for what it is: that python code (django app) was really django framework config.
this is a simlar phenomenon but worse, at least I could look around all of the actual code of the program for which I was writing configuration as code.
then, the thing about hardcore algorithms is that they need to be written once, and then everybody can use them. this is a giant problem. as I think about this, such hardcore algorithms are digital artifacts, so they are subject to the same problems all other digital artifacts (media, videos) are. a problem also known as software piracy. but software has been about composing proven algorithms together since day 1. the problem by this point is socio-economic. not technical.
I see the whole debate around "who will pay for critical infrastructure software" as another instance of "how should artists make money in the digital era".
But this comment (which I'm editing) is already at 0 points. somebody doesn't like what I'm trying to say, but I knew this already.
Well, if you were writing abstract pseudo-code, then maybe. In practice, the same algorithms are often reimplemented multiple times.
> and then everybody can use them
ah, if only that were true... implementations are often not that flexible, nor are programmers that keen on utilizing existing implementations.
> this is a giant problem
Problem? To the extent it's true - it's a boon, not a problem. Image if whenever you made a chair - suddenly everyone could get a chair without taking your own chair or spending any time and resources on chair construction. It's a miracle!
> a problem also known as software piracy.
1. Piracy is when people on ships with guns rob other ships. Arrgh, matey!
2. Sharing and copying software or other media is a good thing, not a problem.
PS - I haven't downvoted you. Point taken about django "config-programming".
the problem I describe is not a technical one, but a social one. and it's only a problem due to the current way society works. It's only a problem for some people (e.g me); them who are well satisfied by this "status quo" don't see any issue beyond lacking enforcement of IP 'rights' and the need for better DRM, and copy protections, and other things like that.
you're focusing on the technical aspects (plus you seem to be deliberately using the wrong definition of piracy).
I'm talking about the socio-economic aspects: I'm saying that this boon brought about by digital technology is only benefiting a select few. I tend to think of this 'boon' as potential that we collectively seem to be choosing to forego; I think it's my life's mission to do everything I can to avoid this foregoing of the great potential unleashed by digital technology; I feel like I'm swimming against the current most of the time.
The problem is that the tools and frameworks we use have such bad configuration.
That says something.
then ask yourself - what have you done in order to have an interesting job, projects, challenges, etc.?
I mean, if you decided at some point that $big_salary for throwing JSONs via REST from a CRUD app
is what you want to do until you pay off your loans (e.g. a decade), then that's fine, but don't be shocked that you aren't doing bleeding-edge R&D at some fancy place.
Like, what prevents you from putting in the effort for a year? two? and switching from X to Y?
Also, there are lots of ways for a company to say they are doing interesting/meaningful things or whatever and effectively not be. And no matter how great the work you may be doing, you don't want to be doing it if it doesn't pay enough.
> then ask yourself - what have you done in order to have an interesting job, projects, challenges, etc.?
These two things don't connect to each other at all.
I don't think it only takes a year to switch to something interesting. Any R&D lab wants you to have a PhD with published papers.
Maybe over years you'll gain an expertise to work in research, idk.
This, a thousand times, for all the whiners. Learn an esoteric skill, get hired for an esoteric job.
The reason society works is because you can reuse abstractions that other people have already invented. It allows you to scale. Without economies of scale, you end up with Baumol's cost disease, which is extremely obvious in the US in industries like child and elderly care.
We don't want most software devs to be doing anything but gluing stuff together. If they weren't, we really screwed up somewhere.
Are our abstractions I-beams or Pre-Fabs[0]?
We all know that the rise of pre-fabs is, at its heart, the story of cheap developments lazily (and hastily) thrown together in arrangements that are low in quality, middling-to-low in beauty, and not built to last.
Skyscrapers of the early 20th century (with I-beams) still stand today and are considered by many to be beautiful, maintainable, etc.
People want pre-fabs since they're cheap, so the economy will always favor the pre-fab.
But pre-fabs have a limited shelf life, and reconstruction is more expensive than spending a little extra up front.
[0]: This is the sort of pre-fab I am talking about: https://en.wikipedia.org/wiki/Prefabricated_building#/media/...
Pre-fabs are massive frameworks where mixing them always looks hack-glued together. Same as if you glue two prefabs from different vendors.
That said, industrial plants aren't gothic cathedrals. They are a collection of buildings plopped together.
Your catalog of beams, bolts, brackets, weld patterns, rebar, piles, and concrete pour standards are all abstractions over extremely difficult subfields of materials engineering and structural engineering.
They exist so that your engineer can focus on building the structure using parts and resources with known, standardised behavior under the conditions the building will be put in.
Of course you'll see engineers break away from these abstractions when they need to for a given structure, but those abstractions do exist and are commonly used so that a given structural engineer doesn't also need a PhD in materials engineering and countless other specialized fields.
But if we had a system that we owned, then it could be adjusted to fit the requirements, elegantly. But we don't, so we can't.
The other problem is the "Ops" part: vendor owned systems are opaque AF, and when something goes wrong, impossible to debug. Then you become a support ticket monkey, praying you can convince the powers that be on the other end that a. it is truly their stuff that's broken, not yours and b. we pay for it, so yes, you should support it.
When a. or b. fails, then you end up writing yet more glue code to try to work around the bugs and outages that your vendor just doesn't give a shit about.
I just wanted to shout at them to fix their damn system, but of course we ended up implementing retries instead…
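The "retries instead of fixes" glue code described above usually ends up looking something like this. A minimal sketch in Python, assuming a flaky vendor endpoint; `flaky_vendor_call` and all the parameter values are hypothetical stand-ins:

```python
import random
import time


def call_with_retries(fn, max_attempts=5, base_delay=0.5,
                      retryable=(ConnectionError, TimeoutError)):
    """Call fn(), retrying transient failures with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise  # out of attempts: surface the vendor's error to the caller
            # sleep base * 2^(attempt-1) plus jitter, to avoid thundering herds
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, base_delay)
            time.sleep(delay)


# hypothetical flaky vendor call: fails twice, then succeeds
calls = {"n": 0}

def flaky_vendor_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("vendor returned 503")
    return "ok"

result = call_with_retries(flaky_vendor_call, base_delay=0.01)
```

The bitter part, of course, is that none of this would be needed if the vendor fixed their system; the backoff and jitter just keep your workaround from hammering an already-struggling service.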
Wheels. Absolutely wheels. Library gluing is what gets us garbage like Electron that needs to die in fire.
Imagine if a carpentry workshop gives a carpenter a bunch of IKEA kits and tries to conflate her wanting to make furniture that isn't cost-cut prefabricated crap with insisting she can recreate all her tools with just a whittling knife because it's in the 'true spirit of carpentry'. (Honestly, if anything, comparing Electron to IKEA is an insult to IKEA - there are cases where using IKEA is actually reasonable, they just aren't actually carpentry.)
(You use libraries when it makes sense to use libraries, just like you use 2x4s when it makes sense to use 2x4s. Sometimes you can make the whole thing out of 2x4s, just like you can make a whole program out of:
but if you're just gluing (screwing?) 2x4s together, you're going to get bad results when you need something that's not a 2x4.)