Earlier in my career I felt a lot of disgust at bad code and bad solutions.
Sometimes the badness was really my own laziness: effort due to unfamiliarity, or not instantly understanding what I was looking at.
Sometimes it was because the code disagreed with whatever framework or methodology I was using to give me confidence in the face of ignorance. I might feel like an impostor, but at least I know design patterns, so this guy who did MVC wrong must be worse.
Sometimes it was looking at something genuinely bad.
Now, later on, maybe my emphasis is more on business outcomes than perfect implementation, or maybe I've been involved in making enough abominations due to time pressures and architectural compromises that I can read those forces in other people's work.
Either way, I don't feel that kind of disgust anymore. It's code. No one is going to read it. It will be replaced next year. It works or it doesn't. Having to rip stuff out when the business changes or someone wants to use a different stack for resume reasons is part of life.
I think when I was younger I generally had stronger opinions on how to do things.
They weren't really based on anything more than sounding like they were true.
I'd hop on every paradigm that sounded correct. Clean code. Pure functions. Effective Java. Pragmatic programming. Defensive programming. They all had that righteous vibe to them. I'd totally strap a bucket on my head and go conquer the holy land under any of those banners.
If only we could do it my way, I thought, then we wouldn't have to put up with all these chafing points that annoyed me. Never do this! Always do that! My mind was like thumbnails from fitness YouTube.
Along the way I discovered that when I got to do things my way, it turned out that there were actually still a bunch of chafing points. Different, but it sure wasn't great. Maybe my 30-year-old ass didn't know everything.
Eventually, along the way, I sort of came to the insight that I've built what, 15 applications in the course of my private and professional career. I've worked with 3-4 programming languages in enough depth to be competent with them. I've tried a few architectural paradigms. If I work until I'm in my 60s, I'll maybe double that. Life isn't long enough to get much deeper than that into the craft.
Given this pitiful sample size, it's nothing but hubris to think that I or anyone else would have a clear grasp of what is the best way of doing things.
I had a very similar experience to yours. But I think you're being too humble.
Whether you wrote 15 applications or just one or two, does that really matter? Designing, exploring, writing, iterating on, and maintaining these applications _for years_ has given you insights, battle scars, and tacit knowledge that can only be gained through experience and continuous learning. Not to mention the different environments, technologies, and foundational knowledge you explored and internalized.
You've accumulated a hard-earned skill set and the ability to make wide-reaching, pragmatic decisions. Do you or someone else _know_ what the _best_ way of doing things is? Probably not. But I bet you have developed opinions, taste, and a toolbox of approaches with different trade-offs.
That's maybe where the OP is coming from as well. The mindset of being opinionated is very valuable if you can back it up.
That doesn't mean you're always right and don't let others speak. It doesn't mean you can't change your mind, or that your approach excludes other people's perspectives and incentives.
It means you can strive for _better_ and that you're crazy enough to make bold decisions when necessary.
I worked the majority of my career at an "elite" Japanese corporation. It's one that has a brand pretty much synonymous with "Quality." Many of my peers were among the finest engineers and scientists in the world.
I was often the dumbest guy in the room, and I'm smarter than the average bear.
Dealing with these folks could be infuriating. Every time I would suggest orthogonal approaches (because, like, software is different from hardware), I'd be called "lazy," or "sloppy."
It made me write good code, though.
If those folks saw the way I work now, they'd be horrified. They'd call me a "reckless cowboy," or something to that effect.
But most folks in today's software industry think I'm a stuck-up prig.
I work quickly. I leave good, highly documented code that lasts a long time (for example, one of my C SDKs was still in use 25 years later), and I don't want to toss my cookies whenever I look at my old code (sometimes, though, I shake my head and wonder what I was thinking).
I'm my own best customer. I'm the one that usually needs to go into my old codebases, and tweak them, so I write code that I want to see, in the future.
I've come to realize that the term "over-engineered" can mean a couple of things:
1) This code is too naive, complex, and byzantine, which makes it prone to bugs, inflexible, and difficult to maintain; or
2) I don't understand this code. That makes it bad.
I used to have an employee who was "on the spectrum."
Best damn programmer I've ever known. Crazy awesome. Had a high school diploma, and regularly stunned the Ph.Ds in Japan.
His code was written very quickly, was well-designed, well-structured, well-documented, bug-free, highly optimized, and an absolute bitch to understand.
I think the other thing that "over-engineered" can mean is code that's unnecessarily good for its purpose.
If you're building a quick demo of a product to get user feedback, and you write perfect code that's highly maintainable, you've wasted time - better to throw together something as quick as you can and rebuild it if it's actually going to be used by/sold to customers. That's really overengineering in my mind - doing a poor job with the quality/speed tradeoff given the purpose of the thing you're building.
> It's code. No one is going to read it. It will be replaced next year.
Then it truly is awful. Deeply awful. It sounds like you've never progressed past dealing with terrible code, so you have my condolences. Good code is read. Good code is not replaced in a year. Even most bad code is not replaced in a year. Truly you live in a world of absolute shit code.
Business requirements or engineering dependencies can change quickly in some scenarios, meaning code gets replaced regardless of its quality. I've had to delete a lot of code the past few years, much of it 1 year old and very carefully written. Someone wasted his time.
Playing games like Satisfactory and Factorio also forces you to come to terms with the imperfection of living systems. The first time you play, you can't possibly know how big you need the factory to be, and you don't have the tech unlocked for a lategame factory anyway. You just have to admit that you'll build a temporary facility now, and build a new one after you've unlocked the tech you'll need to scale up.
Isn't this just cargo culting and then realizing when you're cargo culting vs when changes are actually necessary? I also don't like fawning over good code, or what makes good code, but I think it's actually a good thing to anticipate certain architectural changes. Idk, but I don't think you can tell a stakeholder that you can't implement a feature X because you didn't give enough of a shit about architecture and ran out of flexibility to change something.
Over-architecting a design to be flexible in one way can be a huge problem when changes come along and they are for fundamentally different kinds of changes than were expected when the design was created - this can actually be worse than having an under-architected system.
I went through a similar evolution. Now, I don't judge the code I read. If I can understand it and it works correctly, it's fine.
I think it's a good thing. Much of what used to get me irritated (and, from my observations, what gets others irritated) are just matters of style, and which style is being used isn't really important.
Meta-disgust (“ugh disgust about code”) feels like another instance of what the article talks about.
If code doesn’t really matter cause “business,” then I think you are right when you say business is more your interest. That’s cool! I go through similar feelings at times.
This reminds me of my first experience with how executives think vs engineers. I was working for a day trading firm and was a hardcore C++ nerd at the time. We were having scalability issues and the CTO asked me if there was anything I could do. So I tell him "Well, if you give us four or five weeks I think we can optimize the code and get about 30% better performance." He just looks at me for a moment and then says "Or I could just buy another 25 servers, would that work? I can have them here in a few days."
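The CTO's reasoning is easy to sanity-check with back-of-the-envelope arithmetic. A tiny sketch (every number below is a made-up assumption for illustration; nothing here comes from the actual story):

```python
# Hypothetical comparison: optimize the code vs. just buy more servers.
# All figures are invented assumptions, purely to show the shape of the math.

ENGINEER_WEEK_COST = 4_000   # assumed fully-loaded cost per engineer-week (USD)
SERVER_COST = 2_000          # assumed cost per commodity server (USD)

# Option A: 3 engineers spend 5 weeks for ~30% better performance.
optimize_cost = 3 * 5 * ENGINEER_WEEK_COST   # 60,000, delivered in ~5 weeks

# Option B: buy 25 servers outright for roughly linear capacity.
buy_cost = 25 * SERVER_COST                  # 50,000, delivered in days

print(f"optimize: ${optimize_cost:,}  vs  buy: ${buy_cost:,}")
```

Under these (assumed) numbers the hardware is both cheaper and arrives weeks sooner, which is the executive's whole point; the calculus only flips once the fleet is large enough that a 30% efficiency gain outweighs a few engineer-months.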
I'm a young guy working with way more tenured and experienced programmers who didn't want management positions, preferring to code. Respectfully to them, I can't stand what they do. They obsess over details that do NOT matter to the business, and our whole department is paralyzed like this. Meanwhile big-picture things like internal APIs are either neglected or too big-brained for anyone to understand. I think they're just too skilled, and we need some worse coders with smaller egos to get the job done.
For example, they swear by writing low-volume web backends in C++ "for performance" and object to any kind of framework. Ironically, having to move more slowly and carefully as a result has led to big compute inefficiencies, on top of the more important hit to dev productivity.
A good engineer creates solutions to the problem before them. That problem always includes constraints such as budget, deadlines, maintainability, etc.
Very often, the mathematically or logically "best" approach is not the correct engineering approach because it doesn't meet those practical requirements.
Engineers who insist on a sort of purity are, in my opinion, not the best engineers even if they are genius at writing code.
> No one is going to read it. It will be replaced next year
I've never seen this happen even once in my 25 years as a developer.
Quite the opposite, in fact. Codebases grow and become more entrenched every year the organization stays in business. The goal becomes to shoehorn into the product more and more features that the original designers never dreamed of. And those original devs are usually long gone. To do that shoehorning long-term in the face of developer churn requires intense discipline around communicating to the next person what you are doing and why.
Interestingly, I feel like this sort of attitude is a real issue at the precise moment the author describes it as a boon.
When you are early to mid career, it is crucial to look for ways to amplify the good you can do in your workplace and solidify your brand as an individual. To do this, you should be looking, ironically, to elevate others. Doing so is the only way to build a reputation that people are going to actively WANT to talk about (e.g. "oh, having trouble? You should call in Jim, he helped me with a related thing"). This is invaluable.
Perhaps I am speaking through a lens, but had I taken the author's advice and played a more combative role at such a juncture, I believe I would have far fewer opportunities now.
This transition from junior to senior includes another important skillset: balancing social dynamics against engineering realities.
The key is illustrated in the book club parable: The elitism is directed outside of the group and becomes only a means of alleviating the fear of judgement for misjudging the paper. The grad student's approach clearly communicates the socially agreed upon reality: the whole paper is crap. This stance and boundary provides a clear decision space to the learning junior members: "if you think you see a mistake, those here will be happy to hear it; no sacred cows".
Bringing this practice into a situation where the target is a member of the group's work changes the dynamics such that you have to mind your Ps and Qs again -- and so, dampens learning.
My mantra for that is "be mean to the code and nice to the programmer".
In order to learn and make things better you _have to_ be critical. But the way things are communicated is very important so everyone is on board.
"We could do better here" - no matter who exactly was responsible in the past.
"This made sense at the time but with what we learned..." - remind each other that improvement and learning is part of the whole deal.
"I like the simple and expressive core idea of this, but if we expand this further..." - elevate and develop the good stuff that's already there.
I make jokes about my mistakes, bring them up early and often. Everything is a bit lighter and easier with a bit of humor and without the fear of making mistakes.
And vice versa it is just as detrimental to be afraid to bring mistakes and inadequacies up and criticize them. It's much more fun and productive if things are continuously improving.
I think early to mid is too soon to be focused on others' work. Early to mid, you should be focused on the quality of your own work, because you're still developing your taste and judgment, and you need the direct and vivid feedback you get from immersing yourself in the consequences of the decisions you make. A huge trap in software development is to get disconnected from feedback and be a slave to rumor, ideology, and religious "best practices." The air is full of bullshit (in no small part because everyone is trying to "solidify their brand as an individual" and "amplify the good they do" before they actually learn the job), and the best way to learn how to sort through it is by grounding yourself in the consequences of making this decision versus that one, choosing this approach versus that one.
If you start "amplifying" too early, you won't be amplifying selectively, and your coworkers would be just as well off sorting through search results themselves.
(Of course it's a progressive transition, and you're never too inexperienced to advise a coworker not to force-push master or submit a PR with failing tests.)
I get where the author is coming from, but at the same time, having spent most of my career to date as a front-end developer, I can tell you many stories of how hilariously off the rails this approach can go.
With a decade under my belt I'm feeling that I recently finally started learning and the conclusion so far is that half of the effectiveness of software engineering comes from obeying ultimately simple and common sense rules that anyone can follow, like "use idiomatic expressions", "read the documentation", "prefer pure functions and immutable data structures".
I'm an average (and kind of lazy) developer, but I found early on that I have an edge over more talented and hard-working people - I gather knowledge instead of compensating for the lack of it with hard work.
Can you frame it as elitism? I hope not, because I deeply believe the worst and laziest developers can use some of those rules so that they're both effective and still bad and lazy.
> prefer pure functions and immutable data structures
This sounds like common sense until it becomes common nonsense, because you use Python or JavaScript or Ruby or whatever language where you don't have an optimizing compiler and optimized immutable data structures, so what could have been a single-pass, low-memory scan over a big dataset, 10 minutes on a toaster, is implemented as an inefficient clusterfuck that takes 12 hours and 4 GB of memory.
It's absolutely important to treat mutability as either a side effect or a local optimization from a design perspective. But Python is not Clojure no matter how hard you try, and at some point you're going to want to mutate a dictionary.
Anyway, the point is that what might seem like "common sense" in some situations is not always obvious or unambiguous in general.
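The trade-off above can be sketched in a few lines of Python (function names and sizes here are illustrative, not from any real codebase): the "pure" version rebuilds an immutable tuple on every step, while the other does a single in-place pass and keeps its mutation strictly local.

```python
# Contrast: "pure"/immutable accumulation vs. one mutating pass.
# Illustrative only -- the point is the asymptotic cost, not these exact numbers.

def running_totals_immutable(values):
    """Pure style: each step builds a brand-new tuple. O(n^2) time and allocations."""
    totals = ()
    acc = 0
    for v in values:
        acc += v
        totals = totals + (acc,)   # copies the entire tuple every iteration
    return totals

def running_totals_mutating(values):
    """Same result, but mutates one local list. O(n), cache-friendly."""
    totals = []
    acc = 0
    for v in values:
        acc += v
        totals.append(acc)         # amortized O(1); the mutation never escapes
    return totals

data = list(range(1000))
assert tuple(running_totals_mutating(data)) == running_totals_immutable(data)
```

From the caller's point of view both are pure functions of their input; the second simply treats mutation as a local optimization, which is exactly the distinction drawn above.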
I had a similar experience when I got into software.
As a junior, more than once I wished to just quit the field - even after years of CS studies, and it being one of the best opportunities one can have in my area.
The draining nature of the "this is shit, we need to do clean code" and "TDD is the way" discussions, repeated day in, day out, can quickly kill any interest in working on code. It's not far from being forced to write a book using only a list of the 100 most common words - and if you use any word outside that list, you're a shit writer, because you made it harder for others to read.
I'm quite glad that I moved away from that corporate environment into the startup space a year later, which showed me a whole different perspective: code itself is useless, and what matters is whether it delivers value. If your hacky solution can deliver value, then you can justify making it better. Otherwise - who cares.
I do think there's way too much attachment to code & its perceived quality in the dev community. On the other hand, if you work in a team where majority of people are well into their careers - there's a lot more nuance when it comes to the extremes such as "TDD all day all night" and "daily pair programming". They are seen as tools to utilise when appropriate, rather than mantras to be repeated mindlessly.
It's almost like children versus grown-ups. Children have simplistic, almost magical thinking. Adults know the world is more complicated, and the children's simplistic solutions don't actually work very well.
One of the reasons is small sample size. Children don't have the sample size to see why their simplistic solutions don't work. Neither does a programmer with two years of experience.
A colleague of mine works at a small company. A new "CTO" was hired last year. He's very much of the "everyone's voice deserves to be heard... all input is valid!" school.
My colleague has 22 years of engineering experience across a wide range of problem spaces and industries and team sizes (large bigco to mom/pop orgs).
Another guy (X) started there a year out of high school.
X insists that 'tech XYZ' is the best. Current tech stack was partially rebuilt by my colleague, but wasn't finished (because... lots of reasons, mostly resourcing).
XYZ is not only not a great fit, but the ecosystem supporting the problem space is small, especially considering what's already in place. Terabytes and years of data need to be migrated (both physically to new data centers and code-wise - new structure handling has to be added to accommodate current and future needs).
In planning meetings, "all voices need to be heard"... so X pushes XYZ a lot. And randomly rebuilds small bits in XYZ. And when it doesn't work - blames everything else (it's the network, it's the supporting libraries, it's ...).
The CTO will not push back. "That sounds great! That sounds like it'll solve all our issues!". There's a criminal deficiency in the understanding of the current tech stack or problems, along with no experience in migrating anything. But any criticism is taken as "we need to be more inclusive and let more people speak up - some of the best ideas can come from people who've not traditionally been heard".
Up to a point, that can make sense. But when do you draw the line? 3 months? 6 months? 18 months? People insisting on promoting child-like understandings of problems and solutions - while not ever delivering anything resembling a working solution - at some point should not be listened to.
Why does my colleague stay? He's only part time right now, and was close to leaving, but there's been some shift to refocus the CTO on something else, which may - over the next month or so - leave the few competent people there alone enough to get things back on track. I think if this was a 'full time' gig for him, he'd have left already.
I thought you meant the die-hard proponents of these software dev tropes are the childlike ones.
Imo that is more apt. I see a lot of resources wasted in the name of conforming to standards when the standard doesn't really apply to the particular scenario.
Having spent two decades programming, I feel I was indeed surrounded by opinionated people who were narrow-minded and combative, and I often felt looked down upon regarding my tech stack, code, or interests (e.g., I remember a strongly negative bias against Bitcoin in the early days). Now I know that, according to research, most would have been on the autism spectrum and generally less socially adjusted than people in other careers.
I think my point is engineering may lend itself to strong opinions in people with poor social skills.
> eg I remember there being a strongly negative bias against bitcoin in the early days
So, in the one example you gave, they were right. And yet I get downvoted for suggesting: hey, guys, that senior developer who's insisting on doing something a certain way? Sorry you have to hear this, but they're often right.
I think there is some confusion here. I will clarify, but please don't get offended.
For your first point, no, they were wrong, I did very well on bitcoin - hence my example.
For your second point - I am confused - I agree senior developers know what they are talking about, I was one for nearly 2 decades.
I was referring to peers not superiors.
I was saying I have encountered narrow mindedness and ignorance. I did not say that is all I encountered. I also encountered kind, intelligent, superior and inspirational people.
Edit> On second thought - I re-read your comment - who downvoted you? When? Was it in regards to this article?
The elitism tends to show up once you've been around the block a few times but before your ego and position is secure. Years 6-8 or so; ie, post-doc age. Not really properly mid-career, at least outside of "retire at 35" FAANG world.
Which matches my observations. Folks may pretend otherwise but nearly all mid-career people intuitively understand there's lots of bigger fish in this world.
Elitism is usually meant as an insult for snobs pretending to have better taste than anyone else. That pretension is wrong because in questions of taste there can be no criteria other than subjective taste. However, in contexts where there are objective criteria of quality, elitism is just being a professional who is good at his job. When you open up a horrible spaghetti codebase that's inconsistent in style, full of nonsensical abstractions, and 90% bloat, you should feel disgust. Of course all bad code has a reason behind it, some more justified than others, but no reason should ever make a bad codebase feel good.
> however in contexts where there are objective criteria of quality elitism is just being a professional good at his job.
Imagine I look at the Linux kernel source code and I feel it's lacking in automated integration tests, and that C is a poor choice of language for security-critical code.
Am I a competent professional, applying objective quality criteria?
Or am I an arrogant dilettante, to imagine I know better than some of the most influential living programmers?
It depends on you and the context, are you actually working on Linux kernel and have worked on a lot of similar type projects?
Objectivity does not imply that it's easy to discern adequate criteria or that they are easy to know or that there is a consensus about them, just that it isn't purely subjective, and code isn't.
> It's code. No one is going to read it. It will be replaced next year. It works or it doesn't.
I wonder if this is an adaption or a maladaption.
But we deliver the message, more or less. If we had to have perfectly rehearsed sentences, we'd never talk.
When you are early to mid career, it is crucial to look for ways to amplify the good you can do in your workplace and solidify your brand as an individual. To do this, you should be looking, ironically, to elevate others. Doing so is the only way to build a reputation that people are going to actively WANT to talk about (e.g. "oh, having trouble? You should call in Jim, he helped me with a related thing"). This is invaluable.
Perhaps I am speaking through a lens, but had I taken the authors advice and taken a more combative role at such a juncture, I believe I would have far fewer opportunities now.
The key is illustrated in the book club parable: The elitism is directed outside of the group and becomes only a means of alleviating the fear of judgement for misjudging the paper. The grad student's approach clearly communicates the socially agreed upon reality: the whole paper is crap. This stance and boundary provides a clear decision space to the learning junior members: "if you think you see a mistake, those here will be happy to hear it; no sacred cows".
Bringing this practice into a situation where the target is a member of the group's work changes the dynamics such that you have to mind your Ps and Qs again -- and so, dampens learning.
In order to learn and make things better you _have to_ be critical. But the way things are communicated is very important so everyone is on board.
"We could do better here" - no matter who exactly was responsible in the past.
"This made sense at the time, but with what we've learned..." - remind each other that improvement and learning are part of the whole deal.
"I like the simple and expressive core idea of this, but if we expand this further..." - elevate and develop the good stuff that's already there.
I make jokes about my mistakes, bring them up early and often. Everything is a bit lighter and easier with a bit of humor and without the fear of making mistakes.
Conversely, it is just as detrimental to be afraid to bring up mistakes and inadequacies and criticize them. It's much more fun and productive when things are continuously improving.
If you start "amplifying" too early, you won't be amplifying selectively, and your coworkers will be just as well off sorting through search results themselves.
(Of course it's a progressive transition, and you're never too inexperienced to advise a coworker not to force-push master or submit a PR with failing tests.)
With a decade under my belt, I feel I've only recently started learning. The conclusion so far is that half of the effectiveness of software engineering comes from obeying ultimately simple, common-sense rules that anyone can follow, like "use idiomatic expressions", "read the documentation", "prefer pure functions and immutable data structures".
I'm an average (and kind of lazy) developer, but I found early on that I have an edge over more talented and hard-working people - I gather knowledge instead of compensating for the lack of it with hard work.
Can you frame it as elitism? I hope not, because I deeply believe the worst and laziest developers can use some of those rules so that they're both effective and still bad and lazy.
This sounds like common sense until it becomes common nonsense, because you use Python or JavaScript or Ruby or whatever language where you don't have an optimizing compiler and optimized immutable data structures, so what could have been a single-pass, low-memory scan over a big dataset — 10 minutes on a toaster — is implemented as an inefficient clusterfuck that takes 12 hours and 4 GB of memory.
It's absolutely important to treat mutability as either a side effect or a local optimization from a design perspective. But Python is not Clojure no matter how hard you try, and at some point you're going to want to mutate a dictionary.
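A minimal sketch of that middle ground, with illustrative names of my own choosing: a function that is pure from the caller's perspective (same input, same output, no external state touched) but mutates a local accumulator internally, so a big dataset can be streamed in a single pass with memory proportional only to the result.

```python
from collections import Counter
from typing import Iterable

def word_counts(lines: Iterable[str]) -> dict[str, int]:
    """Pure interface: no side effects visible to the caller.
    Internally we mutate a local Counter, which lets us consume
    any iterator in one pass with O(vocabulary) memory."""
    counts: Counter[str] = Counter()  # local, mutable accumulator
    for line in lines:                # single pass; works on generators too
        counts.update(line.split())
    return dict(counts)

# Streaming usage: feeding a generator keeps memory flat even for huge inputs.
print(word_counts(line for line in ["a b a", "b c"]))  # {'a': 2, 'b': 2, 'c': 1}
```

The mutation is confined to one function body, so the design-level guarantee ("treat this as pure") survives while avoiding the cost of rebuilding an immutable structure on every element.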
Anyway, the point is that what might seem like "common sense" in some situations is not always obvious or unambiguous in general.
The draining nature of the "this is shit, we need clean code" and "TDD is the way" discussions, repeated day in and day out, can quickly kill any interest in working on code. It's not far from being forced to write a book using only the 100 most common words — and if you use any word outside that list, you're a shit writer, because you'll make it harder for others to read.
I'm quite glad that I moved away from that corporate environment into startup space a year later, which showed a whole different perspective: one where code itself is worthless and what matters is whether it delivers value. If your hacky solution can deliver value, then you can justify making it better. Otherwise - who cares.
I do think there's way too much attachment to code and its perceived quality in the dev community. On the other hand, if you work in a team where the majority of people are well into their careers, there's a lot more nuance when it comes to extremes like "TDD all day all night" and "daily pair programming". They are seen as tools to use when appropriate, rather than mantras to be repeated mindlessly.
One of the reasons is small sample size. Children don't have the sample size to see why their simplistic solutions don't work. Neither does a programmer with two years of experience.
My colleague has 22 years of engineering experience across a wide range of problem spaces and industries and team sizes (large bigco to mom/pop orgs).
Another guy (X) started when he was a year out of high school.
X insists that 'tech XYZ' is the best. Current tech stack was partially rebuilt by my colleague, but wasn't finished (because... lots of reasons, mostly resourcing).
XYZ is not only not a great fit, but the ecosystem supporting the problem space is small, especially considering what's already in place. Terabytes and years of data need to be migrated (both physically to new data centers and code-wise - new structure handling has to be added to accommodate current and future needs).
In planning meetings, "all voices need to be heard"... so X pushes XYZ a lot. And randomly rebuilds small bits in XYZ. And when it doesn't work - blames everything else (it's the network, it's the supporting libraries, it's ...).
The CTO will not push back. "That sounds great! That sounds like it'll solve all our issues!". There's a criminal deficiency in the understanding of the current tech stack or problems, along with no experience in migrating anything. But any criticism is taken as "we need to be more inclusive and let more people speak up - some of the best ideas can come from people who've not traditionally been heard".
Up to a point, that can make sense. But when do you draw the line? 3 months? 6 months? 18 months? People insisting on promoting child-like understandings of problems and solutions - while not ever delivering anything resembling a working solution - at some point should not be listened to.
Why does my colleague stay? He's only part time right now, and was close to leaving, but there's been some shift to refocus the CTO on something else, which may - over the next month or so - leave the few competent people there alone enough to get things back on track. I think if this was a 'full time' gig for him, he'd have left already.
I thought you meant the die-hard proponents of these software dev tropes are the childlike ones.
Imo that is more apt. I see a lot of resources wasted in the name of conforming to standards when the standard doesn't really apply to the particular scenario.
Now that is a quote for the ages.
I think my point is engineering may lend itself to strong opinions in people with poor social skills.
So, in the one example you gave, they were right. And yet I get downvoted for suggesting: hey, guys, that senior developer who's insisting on doing something a certain way? Sorry you have to hear this, but they're often right.
For your first point, no, they were wrong, I did very well on bitcoin - hence my example.
For your second point - I am confused. I agree senior developers know what they are talking about; I was one for nearly two decades. I was referring to peers, not superiors. I was saying I have encountered narrow-mindedness and ignorance. I did not say that is all I encountered. I also encountered kind, intelligent, superior, and inspirational people.
Edit> On second thoughts - I re-read your comment. Who downvoted you? When? Was it in regards to this article?
What on earth are you talking about?
Are you chatgpt?
Imagine I look at the Linux kernel source code and I feel it's lacking in automated integration tests, and that C is a poor choice of language for security-critical code.
Am I a competent professional, applying objective quality criteria?
Or am I an arrogant dilettante, to imagine I know better than some of the most influential living programmers?
Objectivity does not imply that it's easy to discern adequate criteria or that they are easy to know or that there is a consensus about them, just that it isn't purely subjective, and code isn't.
That should end well.