Eh, kinda. Calling something a "best practice" is basically an appeal to authority. It means, "this is the right way to do things, for reasons I don't have time to explain." There are times when that's appropriate.
But really, "best" and "right" are highly situational. Any rule of thumb, even the most basic and uncontroversial, has a situation where it doesn't apply. I was part of a discussion on a mailing list years ago, where somebody got flamed harshly when he mentioned using a global variable. He then explained that he was working on an embedded control system for cars and all the variables in that system were global. The team was well aware of the pitfalls of global state and used a combination of documentation, static analysis, custom tooling and elaborate testing to mitigate them. It was a considered design choice that made it possible to develop high-performance, hard-realtime software that could run on very limited hardware.
One thought to add here - when you appeal to an authority, which one is it?
In the OP example, it seems the senior dev is saying “on my authority”. And sometimes that is enough, especially if the senior dev can give examples of when not following this practice bit them.
But sometimes there is a higher authority, such as “it’s what is recommended in Google’s SRE book”, which is probably good advice if you are building a large SRE team. (Though as you say, not in all situations.)
I think in the worst case, “best practice” can indeed be used to shut down discussions of a leader’s preferences. But you can smell that out by asking for concrete examples, and asking how widely the practice is recommended.
A good “best practice” should be justifiable and explainable.
All that said, sometimes as the senior engineer you need to go with gut feel; “this design smells like it will give us trouble within a year” is the sort of thing I sometimes say. But I think it’s important to be honest that it’s a hunch, not a certainty, in these cases, and discuss/weigh it accordingly.
> All that said, sometimes as the senior engineer you need to go with gut feel; “this design smells like it will give us trouble within a year” is the sort of thing I sometimes say.
That's a really great perspective. There are a bunch of times where I have a preference for or against something, but I can't point to a specific example of when that thing was good/bad, or any objective data about it. It's just that my experience suggests to me (via murky pattern matching in my brain) that particular thing will be good or bad.
It's certainly weaker evidence than data or concrete examples, but I think it's still valuable and worthy of consideration.
“Best practices” smells like such a marketing term it should be tossed in the bin.
It’s poetic language that has nothing to do with specific problems.
What people usually mean when it comes to engineering is “be safe, reliable, and correct.”
Security best practices to be safe.
Developer best practices for reliability.
Etc etc
“Best practices” is hand wave-y fluff for “do a good job” and doesn’t need a technical definition.
Make sure you’re secure, reliable, and correct given the engineering context, and odds are you’ll end up with a system that also has a lot of specific best practices in place.
Sometimes "best practice" is not because we need to worship a standard, but we need to have a standard. If everyone does everything in a different manner, you wind up with a tower of babel and support becomes impossible.
I'm of the mindset where an organization needs to agree to specific principles that everyone adheres to. Not dogmatically but pragmatically.
Having that shared "best practice" makes it easier to support other people's code and systems, allows people to get up to speed faster and, as a bonus, allows you to make a blog post on Medium where you can call yourself a thought leader.
> I'm of the mindset where an organization needs to agree to specific principles that everyone adheres to.
Agreed. But GP was talking about the context dependency of those best practices. E.g., what might be a good best practice for a large organization inside a FAANG might be very bad for others. Some organizations have very immediate feedback cycles. A significant problem in Amazon's online shop will probably show up immediately in sales numbers. A significant problem in the control software of a plane might kill a few hundred people in a few years before people realize there is a problem. Therefore you cannot A/B test which autopilot works better in a certain situation. (Although Tesla might disagree about that /s).
The point is every so called "best practice" should state under which precondition it is supposed to be applied. And most blog articles and books ignore that completely.
My experience is that enforcing standards for everybody is really bad. Different projects are truly different and require different trade offs. Enforcing one way to do things is anti-agile.
> somebody got flamed harshly when he mentioned using a global variable.
Harsh bashing on global variables is such a dumb thing.
Yes, they can be dangerous. Yes, many had problems due to using them. Yes, we should tell beginners to avoid globals.
But there is no reason to ban them altogether. Experienced programmers should utilize them whenever it makes sense (instead of threading a local value down through almost every function).
Global mutable variables are generally a bad idea because they have nonlocal side effects which are difficult to mitigate, like aliased mutable pointers but worse. Extra non-aliasing arguments and multiple return values are free of these issues, and a better tradeoff in almost all cases (unless you're sure you'll never run 2 instances of a system in the same address space, and you have specialized constraints possibly including embedded/safety-critical development and emulators).
When experienced programmers utilize global variables whenever it makes sense, it tends to bite future generations. Windows's GetLastError is a mess (some functions set it on error but don't clear it on success), I'm not sure about POSIX's errno, and when threads were introduced, these variables had to be changed to thread-local state.
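The aliasing argument above can be sketched in a few lines. This is a minimal, hypothetical Java example (the class names are mine, not from the thread): a static mutable field is reachable from anywhere and is shared by everything in the process, while explicit instance state keeps the dependency visible and separable.

```java
// Global mutable state: any code anywhere can change it, and two
// "instances" of the system in one address space would share it.
class GlobalCounter {
    static int count = 0;
    static void increment() { count++; }
}

// Explicit state: the dependency is visible where it is used, and
// each instance owns its own data.
class Counter {
    private int count = 0;
    void increment() { count++; }
    int value() { return count; }
}
```

The explicit version also makes it trivial to run two independent counters in one address space, which is exactly the case the parent comment flags as the breaking point for globals.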
On the contrary, I think beginners should use them because it reduces cognitive load in tiny applications.
My standard rebuttal to anti-global dogmatism is "there's only one instance, and there never needs to be more. If/when we need more, we can consider doing something else."
It is best practice to use RCCBs, because it turned out faulty wiring can kill people. But in server rooms, where you might not want to switch off the whole rack without warning when one device is faulty, you can use a device that monitors the residual current (an RCM), which issues a warning first and only switches off when the residual current rises above the acceptable level. Different scenario, different best practice. (This is also one reason medical equipment is expensive.)
I think a professional should be aware of why a best practice exists and how to deal with a situation where, for some reason, it cannot be applied, as you showed with the embedded example.
Don't forget, however, that many devs are against best practices out of laziness, or because they don't understand the reasons why they are best practices.
Identifying a best practice is actually not as hard as people realize imo (although it may take some time), and only becomes hard when you let emotion and sources of emotion come into what should be a rational decision making process (such as preference for a certain tooling for reasons like familiarity or popularity in the field today rather than outright advantages vs other tooling).
To identify the best practice for anything, you start by doing a review of all the available practices in the field for a given problem you are working on. Then, once you've reviewed the literature, you can work out the pros, cons, and caveats of each of these tools, and how these considerations affect your particular use case and your expected results. After doing that, the best practice out of the available options will be readily apparent, or at the very least strongly justified, not by an appeal to authority or popularity or familiarity, but by looking at what the underlying technology actually does and how it's relevant or not to your particular task at hand. In the end you will find the very best hammer available, out of all the hammers people have made in this field, for your particular unique nail.
That sounds like a beautiful example of letting perfect be the enemy of good.
Just like your design choices have trade-offs, it is important to realize that there's a trade-off between finishing sooner and making a better solution. Diminishing returns are usually very much in play with analysis.
(I would also challenge the notion that every situation has a different "best practice". That's just creating a solution. "Best practices" are usually general advice that is applicable in most situations.)
> Calling something a "best practice" is basically an appeal to authority.
If presented on its own, but then, any conclusion presented on its own without supporting context and analysis is the same.
> But really, "best" and "right" are highly situational
A description of a best practice that doesn't provide a sufficiently precise description of the situation to which it applies as a best practice is generally inappropriate, unless it is the conclusion of an analysis of applicable best practices to a certain situation, in which case the scope is specified in framing the analysis.
It is true that lots of things described as best practices for particular situations, with supporting rationale, end up getting detached from their logic and context and becoming cargo-cult best practices.
Smart experienced software developers disagree on “best practices” all the time. What is “obviously” the best practice to you is “obviously” not the best practice for somebody else. Now what?
The biggest issue I saw with "best practices" in my career is the failure to take into account who is claiming it to be a best practice, and in what context. I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand. Or, alternatively, believing some vendor carte blanche when they tell you their product somehow follows a best practice.
The overarching problem is that yes, there is software engineering going on in the world, but most organizations are not willing to do engineering. I don't blame the technical staff - they often have good intentions - but rather the typical business is not willing to pay the cost in time or money to do long-lasting engineering practices. This is one of the things not enough of us think about in our career choices - am I going to a place that practices fire drills or engineering?
Spot on about reading a random blog or article. I think many engineers at work are under pressure to deliver, and are looking for quick solutions. They do a search and see a Medium post related to their problem written by someone who says they are an “<platform> developer at <company>”, and for some reason most readers see these authors as an authority in their domain (because why else would they be writing about it? /s), and just accept the blog post’s practices or conclusions. I once read an article a long time ago about Android’s AsyncTask, and some blog post claimed it should only be used for operations of less than one second. I looked at the official documentation, and while it did mention that it should be used for short operations, nowhere did it mention one second specifically (unless I missed it). I saw the same advice repeated by many other devs who referenced that same article.
To add on to this, the currently most preached-about "best practices" come largely from a de-risking, never-ever-fail point of view, i.e. 'safety'. Unfortunately, with such standards also comes a concept known as 'accountability', i.e. 'ass-covering' practices that provide little practical value.
This results in programmers no longer being able to iterate fast and having to rely on some third-party whose tradeoffs they don't understand, resulting in slow, bloated software and dissatisfied programmers.
I am not sure if I read you right here (correct me if I am wrong), but are you saying that safety and stability concerns lead to bloated software because devs cannot blindly trust third-party dependencies?
Because if so, yeah. You are responsible for the software you write and the dependencies you use. Software engineering is one of the least accountable engineering disciplines anyway. I am, e.g., also a certified electrical engineer, and of course I am responsible if my wrong decisions kill someone. If I used a cheap Chinese knockoff circuit breaker (because we can iterate faster if the stuff isn't expensive and certified) and someone got killed, I would go to jail. If someone gets killed and I can prove that I followed the currently agreed-on state of technology, my ass is covered. Of course you can get a lightbulb to light up without following any rule (and maybe this would be more efficient), but in EE the existing rules came as a consequence of deaths, and the prevention of those is worth it.
In software engineering, the worst that can usually happen (unless you work in IoT or industrial applications) is that you lose your users' data. Many software devs don't care whether they lose their users' data, and there are no tangible consequences. "Oh, it was a software error" is still a good excuse, as if there was nothing that could have prevented that software error. I program software myself, but I am all for stricter consequences in our profession, because it would straighten out some heads who think this is all just fun and games for their personal joy.
> I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand.
I’ve seen the other way around too - senior developers rejecting or pushing changes that they find convenient. IME it’s not to do with experience, rather stubbornness.
> How can Software Engineers call themselves engineers when there’s no rules, governing bodies, or anything to stipulate what true Software Engineering is?
We call ourselves software developers in Canada.
According to Canadian engineering[1]:
The "practice of engineering" means any act of planning, designing, composing, evaluating, advising, reporting, directing or supervising, or managing any of the foregoing, that requires the application of engineering principles, and that concerns the safeguarding of life, health, property, economic interests, the public welfare, or the environment.
To be considered a work of engineering, then, a piece of software (or a software-intensive system) must meet two conditions:
1. The development of the software has required “the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.”
2. There is a reasonable expectation that failure or inappropriate functioning of the system would result in harm to life, health, property, economic interests, the public welfare, or the environment
I aggressively refuse to call myself an engineer. Programmer is fine. Developer is better, because programming is only a piece of it. Y’all can call yourselves whatever you like, but inside I’m thinking you retconned an industry insider euphemism into a non-existent sub-type of engineering, by pointing to all the ways you apply rigor to what we do.
What of those who graduated from engineering programs taught at engineer colleges?
Lots of people in other engineering disciplines build stuff with less rigor than is used to build software. Maybe load test it in Fusion 360 before sending the file to be milled. And lots of software companies test their stuff quite rigorously.
People who write software have imposter syndrome. Other disciplines are not building things better than we do. For every Google there's a Ford, and for every scrappy startup there's a scrappy machine shop.
Isn’t it easier to just stick to the term and not fight with it? Everyone involved knows that “software engineering” is not like “civil engineering”, and those who don’t, usually don’t know either. It has no legal nor practical sense, so why bother?
Margaret Hamilton coined the term. But then again, anyone who can keep a lunar lander from turning into a new crater probably has more right to that term than most of us.
I believe I heard this in the Pragmatic Programmer: engineering is the practice of applying scientific understanding to real world problems. Or more succinctly, engineering is applied science. So software engineering is applying computer science to solve real world problems.
The second condition is a bit stupid; most failures or inappropriate functioning of a system in software development can be considered harm to economic interests...
I write server-side software for a security system - anyone else can write a similarly complex piece of software to monitor grains of sand on the beach. So I am an engineer and they are not, just because my software is used by some prescribed group of people?
If you work on the software in a self driving system in a car, that makes you an engineer. If you're working on software for a space satellite with no occupants... that's not an engineer?
Yes, I think we sometimes use the term "Engineer" a little too liberally in the US. For example, sometimes the janitor of an office building is called a "Custodial Engineer" which often just gets shortened to "Engineer". I remember the first time I heard a coworker say, "I'll call the engineer to have the thermostat adjusted". I thought that was overkill, but then a guy in overalls showed up and it made a little more sense.
So what do you call electrical engineers who design consumer level non mains voltage hardware? Looks like they fail the second bullet point. Sort of an honest question, since most of what you’d likely learn in an EE degree would probably not fall under number 2 either.
I believe that Shopify will change your job title from Engineer to Engineering Technician (or something of that nature) if you don't have your professional engineering certification.
Some of us call ourselves "engineers". It just depends whether you want to pay the order for the privilege of having the title (and pretty much no other tangible benefit in 90+% of cases) :).
Genuine question. What is so important about the word "engineer" that people not licensed to practice it want to call themselves engineer?
In Canada, I studied in a mechanical engineering technology program that led to an engineering degree if you stayed on for 4 years. It was hammered into us that we weren't engineers until after we graduated and went through the professional licensing process.
In Canada there is a way to become a P.Eng. in software engineering (and until 2018 in the US there was the P.E. in software engineering), so there is a way for someone who wants to be called a software engineer to legally obtain it. So why is there so much resistance?
There is value in predictability, even if it stems from someone else's preferences.
Current home wiring codes require you to put wires in the wall (I think) between 10 and 20 cm from the edge, and only a few cm deep inside the wall. This means you only have to check that zone, and can use detectors to find the wires.
My home is from the 1950's. Some wires go diagonally from top left to bottom right at the other side of the wall. We had great fun finding out where they were hiding.
Even if the new wires waste a lot more wiring and PVC tube, I vastly prefer them when redecorating or drilling holes.
100% that's true! I'm not saying developers having preferences and everybody abiding by them is a bad thing, more just that a lot of what constitutes "best practices" are often preferences and we should call them that.
Doesn't that kind of imply that everyone's opinions are equally valid? If 95% of your profession is on the same page with a certain practice then I'd argue it's really not reasonable to go against the grain without a very good reason.
I think "best practices" strikes a good balance between things that are personal preferences and things that are laws.
From the 1950s onwards, they used ribbon cables close to the surface for household wiring. It was best practice, just not today's.
As a freelancer jumping projects, I can't help but see parallels, in that you've really got to study code bases wrt the time of authorship to understand their particular idiosyncrasies.
In the 2010's, people believed in "REST" without considering the context in which these concepts were introduced (eg thin browser UIs, or generalizations thereof as a baseline). Customers, even highly capable devs, flaunt their REST best practices, yet see HATEOAS as optional and pretentious, failing to see the entire point of loose coupling and discovery, and engaging in zealotry about squeezing parameters into URLs instead. Or pretend to, to stop pointless discussions with mediocre, bureaucratic peers.
I hate the expression "best practice", it's so, so often used by someone to justify applying cargo cult without actually understanding why.
"Hey why are you having a try-catch there? It seems like it's just gonna break our stack trace and we don't even gracefully recover from it" - it's best practice
"Hey why are you using model/view/controller folders?" - it's best practice
"Why are you building microservices?" - best practice.
I got out of a code base with stylecop's settings set to max and treated as errors. Trailing white spaces, mandatory comments on everything, etc. The only exceptions are file size and method size. Files are often in the thousands of lines, and methods can reach that as well. 10 levels of nested if/else. So we're in an unmanageable code base, but at least we don't have trailing white spaces.
Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do". If it's a tradition and most people agree to it, fine, might help with readability. If it's pattern, fine, tell me why it's applicable and helpful in our context. If you can't actually explain why you're doing something, maybe rethink it and educate yourself on the topic
So the only caveat I would put on that: it's useful to have common language for things, my threshold is that you understand why things are this way.
It's useful to tell someone: "you should have that thing in another class because it's clearly another responsibility", and have a debate on the responsibility itself ; or even on the merits of having a single responsibility per class.
What is not useful is when someone starts making monofunction classes that are 10 lines long on the altar of "smaller functions are better", sticking to that because "it's a best practice" with no better argument.
Basically my point is: if you understand what you are doing, great, use vocab all you want. But don't use it as a justification.
Interesting, I guess SOLID must have been a buzzword from an earlier era, because while I've vaguely heard of it I had to look it up. Don't think it's come up in my 8 years of professional work.
There's always an evolution of these methodologies and I often feel like in practice they're just a way to mix things up and make code health feel more interesting to those involved. Often people know there are problems with the design of their system, but they can't get buy in to fix them. These methodologies provide some authoritative justification for refactoring work.
> I hate the expression "best practice", it's so, so often used by someone to justify applying cargo cult without actually understanding why.
> Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do".
The most common valid / defensible case I've seen it used is when a person fully understands the "why" in depth, and has formed that understanding on the basis of experience (not blind following or "tradition"), but doesn't have time to deliver a long, in-depth explanation every time they do a code review.
That's not to say it isn't still "developer preferences": there are multiple "best-practice" approaches out there and some even contradict each other. But I strongly believe that even many of the very subjective, hotly-debated approaches are valid and useful in certain contexts.
e.g. I personally lean toward a pseudo-FP style of programming and am growing less and less enamoured with OO patterns for various reasons. But OO is still a useful abstraction, and if you're doing OO, I've found something like SOLID to contain a great deal of wisdom. I never sat down and decided to do things "the SOLID way", and learned that. Rather I wrote a lot of bad OO, ran into problems, gained insight from experience, and later stumbled across SOLID and found the pitfalls it mitigates familiar.
A sibling commenter has the following quote, which I agree with:
> SOLID is especially funny because the majority of developers can't understand or explain it, let alone have read the papers
I think this is exactly why the term "best-practice" is so popular. I would not have understood SOLID in any depth if I had come to it fresh: I needed to learn SOLID informally by accident, internalise the challenges it's designed to overcome, and then recognise that intuitively from my experience when reading about SOLID later.
That's not a level of understanding you can typically impart easily in a conversation when suggesting someone do something differently. It's much easier to just say "best-practice".
This paragraph highlights something I've been saying for ages.
Most criticism of Java needs to be directed towards Java programmers and not the language itself.
The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.
Getters and Setters? You probably don't need them. Classes can have public member variables.
Java programmers seem to have the hardest time understanding YAGNI.
> Getters and Setters? You probably don't need them. Classes can have public member variables.
The getter / setter debate and public fields is about exposing the internal implementation of the object (and thus making it unchangeable without breaking other code) and being able to reason about where the internals are used (if you do need to change them).
For example, if I've got a java.util.Date exposed as a public field and someone uses it, I can't change that later to a java.time.LocalDateTime unless I change all of the things using it. If this is a library that others are using it may mean a cascade of unknown changes.
If, on the other hand, the Date was exposed as a public Date getDate() { return date.clone(); } then I don't have to worry about it. When the internals are changed to LocalDateTime, then Date getDate() { return Date.from(local.atZone(ZoneId.systemDefault()).toInstant()); } and everything will still work.
I don't have an issue with the infrequently used "default package private" level of field visibility where only classes in the same package can see that field as that limits the range of the changes to the classes within single package (in a single project).
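The migration described above can be sketched concretely. This is a minimal, hypothetical example (the `Event` class is an invented name): the internal field has already moved to `java.time.LocalDateTime`, but the public getter keeps returning `java.util.Date`, so callers compiled against the old surface keep working.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

class Event {
    // Internal representation changed from java.util.Date to LocalDateTime.
    private LocalDateTime when = LocalDateTime.of(2020, 1, 1, 12, 0);

    // The public surface is unchanged: old callers still get a Date,
    // converted on the way out, so the internal change doesn't cascade.
    public Date getDate() {
        return Date.from(when.atZone(ZoneId.systemDefault()).toInstant());
    }
}
```

Had `when` been a public field, every caller would now need to change (and handle time zones themselves); with the getter, the conversion lives in exactly one place.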
Getters and Setters also permit you to specify that something can only be retrieved or assigned, but not the other. This is non-trivial in most programming languages so having this as a common pattern is useful. Though I like C#'s properties versus seeing a bunch of getFoo and setFoo methods running around, if Java had provided the same or a similar capability there would probably be no controversy. It's the noisiness of the Java solution that seems to irk people more than the concept.
Apparently your field was used both for some internal shenanigans and as a public field through the get/set indirection. Then you changed the field but kept the old public behavior. I wonder (1) whether that dual internal/public role was wise to begin with, and (2) whether that change to the internal logic of the getter might have just papered over a more significant change.
It might be good if a change just breaks client code. The client expects just to get/set something: they might not expect that some update might add arbitrary logging or internal logic (like setting this value also sets another value, but we won’t tell you).
Most of the get/set stuff that I see are for value objects with either no or little business logic. Simple value objects _should_ just expose their implementation.
Thank you. I have programmed Java for years, but it's been frustrating. The clean language I learnt at university, that I love to bits, seems to be used approximately nowhere.
Instead, it's always some monstrosity held together by inheritance, XML/(awful) Gradle and liberal sprinklings of magical annotations that always (always!) bite you in the backside at runtime rather than at compile time, because why would you prefer to have good tooling that takes advantage of static typing to tell you what might go wrong.
I am appalled by what Spring mentality did to my language.
> The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.
That's the 'D' in SOLID -- dependency inversion. An object's dependencies should be defined in terms of abstract interfaces, not concrete classes. The verbosity is just boilerplate, and it would appear in any statically typed language if you're following SOLID principles.
> Getters and Setters? You probably don't need them. Classes can have public member variables.
Getters and setters are part of the JavaBeans spec for being able to load and configure arbitrary beans. If you're writing JavaBeans, as many EJB or Spring application developers are, you'll use getters and setters.
> That's the 'D' in SOLID -- dependency inversion. An object's dependencies should be defined in terms of abstract interfaces, not concrete classes
DI does not talk about the interface keyword in Java. It talks about interface as the public surface of an object or system.
Programming to a Java interface declaration does not guarantee the code is depending on higher-level abstractions; this depends on how the interface is defined. An interface which just replicates the public surface of a class (a "header interface") is on the exact same abstraction level as the class; it just introduces needless boilerplate.
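To make the distinction concrete, here is a hypothetical sketch (all names are invented): a "header interface" that merely mirrors one class's methods versus an interface phrased at the caller's level of abstraction, which a database, a file, or an in-memory map could all satisfy.

```java
import java.util.HashMap;
import java.util.Map;

// Header interface: same abstraction level as the one class behind it,
// leaking the SQL-shaped implementation into every caller.
interface UserDaoInterface {
    void insertUserRow(String sql);
}

// Genuine abstraction: phrased in the caller's terms, with no hint of
// how or where the data is kept.
interface UserStore {
    void save(String userId, String name);
}

// One of many possible implementations behind the abstract interface.
class InMemoryUserStore implements UserStore {
    private final Map<String, String> users = new HashMap<>();

    public void save(String userId, String name) {
        users.put(userId, name);
    }

    String lookup(String userId) {
        return users.get(userId);
    }
}
```

Both versions use the `interface` keyword; only the second inverts a dependency in the sense the parent comment describes.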
Java was for a very long time missing anonymous functions, so for anything involving callbacks you had to make a gazillion Listener or Factory classes instead, which further forced you to create an interface for each of them, leading to all these infamously verbose design patterns.
It was also for a very long time missing some fundamental Stream, List and String processing features. Just getting the contents of a file or converting a string to bytes, or even just initializing an Array often required 2-3 lines of verbose OutputStream(StreamBuilder().fromArray(Arrays.asList(1,2))).readBytes().add(3), wrapped in a try-catch for all the checked exceptions. For things that in C# would just be a oneliner of new List<int>({1,2,3}) or in Python [1,2,3].
This is mostly solved in current versions of the language, but the reputation and the "best practice" design-patterns remains.
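For what it's worth, the modern equivalents really are one-liners now. A small sketch on Java 9+ (the `ModernJava` class name is invented):

```java
import java.util.List;
import java.util.stream.Collectors;

class ModernJava {
    // Java 9+: an immutable list literal in one line, no Arrays.asList dance.
    static List<Integer> numbers() {
        return List.of(1, 2, 3);
    }

    // Java 8+ streams: transform and join without manual loops or builders.
    static String joined() {
        return numbers().stream()
                        .map(String::valueOf)
                        .collect(Collectors.joining(","));
    }
}
```

Similarly, `Files.readString(path)` (Java 11) replaces the old stream-and-try-catch ceremony for reading a file, which supports the point that the reputation outlived the verbosity.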
Yeah, I'm actually a pretty big fan of Java programming. It gets a lot of hate on HN for things that have been mostly solved or can be solved if you implement it a certain way.
Steven Sinofsky gave a talk and said something to the effect of, we've been building roads, bridges and edifices for thousands of years. So, best practices and solved problems abound---and even then we still get it wrong sometimes. Whereas, software engineering is maybe 70 years old (generously)? So, there is much to learn and a lot of "baseline" knowledge that has yet to be established. I think it's a good way to think about things.
Eh, I mean we've been building computer chips for approximately the same amount of time as computer software, and it's pretty clear chip engineering is more like civil engineering than software engineering. I would guess many of the best practices in bridge building in the modern day were developed in the last 70 years.
I think it's that engineers of physical things have many more hard constraints they have to wrestle with, and software engineers largely don't. Your code doesn't need to obey the rules of gravity and chemistry and materials science, it just needs to somehow accomplish the task.
And you see those best practices in the places of software engineering where there are hard constraints: cryptography. high performance code. realtime systems.
It's not just a senior engineer's opinion whether you should use Ruby or C when writing the firmware for your race car. And if you use MD5 to hash user passwords on a major site, you'll be hung from the rafters.
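The password-hashing point has a concrete, built-in fix. One widely available alternative to MD5 is PBKDF2, which ships with the JDK; a minimal sketch follows (the iteration count and key length here are illustrative, not a tuning recommendation, and bcrypt/scrypt/Argon2 via libraries are other common choices):

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;
import java.util.Arrays;

public class PasswordHashSketch {
    // Deliberately slow, salted hash; parameters chosen for illustration only.
    static byte[] hash(char[] password, byte[] salt) throws Exception {
        PBEKeySpec spec = new PBEKeySpec(password, salt, 100_000, 256);
        return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec).getEncoded();
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt); // unique salt per user

        // Same password + same salt must hash identically (that's how you verify):
        byte[] h1 = hash("hunter2".toCharArray(), salt);
        byte[] h2 = hash("hunter2".toCharArray(), salt);
        if (!Arrays.equals(h1, h2)) throw new AssertionError();
        System.out.println("ok");
    }
}
```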
> Eh, I mean we've been building computer chips for approximately the same amount of time as computer software, and it's pretty clear chip engineering is more like civil engineering than software engineering.
Is it actually anything like civil engineering? To my knowledge chip engineering revolves around yield. There's no such analogous concept in designing buildings that can only be reasonably constructed correctly 70% of the time and attempting to reuse the bad buildings for other projects.
Civil (and other engineering) got better because there was motivation to improve that came from multiple directions: literal lives at stake, the pride of good craftsmanship, iterative or even grand steps forward in knowledge, etc.
Software engineering as a discipline is dominated by appeals to authority ("Clean Code", "Google does it this way", "Dijkstra said so", etc.) without any (or at least not much) attempt to ask why or whether. I think we'll automate away much of software engineering (likely with very poor, inefficient, and buggy implementations) before it matures enough as an industry to be actual engineering. Engineering (and the science behind it, for that matter) advances from curiosity and a healthy skepticism, not the rampant ego-driven self-promotion that runs through SE.
This is a symptom of the lack of knowledge, not a cause. Imagine you want to build a building but no one really knows how; copying successfully completed buildings and imitating established construction engineers and companies is then a pretty good idea. That's what's going on in SE.
> If a builder constructs a house for a man but does not make it conform to specifications so that a wall then buckles, that builder shall make that wall sound using his own silver.
- Code of Hammurabi, 1755–1750 BC
Which is why I don't buy the "software has only been around for 70 years, so give it time" argument. Software has nothing to be grounded in the way other engineering disciplines are grounded in physics. It's most likely always going to be endless cargo culting.
> I suspect it is a mix of cognitive science, linguistics, and anthropology.
Computer Science (more abstract) and Computing Science (less abstract) are branches of Science that have given a foundation to Software Engineering.
The problem is that most of the programming and software development that happens nowadays (and what people pay for) doesn't use it.
I compare it to Chemistry and Alchemy. We are still in the "alchemy" stage of software development. Sure, people who see themselves as "experts" are combining existing stuff to create new things. But only a small set of experts use scientific expertise to implement systems.
> I suspect it is a mix of cognitive science, linguistics, and anthropology.
I agree. Creating a solution that is clear to navigate for future developers is valuable, as is one that is flexible for future modifications. These things depend much more on how humans understand the code than on how machines understand it.
That said, other engineering disciplines aren't entirely bereft of that kind of human aspect either. Take architecture for example. It's critical for a building to handle the forces put on it. But it's also important that humans can use it. Having a good building layout is much more a matter of understanding humans than of understanding physics.
And then there's the whole style thing that architecture has. A building is not always only functional. It can very much be a work of art. And that style is not just decorative. A style can conjure up certain emotions to put people in a certain mindset when dealing with the building. (E.g. putting a modern art museum in a neo-classical building would be dissonant.)
Edit: this is a discussion forum. The downvote button is not an agree/disagree button, it's to penalize irrelevant comments. If you'd like to disagree, please reply and state your thinking instead.
Programming changes practice quickly and often because it's cheap to do so compared to physical engineering which is slowed down by execution time, high materials cost, and sunk costs.
The interesting question is this: would other engineering pursuits (say civil) have just as much chaos and lack of authoritative practices, if changing practices would be equally fast and cheap for them?
Best practices in civil engineering are connected to outcomes. You know that something is a good practice if not doing it causes things to collapse or catch on fire.
"Best practices" in software engineering are usually about internal development processes and don't have any verifiable connection to outcomes.
In other words, we have two entirely different things labeled with the same name. People who commonly use the phrase "best practices" for software are literally trying to confuse you and generally are not worth listening to.
That said, some things in software are worth analyzing to have a better process, but those things need to be examined within a specific context. If someone claims that you need to, say, create an interface for every class, they should be able to explain why and how it is relevant to your work. If people make claims they cannot explain by connecting them to meaningful outcomes, those people are, again, not worth listening to. They might be mindlessly parroting something they have heard without having any clue as to the meaning of the practice. Unfortunately, our field is full of "professionals" that do exactly that.
Software is a pretty messed up domain that is frequently a self-licking ice-cream cone. (You write code to solve problems created by other code.) Because of that, it's often socially mediated, just like non-engineering fields. To establish anything for real in this self-referential environment you need to be able to have conversations about costs, tradeoffs and outcomes - within a specific context.
That's a really great perspective. There are a bunch of times where I have a preference for or against something, but I can't point to a specific example of when that thing was good/bad, or any objective data about it. It's just that my experience suggests to me (via murky pattern matching in my brain) that particular thing will be good or bad.
It's certainly weaker evidence than data or concrete examples, but I think it's still valuable and worthy of consideration.
Are you appealing to the authority of speed? Compilation time? Readability? Functionality?
I dunno, and it often (especially around readability) comes down to the developer's or senior engineer's preference.
It’s poetic language that has nothing to do with specific problems.
What people usually mean when it comes to engineering is “be safe, reliable, and correct.”
Security best practices to be safe.
Developer best practices for reliability.
Etc etc
“Best practices” is hand wave-y fluff for “do a good job” and doesn’t need a technical definition.
Make sure you’re secure, reliable, and correct given the engineering context, and odds are you’ll end up with a system that also has a lot of specific best practices in place.
I'm of the mindset where an organization needs to agree to specific principles that everyone adheres to. Not dogmatically, but pragmatically.
Having that shared "best practice" makes it easier to support other people's code and systems, lets people get up to speed faster, and as a bonus lets you write a Medium post where you can call yourself a thought leader.
Agreed. But GP was talking about the context dependency of those best practices. E.g. what might be a good best practice for a large organization inside a FAANG might be very bad for others. Some organizations have very immediate feedback cycles: a significant problem in Amazon's online shop will probably show up immediately in the sales numbers. A significant problem in the control software of a plane might kill a few hundred people a few years later, before anyone realizes there is a problem. Therefore you cannot A/B test which autopilot works better in a certain situation. (Although Tesla might disagree about that /s)
The point is every so called "best practice" should state under which precondition it is supposed to be applied. And most blog articles and books ignore that completely.
Harsh bashing on global variables is such a dumb thing.
Yes, they can be dangerous. Yes, many had problems due to using them. Yes, we should tell beginners to avoid globals.
But there is no reason to ban them altogether. Experienced programmers should use them whenever it makes sense (instead of passing a local value down through almost every function).
When experienced programmers utilize global variables whenever it makes sense, it tends to bite future generations. Windows's GetLastError is a mess (some functions set it on error but don't clear it on success), I'm not sure about POSIX's errno, and when threads were introduced, these variables had to be changed to thread-local state.
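The thread-local fix mentioned above can be sketched in Java; this is a toy stand-in for the errno/GetLastError pattern, not the actual Win32 or POSIX mechanics. Each thread gets its own copy of the "last error", so concurrent callers no longer clobber each other's state:

```java
public class LastErrorSketch {
    // One value per thread instead of one global value for the process.
    private static final ThreadLocal<Integer> lastError =
            ThreadLocal.withInitial(() -> 0);

    static void setLastError(int code) { lastError.set(code); }
    static int getLastError() { return lastError.get(); }

    public static void main(String[] args) throws InterruptedException {
        setLastError(5);
        Thread other = new Thread(() -> {
            // This thread sees its own copy, initialized to 0.
            if (getLastError() != 0) throw new AssertionError();
            setLastError(7);
        });
        other.start();
        other.join();
        // The main thread's value is untouched by the other thread's set():
        if (getLastError() != 5) throw new AssertionError();
        System.out.println("ok");
    }
}
```

The remaining wart the comment points out (functions that set the error on failure but don't clear it on success) is an API-contract problem that thread-locality alone doesn't fix.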
Welcome to Dependency Injection in .Net Core!
"Those who cannot remember the past are condemned to repeat it." -- George Santayana
My standard rebuttal to anti-global dogmatism is "there's only one instance, and never needs to be more. If/when we do, we can consider doing something else."
I think a professional should be aware of why a best practice exists and of how to deal with a situation where, for some reason, it cannot be applied, as you showed with the embedded example.
Don't forget, however, that many devs are against best practices out of laziness, or because they don't understand the reasons why they are best practices.
To identify the best practice for anything, you start by doing a review of all the available practices in the field for a given problem you are working on. Then once you've reviewed the literature you can work out the pros, cons, caveats of each of these tools, and how these considerations affect your particular use case and your expected results. Then after doing that, the best practice out of available options will be readily apparent, or at the very least strongly justified, not by an appeal to authority or popularity or familiarity, but by looking at what the underlying technology actually does and how its relevant or not to your particular task at hand. In the end you will find the very best hammer available out of all the hammers people have made in this field for your particular unique nail.
Just like your design choices have trade-offs, it is important to realize that there's a trade-off between finishing sooner and making a better solution. Diminishing returns are usually very much in play with analysis.
(I would also challenge the notion that every situation has a different "best practice". That's just creating a solution. "Best practices" are usually general advice that is applicable in most situations.)
If presented on its own, maybe. But then any conclusion presented on its own, without supporting context and analysis, is the same.
> But really, "best" and "right" are highly situational
A description of a best practice that doesn't provide a sufficiently precise description of the situation to which it applies as a best practice is generally inappropriate, unless it is the conclusion of an analysis of applicable best practices to a certain situation, in which case the scope is specified in framing the analysis.
It is true that lots of things described as best practices for particular situations, with supporting rationale, end up getting detached from their logic and context and becoming cargo-cult best practices.
Everyone should write Embedded at least once. It's a completely different world.
It really inverts some priorities (I mean development priorities, not the priority inversion on Mars Pathfinder).
The overarching problem is that yes, there is software engineering going on in the world, but most organizations are not willing to do engineering. I don't blame the technical staff - they often have good intentions - but rather the typical business is not willing to pay the cost in time or money to do long-lasting engineering practices. This is one of the things not enough of us think about in our career choices - am I going to a place that practices fire drills or engineering?
This results in programmers no longer being able to iterate fast and having to rely on some third-party whose tradeoffs they don't understand, resulting in slow, bloated software and dissatisfied programmers.
Because if so, yeah. You are responsible for the software you write and the dependencies you use. Software engineering is one of the least accountable engineering disciplines anyway. I am also a certified electrical engineer, and of course I am responsible if my wrong decisions kill someone. If I used a cheap Chinese knockoff circuit breaker (because we can iterate faster if the stuff isn't expensive and certified) and someone got killed, I would go to jail. If someone gets killed and I can prove that I followed the currently agreed-on state of technology, my ass is covered. Of course you can get a lightbulb to light up without following any rule (and maybe that would be more efficient), but in EE the existing rules came as a consequence of deaths, and preventing those is worth it.
In software engineering the worst that can usually happen (unless you work in IoT or industrial applications) is that you lose your users' data. Many software devs don't care whether they lose their users' data, and there are no tangible consequences. "Oh, it was a software error" is still a good excuse, as if there was nothing that could have prevented that software error. I write software myself, but I am all for stricter consequences in our profession, because it would straighten out some heads who think this is all just fun and games for their personal joy.
I’ve seen the other way around too - senior developers rejecting or pushing changes that they find convenient. IME it’s not to do with experience, rather stubbornness.
We call ourselves software developers in Canada.
According to Canadian engineering[1]: The "practice of engineering" means any act of planning, designing, composing, evaluating, advising, reporting, directing or supervising, or managing any of the foregoing, that requires the application of engineering principles, and that concerns the safeguarding of life, health, property, economic interests, the public welfare, or the environment.
To be considered a work of engineering, then, a piece of software (or a software-intensive system) must meet two conditions:
1. The development of the software has required “the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.”
2. There is a reasonable expectation that failure or inappropriate functioning of the system would result in harm to life, health, property, economic interests, the public welfare, or the environment
[1] - https://engineerscanada.ca/news-and-events/news/when-softwar...
Error: cycle detected. ^__^
Software is by its very nature systematic and quantifiable. Is the quibble with whether programmers are disciplined?
All software companies would meet the "failure or inappropriate functioning of the system would result in harm to … economic interests" criterion.
Define this word without using the word.
Lift: British English. A machine that lifts peo... damnit!
Lots of people in other engineering disciplines build stuff with less rigor than is used to build software. Maybe load test it in Fusion 360 before sending the file to be milled. And lots of software companies test their stuff quite rigorously.
People who write software have imposter syndrome. Other disciplines are not building things better than we do. For every Google there's a Ford, and for every scrappy startup there's a scrappy machine shop.
If you work on the software in a self driving system in a car, that makes you an engineer. If you're working on software for a space satellite with no occupants... that's not an engineer?
Any of them can be Software Engineers or not, because we call Software Engineers those who have finished the university degree.
https://www.hillelwayne.com/post/are-we-really-engineers/
https://www.hillelwayne.com/post/we-are-not-special/
https://www.oiq.qc.ca/en/media/pressReleases/Pages/default.a...
In Canada, I studied in a mechanical engineering technology program that led to an engineering degree if you stayed on for 4 years. It was hammered into us that we weren't engineers until we graduated and went through the professional licensing process.
In Canada there is a way to become a P.Eng. in software engineering (and until 2018 in the US there was the P.E. in software engineering), so there is a way for someone who wants to be called a software engineer to legally obtain it. So why is there so much resistance?
Current home wiring codes require you to run wires in the wall (I think) between 10 and 20 cm from the edge, and only a few cm deep inside the wall. This means you only have to check that zone, and you can use detectors to find the wires.
My home is from the 1950s. Some wires run diagonally from top left to bottom right on the other side of the wall. We had great fun finding out where they were hiding.
Even if the new wires waste a lot more wiring and PVC tube, I vastly prefer them when redecorating or drilling holes.
I think "best practices" strikes a good balance between things that are personal preferences and things that are laws.
It's for people who aren't confident enough to admit to simply having a preference, or knowledgeable enough to be able to explain it
As a freelancer jumping projects, I can't help but see parallels, in that you've really got to study code bases wrt the time of authorship to understand their particular idiosyncrasies.
In the 2010s, people believed in "REST" without considering the context in which those concepts were introduced (e.g. thin browser UIs, or generalizations thereof, as a baseline). Customers, even highly capable devs, tout their REST best practices yet see HATEOAS as optional and pretentious, missing the entire point of loose coupling and discovery, and instead engage in zealotry about squeezing parameters into URLs. Or they pretend to, to stop pointless discussions with mediocre, bureaucratic peers.
"Hey, why do you have a try-catch there? It seems like it's just gonna break our stack trace, and we don't even gracefully recover from it" - it's best practice
"Hey why are you using model/view/controller folders?" - it's best practice
"Why are you building microservices?" - best practice.
I got out of a code base with stylecop's settings set to max and treated as errors. Trailing white spaces, mandatory comments on everything, etc. The only exceptions are file size and method size. Files are often in the thousands of lines, and methods can reach that as well. 10 levels of nested if/else. So we're in an unmanageable code base, but at least we don't have trailing white spaces.
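The stack-trace complaint in the first quoted exchange is concrete: rethrowing a fresh exception without passing the original as the cause drops the trace, while exception chaining preserves it. A minimal sketch:

```java
public class RethrowSketch {
    static void inner() { throw new IllegalStateException("root cause"); }

    // Anti-pattern: the original exception (and its stack trace) is lost.
    static void bad() {
        try { inner(); }
        catch (RuntimeException e) {
            throw new RuntimeException("wrapped"); // cause not attached
        }
    }

    // Chaining: the original exception rides along as getCause().
    static void good() {
        try { inner(); }
        catch (RuntimeException e) {
            throw new RuntimeException("wrapped", e); // cause preserved
        }
    }

    public static void main(String[] args) {
        try { bad(); } catch (RuntimeException e) {
            if (e.getCause() != null) throw new AssertionError();
        }
        try { good(); } catch (RuntimeException e) {
            if (!(e.getCause() instanceof IllegalStateException))
                throw new AssertionError();
        }
        System.out.println("ok");
    }
}
```

A try-catch that neither recovers nor chains is usually worse than no try-catch at all.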
Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do it". If it's a tradition and most people agree to it, fine, it might help with readability. If it's a pattern, fine, tell me why it's applicable and helpful in our context. If you can't actually explain why you're doing something, maybe rethink it and educate yourself on the topic.
SOLID is especially funny because the majority of developers can't understand or explain it (except maybe the S), let alone have read the underlying papers, e.g. [0]. Yet everybody acts like they do SOLID*.
Or Clean Code's small-function craziness, or the misinterpretation that turns into an "avoid comments" approach.
* - unless you ask for details
[0] - https://www.cs.cmu.edu/~wing/publications/LiskovWing94.pdf
It's useful to tell someone: "you should have that thing in another class because it's clearly another responsibility", and have a debate on the responsibility itself ; or even on the merits of having a single responsibility per class.
What is not useful is when someone starts making mono-function classes that are 10 lines long on the altar of "smaller functions are better" and sticks to that because "it's a best practice", with no better argument.
Basically my point is: if you understand what you are doing, great, use vocab all you want. But don't use it as a justification.
There's always an evolution of these methodologies and I often feel like in practice they're just a way to mix things up and make code health feel more interesting to those involved. Often people know there are problems with the design of their system, but they can't get buy in to fix them. These methodologies provide some authoritative justification for refactoring work.
> Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do".
The most common valid / defensible case I've seen it used is when a person fully understands the "why" in depth, and has formed that understanding on the basis of experience (not blind following or "tradition"), but doesn't have time to deliver a long, in-depth explanation every time they do a code review.
That's not to say it isn't still "developer preferences": there are multiple "best-practice" approaches out there, and some even contradict each other. But I strongly believe that even many of the very subjective, hotly debated approaches are valid and useful in certain contexts.
e.g. I personally lean toward a pseudo-FP style of programming and am growing less and less enamoured with OO patterns for various reasons. But OO is still a useful abstraction, and if you're doing OO, I've found something like SOLID to contain a great deal of wisdom. I never sat down and decided to do things "the SOLID way", and learned that. Rather I wrote a lot of bad OO, ran into problems, gained insight from experience, and later stumbled across SOLID and found the pitfalls it mitigates familiar.
A sibling commenter has the following quote, which I agree with:
> SOLID is especially funny because the majority of developers can't understand or explain it, let alone have read the underlying papers
I think this is exactly why the term "best-practice" is so popular. I would not have understood SOLID in any depth if I had come to it fresh: I needed to learn SOLID informally by accident, internalise the challenges it's designed to overcome, and then recognise that intuitively from my experience when reading about SOLID later.
That's not a level of understanding you can typically impart easily in a conversation when suggesting someone do something differently. It's much easier to just say "best-practice".
This paragraph highlights something I've been saying for ages.
Most criticism of Java needs to be directed towards Java programmers and not the language itself.
The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.
Getters and Setters? You probably don't need them. Classes can have public member variables.
Java programmers seem to have the hardest time understanding YAGNI.
The getter/setter vs. public fields debate is about exposing the internal implementation of the object (and thus making it unchangeable without breaking other code), and about being able to reason about where the internals are used (if you do need to change them).
For example, if I've got a java.util.Date exposed as a public field and someone uses it, I can't change that later to a java.time.LocalDateTime unless I change all of the things using it. If this is a library that others are using it may mean a cascade of unknown changes.
If, on the other hand, the Date was exposed as public Date getDate() { return (Date) date.clone(); }, then I don't have to worry about it. When the internals change to LocalDateTime, it becomes Date getDate() { return Date.from(local.atZone(ZoneId.systemDefault()).toInstant()); } and everything still works.
I don't have an issue with the infrequently used "default package private" level of field visibility where only classes in the same package can see that field as that limits the range of the changes to the classes within single package (in a single project).
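A runnable sketch of the migration described above (the `Event` class and its field are made-up examples): the accessor keeps its old `java.util.Date` signature as an adapter while the private field moves to `LocalDateTime`, so client code compiles and runs unchanged.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

class Event {
    // Internal representation changed from java.util.Date to LocalDateTime.
    private LocalDateTime when = LocalDateTime.of(2024, 1, 1, 12, 0);

    // Old accessor kept so existing callers don't break.
    public Date getDate() {
        return Date.from(when.atZone(ZoneId.systemDefault()).toInstant());
    }
}

public class EncapsulationSketch {
    public static void main(String[] args) {
        Date d = new Event().getDate(); // clients are none the wiser
        if (d == null) throw new AssertionError();
        System.out.println("ok");
    }
}
```

Had `when` been a public field, every caller would have needed editing (or worse, a library-wide breaking change) instead of this one-line adapter.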
It might be good if a change just breaks client code. The client expects just to get/set something: they might not expect that some update might add arbitrary logging or internal logic (like setting this value also setting another value, but we won't tell you).
Most of the get/set stuff that I see are for value objects with either no or little business logic. Simple value objects _should_ just expose their implementation.
Instead, it's always some monstrosity held together by inheritance, XML/(awful) Gradle and liberal sprinklings of magical annotations that always (always!) bite you in the backside at runtime rather than at compile time, because why would you prefer to have good tooling that takes advantage of static typing to tell you what might go wrong.
I am appalled by what Spring mentality did to my language.
We do need them if the objects interact with a lot of the libraries that we use. Libraries that we didn’t make. They expect get/set.
I agree that getters/setters are very overrated. I tend to make public-final fields when I can. Most of the time I can’t though.
Which is to say, early ecosystems in Java certainly needed this criticism. They did adjust, though.
I adore the craft-like parts of software dev, wouldn't trade it for anything.
(Actually, I dare say that software itself has no physical constraints at all: software artifacts and software executions do.)
Mechanical engineering has physics as a foundation.
Chemical engineering has chemistry as a foundation.
What is the scientific foundation of software engineering?
I suspect it is a mix of cognitive science, linguistics, and anthropology.
Computer Science (more abstract) and Computing Science (less abstract) are branches of Science that have given a foundation to Software Engineering.
The problem is that most of the programming and software development that happens nowadays (and what people pay for) doesn't use it.
I compare it to Chemistry and Alchemy. We are still in the "alchemy" stage of software development. Sure, people who see themselves as "experts" are combining existing stuff to create new things. But only a small set of experts actually use scientific expertise to implement systems.
Section 3 gets into the "what is computer science". 3.10 through 3.13 get into the engineering aspect. Section 6.5 puts it into historical context.
I agree. Creating a solution that is clear to navigate for future developers is valuable, as is one that is flexible for future modifications. These things depend much more on how humans understand the code than on how machines understand it.
That said, other engineering disciplines aren't entirely bereft of that kind of human aspect either. Take architecture for example. It's critical for a building to handle the forces put on it. But it's also important that humans can use it. Having a good building layout is much more a matter of understanding humans than of understanding physics.
And then there's the whole style thing that architecture has. A building is not always only functional. It can very much be a work of art. And that style is not just decorative. A style can conjure up certain emotions to put people in a certain mindset when dealing with the building. (E.g. putting a modern art museum in a neo-classical building would be dissonant.)
Would love to hear your thoughts on this, esp. the linguistics and anthropology piece.
Perhaps you don't want to make broad generalizations?
Edit: this is a discussion forum. The downvote button is not an agree/disagree button, it's to penalize irrelevant comments. If you'd like to disagree, please reply and state your thinking instead.
The interesting question is this: would other engineering pursuits (say civil) have just as much chaos and lack of authoritative practices, if changing practices would be equally fast and cheap for them?
"Best practices" in software engineering are usually about internal development processes and don't have any verifiable connection to outcomes.
In other words, we have two entirely different things labeled with the same name. People who commonly use the phrase "best practices" for software are, in effect, trying to confuse you and generally are not worth listening to.
That said, some things in software are worth analyzing to have a better process, but those things need to be examined within a specific context. If someone claims that you need to, say, create an interface for every class, they should be able to explain why and how it is relevant to your work. If people make claims they cannot explain by connecting them to meaningful outcomes, those people are, again, not worth listening to. They might be mindlessly parroting something they have heard without having any clue as to the meaning of the practice. Unfortunately, our field is full of "professionals" that do exactly that.
Software is a pretty messed up domain that is frequently a self-licking ice-cream cone. (You write code to solve problems created by other code.) Because of that, it's often socially mediated, just like non-engineering fields. To establish anything for real in this self-referential environment you need to be able to have conversations about costs, tradeoffs and outcomes - within a specific context.