cwp · 4 years ago
Eh, kinda. Calling something a "best practice" is basically an appeal to authority. It means, "this is the right way to do things, for reasons I don't have time to explain." There are times when that's appropriate.

But really, "best" and "right" are highly situational. Any rule of thumb, even the most basic and uncontroversial, has a situation where it doesn't apply. I was part of a discussion on a mailing list years ago, where somebody got flamed harshly when he mentioned using a global variable. He then explained that he was working on an embedded control system for cars and all the variables in that system were global. The team was well aware of the pitfalls of global state and used a combination of documentation, static analysis, custom tooling and elaborate testing to mitigate them. It was a considered design choice that made it possible to develop high-performance, hard-realtime software that could run on very limited hardware.

theptip · 4 years ago
One thought to add here - when you appeal to an authority, which one is it?

In the OP example, it seems the senior dev is saying “on my authority”. And sometimes that is enough, especially if the senior dev can give examples of when not following this practice bit them.

But sometimes there is a higher authority, such as “it’s what is recommended in Google’s SRE book”, which is probably good advice if you are building a large SRE team. (Though as you say, not in all situations.)

I think in the worst case, “best practice” can indeed be used to shut down discussions of a leader’s preferences. But you can smell that out by asking for concrete examples, and asking how widely the practice is recommended.

A good “best practice” should be justifiable and explainable.

All that said, sometimes as the senior engineer you need to go with gut feel; “this design smells like it will give us trouble within a year” is the sort of thing I sometimes say. But I think it’s important to be honest that it’s a hunch, not a certainty, in these cases, and discuss/weigh accordingly.

kelnos · 4 years ago
> All that said, sometimes as the senior engineer you need to go with gut feel; “this design smells like it will give us trouble within a year” is the sort of thing I sometimes say.

That's a really great perspective. There are a bunch of times where I have a preference for or against something, but I can't point to a specific example of when that thing was good/bad, or any objective data about it. It's just that my experience suggests to me (via murky pattern matching in my brain) that particular thing will be good or bad.

It's certainly weaker evidence than data or concrete examples, but I think it's still valuable and worthy of consideration.

floverfelt · 4 years ago
Yep exactly! I think if you broaden the concept of "authority" beyond literal authoritative sources it gets even more murky.

Are you appealing to the authority of speed? Compilation time? Readability? Functionality?

I dunno, and it often (especially around readability) comes down to the developer's/senior engineer's preference.

chmsky00 · 4 years ago
“Best practices” smells like such a marketing term it should be tossed in the bin.

It’s poetic language that has nothing to do with specific problems.

What people usually mean when it comes to engineering is “be safe, reliable, and correct.”

Security best practices to be safe.

Developer best practices for reliability.

Etc etc

“Best practices” is hand wave-y fluff for “do a good job” and doesn’t need a technical definition.

Make sure you’re secure, reliable, and correct given the engineering context, and odds are you’ll end up with a system that also has a lot of specific best practices in place.

SkipperCat · 4 years ago
Sometimes "best practice" is not because we need to worship a standard, but because we need to have a standard. If everyone does everything in a different manner, you wind up with a Tower of Babel and support becomes impossible.

I'm of the mindset where an organization needs to agree to specific principles that everyone adheres to. Not dogmatically but pragmatically.

Having that shared "best practice" makes it easier to support other people's code/systems, allows people to get up to speed faster and, as a bonus, allows you to make a blog post on Medium where you can call yourself a thought leader.

funcDropShadow · 4 years ago
> I'm of the mindset where an organization needs to agree to specific principles that everyone adheres to.

Agreed. But GP was talking about the context dependency of those best practices. E.g. what might be a good best practice for a large organization inside a FAANG might be very bad for others. Some organizations have very immediate feedback cycles. A significant problem in Amazon's online shop will probably show up immediately in sales numbers. A significant problem in the control software of a plane might kill a few hundred people in a few years before people realize there is a problem. Therefore you cannot A/B test which autopilot works better in a certain situation. (Although Tesla might disagree about that /s).

The point is every so-called "best practice" should state under which preconditions it is supposed to be applied. And most blog articles and books ignore that completely.

Deleted Comment

mbrodersen · 4 years ago
My experience is that enforcing standards for everybody is really bad. Different projects are truly different and require different trade-offs. Enforcing one way to do things is anti-agile.
Jorengarenar · 4 years ago
> somebody got flamed harshly when he mentioned using a global variable.

Harsh bashing on global variables is such a dumb thing.

Yes, they can be dangerous. Yes, many had problems due to using them. Yes, we should tell beginners to avoid globals.

But there is no reason to ban them altogether. Experienced programmers should utilize them whenever it makes sense (instead of passing down a value of a local one to almost every function [sic]).

nyanpasu64 · 4 years ago
Global mutable variables are generally a bad idea because they have nonlocal side effects which are difficult to mitigate, like aliased mutable pointers but worse. Extra non-aliasing arguments and multiple return values are free of these issues, and a better tradeoff in almost all cases (unless you're sure you'll never run 2 instances of a system in the same address space, and you have specialized constraints possibly including embedded/safety-critical development and emulators).

When experienced programmers utilize global variables whenever it makes sense, it tends to bite future generations. Windows's GetLastError is a mess (some functions set it on error but don't clear it on success), I'm not sure about POSIX's errno, and when threads were introduced, these variables had to be changed to thread-local state.
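The stale-error pattern described above can be sketched in a few lines. This is a hypothetical illustration (not actual Windows or POSIX code): `GlobalStyle` mimics a GetLastError-style slot that is set on failure but never cleared on success, while `ReturnStyle` returns the error alongside the value, so no shared state exists to go stale.

```java
// Hypothetical sketch: a global error slot in the spirit of errno/GetLastError.
class GlobalStyle {
    static int lastError = 0; // shared mutable state

    static int parsePositive(String s) {
        try {
            int v = Integer.parseInt(s);
            if (v < 0) { lastError = 1; return 0; }
            return v; // bug-prone: lastError is NOT cleared on success
        } catch (NumberFormatException e) {
            lastError = 2;
            return 0;
        }
    }
}

// Returning the error with the value removes the shared slot entirely,
// and needs no thread-local rework if two instances run in one process.
class ReturnStyle {
    static final class Result {
        final int value;
        final int error;
        Result(int value, int error) { this.value = value; this.error = error; }
    }

    static Result parsePositive(String s) {
        try {
            int v = Integer.parseInt(s);
            return v < 0 ? new Result(0, 1) : new Result(v, 0);
        } catch (NumberFormatException e) {
            return new Result(0, 2);
        }
    }
}
```

After a failed call followed by a successful one, `GlobalStyle.lastError` still reports the old failure, which is exactly the trap a caller falls into when checking the error slot after a success.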

jerhewet · 4 years ago
> Yes, we should tell beginners to avoid globals.

Welcome to Dependency Injection in .NET Core!

"Those who cannot remember the past are condemned to repeat it." -- George Santayana

userbinator · 4 years ago
On the contrary, I think beginners should use them because it reduces cognitive load in tiny applications.

My standard rebuttal to anti-global dogmatism is "there's only one instance, and there never needs to be more. If/when there does, we can consider doing something else."

Chris2048 · 4 years ago
But here's the thing: aren't most alternatives to globals some other kind of global state anyway, just possibly better managed?
atoav · 4 years ago
It is best practice to use RCCBs, because it turned out faulty wiring can kill people. But in server rooms, where you might not want to switch off the whole rack without warning when one device is faulty, you can use a device that monitors the residual current (an RCM), which issues a warning first and only switches off when the residual current rises above the acceptable level. Different scenario, different best practice. (This is also the reason medical equipment is expensive.)

I think a professional should be aware of why a best practice exists and how to deal with a situation where for some reason it cannot be applied, as you showed with the embedded example.

Don't forget, however, that many devs are against best practices out of laziness or because they don't understand the reasons why they are best practices.

asdff · 4 years ago
Identifying a best practice is actually not as hard as people realize IMO (although it may take some time), and only becomes hard when you let emotion and sources of emotion come into what should be a rational decision-making process (such as preference for certain tooling for reasons like familiarity or popularity in the field today rather than outright advantages vs other tooling).

To identify the best practice for anything, you start by doing a review of all the available practices in the field for a given problem you are working on. Then once you've reviewed the literature you can work out the pros, cons, and caveats of each of these tools, and how these considerations affect your particular use case and your expected results. After doing that, the best practice out of the available options will be readily apparent, or at the very least strongly justified, not by an appeal to authority or popularity or familiarity, but by looking at what the underlying technology actually does and how it's relevant (or not) to your particular task at hand. In the end you will find the very best hammer available out of all the hammers people have made in this field for your particular unique nail.

lucumo · 4 years ago
That sounds like a beautiful example of letting perfect be the enemy of good.

Just like your design choices have trade-offs, it is important to realize that there's a trade-off between finishing sooner and making a better solution. Diminishing returns are usually very much in play with analysis.

(I would also challenge the notion that every situation has a different "best practice". That's just creating a solution. "Best practices" are usually general advice that is applicable in most situations.)

mbrodersen · 4 years ago
Two different “experts” will do this and arrive at different conclusions. Now what?
dragonwriter · 4 years ago
> Calling something a "best practice" is basically an appeal to authority.

If presented on its own, but then, any conclusion presented on its own without supporting context and analysis is the same.

> But really, "best" and "right" are highly situational

A description of a best practice that doesn't provide a sufficiently precise description of the situation to which it applies as a best practice is generally inappropriate, unless it is the conclusion of an analysis of applicable best practices to a certain situation, in which case the scope is specified in framing the analysis.

It is true that lots of things described as best practices for particular situations with supporting rationale end up getting detached from their logic and context and becoming cargo-cult best practices.

w0mbat · 4 years ago
Even worse are people that talk of "code smells", applying their personal opinion of style in a judgmental and often unjustified way.
mbrodersen · 4 years ago
Smart experienced software developers disagree on “best practices” all the time. What is “obviously” the best practice to you is “obviously” not the best practice for somebody else. Now what?
swixmix · 4 years ago
Reminds me of US Generally Accepted Accounting Principles (GAAP). I don't need to be perfect, just consistently good enough.
RNCTX · 4 years ago
Or Tesla/Uber/etc, in which rules don't apply to you so do whatev
908B64B197 · 4 years ago
> He then explained that he was working on an embedded control system for cars and all the variables in that system were global.

Everyone should write Embedded at least once. It's a completely different world.

m463 · 4 years ago
and hard real-time.

It really inverts some priorities (I mean development priorities, not the priority inversion on mars pathfinder)

poulsbohemian · 4 years ago
The biggest issue I saw with "best practices" in my career is the failure to take into account who is claiming it to be a best practice, and in what context. I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand. Or alternatively, believe some vendor carte blanche when they tell you their product somehow follows a best practice.

The overarching problem is that yes, there is software engineering going on in the world, but most organizations are not willing to do engineering. I don't blame the technical staff - they often have good intentions - but rather the typical business is not willing to pay the cost in time or money to do long-lasting engineering practices. This is one of the things not enough of us think about in our career choices - am I going to a place that practices fire drills or engineering?

Lich · 4 years ago
Spot on about reading a random blog or article. I think many engineers at work are under pressure to deliver, and are looking for quick solutions. They do a search and see a Medium post related to their problem written by someone who says they are an “<platform> developer at <company>”, and for some reason most readers see these authors as an authority in their domain (because why else would they be writing about it? /s), and just accept the blog post’s practices or conclusions. I once read an article a long time ago about Android’s AsyncTask, and some blog post claimed it should only be used for operations shorter than one second. I looked at the official documentation and while it did mention that it should be used for short operations, nowhere did it mention one second specifically (unless I missed it). I saw the same advice mentioned by many other devs who referenced that same article.
oreally · 4 years ago
To add on to this, the currently most preached-about "best practices" come largely from a de-risking, never-ever-fail point of view, i.e. 'safety'. Unfortunately with such standards also comes a concept known as 'accountability', i.e. 'ass-covering' practices that provide little practical value.

This results in programmers no longer being able to iterate fast and having to rely on some third-party whose tradeoffs they don't understand, resulting in slow, bloated software and dissatisfied programmers.

atoav · 4 years ago
I am not sure if I read you right here (correct me if I am wrong), but are you saying that safety and stability concerns lead to bloated software because devs cannot blindly trust third-party dependencies?

Because if so, yeah. You are responsible for the software you write and the dependencies you use. Software engineering is one of the least responsible engineering disciplines anyway. I am, e.g., also a certified electrical engineer, and of course I am responsible if my wrong decisions kill someone. If I used a cheap Chinese knockoff circuit breaker (because we can iterate faster if the stuff isn't expensive and certified) and someone got killed, I would go to jail. If someone gets killed and I can prove that I followed the currently agreed-on state of technology, my ass is covered. Of course you can get a lightbulb to light up without following any rule (and maybe this would be more efficient), but in EE the existing rules came as a consequence of deaths, and the prevention of those is worth it.

In software engineering the worst that can usually happen (unless you work in IoT or industrial applications) is that you lose your user data. Many software devs don't care about whether they lose their users' data, and there are no tangible consequences. "Oh, it was a software error" is still a good excuse, as if there was nothing which could have prevented that software error. I program software myself, but I am all for stricter consequences in our profession, because it would straighten out some heads who think this is all just fun and games for their personal joy.

Zababa · 4 years ago
Do you have any concrete examples of that? I think I kind of see what you mean, and I have a feeling that this is mostly right, but I'm not sure.
cptaj · 4 years ago
Managers excited about a new tech idea are probably the most destructive thing in the industry.
noidesto · 4 years ago
As are managers that conform to archaic tech when better options exist.
andi999 · 4 years ago
Nah, just rewrite the code base.
yunohn · 4 years ago
> I saw too many junior developers read a rando blog article, then get a non-technical / semi-technical manager excited about something that made their life easier, even though it was by no means a good practice for the context at hand.

I’ve seen the other way around too - senior developers rejecting or pushing changes that they find convenient. IME it’s not to do with experience, rather stubbornness.

dmalik · 4 years ago
> How can Software Engineers call themselves engineers when there’s no rules, governing bodies, or anything to stipulate what true Software Engineering is?

We call ourselves software developers in Canada.

According to Canadian engineering[1]: The "practice of engineering" means any act of planning, designing, composing, evaluating, advising, reporting, directing or supervising, or managing any of the foregoing, that requires the application of engineering principles, and that concerns the safeguarding of life, health, property, economic interests, the public welfare, or the environment.

To be considered a work of engineering, then, a piece of software (or a software-intensive system) must meet two conditions:

1. The development of the software has required “the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.”

2. There is a reasonable expectation that failure or inappropriate functioning of the system would result in harm to life, health, property, economic interests, the public welfare, or the environment

[1] - https://engineerscanada.ca/news-and-events/news/when-softwar...

mLuby · 4 years ago
> "The 'practice of engineering' means any act … that requires the application of engineering principles"

Error: cycle detected. ^__^

Software is by its very nature systematic and quantifiable. Is the quibble with whether programmers are disciplined?

All software companies would meet the "failure or inappropriate functioning of the system would result in harm to … economic interests" criterion.

hinkley · 4 years ago
It's like everyone has forgotten third grade.

Define this word without using the word.

Lift: British English. A machine that lifts peo... damnit!

psyc · 4 years ago
I aggressively refuse to call myself an engineer. Programmer is fine. Developer is better, because programming is only a piece of it. Y’all can call yourselves whatever you like, but inside I’m thinking you retconned an industry insider euphemism into a non-existent sub-type of engineering, by protesting all the ways you apply rigor to what we do.
mywittyname · 4 years ago
What of those who graduated from engineering programs taught at engineering colleges?

Lots of people in other engineering disciplines build stuff with less rigor than is used to build software. Maybe load test it in Fusion 360 before sending the file to be milled. And lots of software companies test their stuff quite rigorously.

People who write software have imposter syndrome. Other disciplines are not building things better than we do. For every Google there's a Ford, and for every scrappy startup there's a scrappy machine shop.

wruza · 4 years ago
Isn’t it easier to just stick to the term and not fight with it? Everyone involved knows that “software engineering” is not like “civil engineering”, and those who don’t, usually don’t know either. It has no legal nor practical sense, so why bother?
hinkley · 4 years ago
Margaret Hamilton coined the term. But then again, anyone who can keep a lunar lander from turning into a new crater probably has more right to that term than most of us.
plandis · 4 years ago
Well crap, I guess I’m going to have to burn my diploma that clearly says I’m an electrical engineer.
chana_masala · 4 years ago
I believe I heard this in the Pragmatic Programmer: engineering is the practice of applying scientific understanding to real world problems. Or more succinctly, engineering is applied science. So software engineering is applying computer science to solve real world problems.
Kronen · 4 years ago
The second condition is a bit stupid; most failures or inappropriate functioning of a system in software development can be considered harm to economic interests...
kuratkull · 4 years ago
I write server-side software for a security system - anyone else can write a similarly complex piece of software to monitor grains of sand on the beach. So I am an engineer and they are not just because my software is used by some pre-described group of people?
cobbal · 4 years ago
“It’s not real engineering if someone can’t get hurt” seems like a strange definition to me.
milkytron · 4 years ago
That's because it is a strange definition.

If you work on the software in a self driving system in a car, that makes you an engineer. If you're working on software for a space satellite with no occupants... that's not an engineer?

bonestamp2 · 4 years ago
Yes, I think we sometimes use the term "Engineer" a little too liberally in the US. For example, sometimes the janitor of an office building is called a "Custodial Engineer" which often just gets shortened to "Engineer". I remember the first time I heard a coworker say, "I'll call the engineer to have the thermostat adjusted". I thought that was overkill, but then a guy in overalls showed up and it made a little more sense.
chana_masala · 4 years ago
Was your colleague from the UK? A number of trade professions in the UK are referred to as Engineers, such as for the boiler etc.
milesvp · 4 years ago
So what do you call electrical engineers who design consumer level non mains voltage hardware? Looks like they fail the second bullet point. Sort of an honest question, since most of what you’d likely learn in an EE degree would probably not fall under number 2 either.
Kronen · 4 years ago
We call coders Software Developers, and the ones who design the systems Consultants or Software Architects. In between you have Analysts.

Any of them can be Software Engineers or not, because we reserve "Software Engineer" for the ones who have finished a university degree.

gr__or · 4 years ago
The wonderful Hillel Wayne has two great essays on the "Engineer"ing question:

https://www.hillelwayne.com/post/are-we-really-engineers/

https://www.hillelwayne.com/post/we-are-not-special/

agentultra · 4 years ago
It'll be interesting to see if they can ever manage to enforce it. I haven't heard of any cases yet since Microsoft challenged them and won.
auxym · 4 years ago
xav0989 · 4 years ago
I believe that Shopify will change your job title from Engineer to Engineering Technician (or something of that nature) if you don't have your professional engineering certification.
skipants · 4 years ago
And yet, in Canada, every Software Engineer I know works no differently than a Software Developer.
8note · 4 years ago
They can sign and stamp things though, which is a difference on the work your company can do
908B64B197 · 4 years ago
I've met Canadian Software Engineers who referred to themselves as Software Engineers...
allo37 · 4 years ago
Some of us call ourselves "engineers". It just depends whether you want to pay the order for the privilege of having the title (and pretty much no other tangible benefit in 90+% of cases) :).
wheelinsupial · 4 years ago
Genuine question. What is so important about the word "engineer" that people not licensed to practice it want to call themselves engineer?

In Canada, I studied in a mechanical engineering technology program that led to an engineering degree if you stayed on for 4 years. It was hammered into us that we weren't engineers until after we graduated and went through the professional licensing process.

In Canada there is a way to become a P.Eng. in software engineering (and until 2018 in the US there was the P.E. in software engineering), so there is a way for someone who wants to be called a software engineer to legally obtain it. So why is there so much resistance?

hyperman1 · 4 years ago
There is value in predictability, even if it stems from someone else's preferences.

Current home wiring requires you to put wires in the wall (I think) between 10 and 20 cm from the border, and a small number of cm inside the wall. This means you only have to check that zone, and can use detectors to find the wires.

My home is from the 1950's. Some wires go diagonally from top left to bottom right at the other side of the wall. We had great fun finding out where they were hiding.

Even if the new wires waste a lot more wiring and PVC tube, I vastly prefer them when redecorating or drilling holes.

floverfelt · 4 years ago
100% that's true! I'm not saying developers having preferences and everybody abiding by them is a bad thing, more just that a lot of what constitutes "best practices" are often preferences and we should call them that.
only_as_i_fall · 4 years ago
Doesn't that kind of imply that everyone's opinions are equally valid? If 95% of your profession is on the same page with a certain practice then I'd argue it's really not reasonable to go against the grain without a very good reason.

I think "best practices" strikes a good balance between things that are personal preferences and things that are laws.

erdo · 4 years ago
I agree, at least in my experience of android development, "Best Practice" often means: what I read on Medium, or what Google said.

It's for people who aren't confident enough to admit to simply having a preference, or knowledgeable enough to be able to explain it

tannhaeuser · 4 years ago
1950's onwards they used ribbon cables close to surfaces for household wiring. It was best practice, just not today's.

As a freelancer jumping projects, I can't help but see parallels, in that you've really got to study code bases wrt the time of authorship to understand their particular idiosyncrasies.

In the 2010's, people believed in "REST" without considering the context in which these concepts were introduced (eg thin browser UIs, or generalizations thereof as a baseline). Customers, even highly capable devs, flaunt their REST best practices, yet see HATEOAS as optional and pretentious, failing to see the entire point of loose coupling and discovery, and engaging in zealotry about squeezing parameters into URLs instead. Or pretend to, to stop pointless discussions with mediocre, bureaucratic peers.

AdrianB1 · 4 years ago
In many countries this is coded (mandated by standards), it is not a best practice.
charles_f · 4 years ago
I hate the expression "best practice", it's so, so often used by someone to justify applying cargo cult without actually understanding why.

"Hey, why do you have a try-catch there? It seems like it's just gonna break our stack trace and we don't even gracefully recover from it" - it's best practice

"Hey why are you using model/view/controller folders?" - it's best practice

"Why are you building microservices?" - best practice.

I got out of a code base with StyleCop's settings set to max and treated as errors. Trailing white space, mandatory comments on everything, etc. The only exceptions are file size and method size. Files are often thousands of lines, and methods can reach that as well. 10 levels of nested if/else. So we're in an unmanageable code base, but at least we don't have trailing white space.

Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do". If it's a tradition and most people agree to it, fine, might help with readability. If it's pattern, fine, tell me why it's applicable and helpful in our context. If you can't actually explain why you're doing something, maybe rethink it and educate yourself on the topic
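The try-catch complaint above is easy to reproduce. This is a hypothetical sketch (the `parse` helpers and messages are made up): rethrowing a fresh exception discards the original stack trace, while chaining the original as the cause preserves it.

```java
// Cargo-cult version: wrap every exception. The original frames are lost.
class Swallowing {
    static int parse(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            // No cause attached: debuggers and logs lose the real origin.
            throw new RuntimeException("bad input");
        }
    }
}

// Same wrapper, but passing the caught exception as the cause keeps the
// full trace available via getCause().
class Preserving {
    static int parse(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            throw new RuntimeException("bad input: " + s, e);
        }
    }
}
```

The point is not that wrapping is wrong, but that doing it "because best practice" without understanding the cause-chaining mechanism produces exactly the broken traces the comment describes.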

tester756 · 4 years ago
Same with Clean Code, SOLID, DD

SOLID is especially funny because the majority of developers don't understand / can't explain it, let alone have read the papers, e.g. [0],

except maybe the S, yet everybody acts like they do SOLID*.

Or Clean Code's small-function craziness, or its misinterpretation into an "avoid comments" approach

* - unless you ask for details

[0] - https://www.cs.cmu.edu/~wing/publications/LiskovWing94.pdf

charles_f · 4 years ago
So the only caveat I would put on that: it's useful to have a common language for things; my threshold is that you understand why things are this way.

It's useful to tell someone: "you should have that thing in another class because it's clearly another responsibility", and have a debate on the responsibility itself ; or even on the merits of having a single responsibility per class.

What is not useful is when someone starts making monofunction classes that are 10 lines long on the altar of "smaller functions are better" and sticks to that because "it's a best practice" with no better argument.

Basically my point is: if you understand what you are doing, great, use vocab all you want. But don't use it as a justification.

tdeck · 4 years ago
Interesting, I guess SOLID must have been a buzzword from an earlier era, because while I've vaguely heard of it I had to look it up. Don't think it's come up in my 8 years of professional work.

There's always an evolution of these methodologies and I often feel like in practice they're just a way to mix things up and make code health feel more interesting to those involved. Often people know there are problems with the design of their system, but they can't get buy-in to fix them. These methodologies provide some authoritative justification for refactoring work.

lucideer · 4 years ago
> I hate the expression "best practice", it's so, so often used by someone to justify applying cargo cult without actually understanding why.

> Most of the time it could be replaced by "tradition", "pattern", or "the way I've seen others do".

The most common valid / defensible case I've seen it used is when a person fully understands the "why" in depth, and has formed that understanding on the basis of experience (not blind following or "tradition"), but doesn't have time to deliver a long in-depth explanation every time they do a code review.

That's not to say it isn't still "developer preferences": there are multiple "best-practice" approaches out there and some even contradict each other. But I strongly believe that even many of the very subjective, hotly-debated approaches are valid and useful in certain contexts.

e.g. I personally lean toward a pseudo-FP style of programming and am growing less and less enamoured with OO patterns for various reasons. But OO is still a useful abstraction, and if you're doing OO, I've found something like SOLID to contain a great deal of wisdom. I never sat down and decided to do things "the SOLID way", and learned that. Rather I wrote a lot of bad OO, ran into problems, gained insight from experience, and later stumbled across SOLID and found the pitfalls it mitigates familiar.

A sibling commenter has the following quote, which I agree with:

> SOLID is especially funny because the majority of developers don't understand / can't explain it, let alone have read the papers

I think this is exactly why the term "best-practice" is so popular. I would not have understood SOLID in any depth if I had come to it fresh: I needed to learn SOLID informally by accident, internalise the challenges it's designed to overcome, and then recognise that intuitively from my experience when reading about SOLID later.

That's not a level of understanding you can typically impart easily in a conversation when suggesting someone do something differently. It's much easier to just say "best-practice".

Sohcahtoa82 · 4 years ago
> Java is infamous for its verbosity. [...]

This paragraph highlights something I've been saying for ages.

Most criticism of Java needs to be directed towards Java programmers and not the language itself.

The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.

Getters and Setters? You probably don't need them. Classes can have public member variables.

Java programmers seem to have the hardest time understanding YAGNI.
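To make the point concrete, here is about the smallest shape a plain Java class can take (a sketch, not a recommendation for every situation; the class name is invented):

```java
// No interface, no getters, no setters: just a class with public fields.
class Point {
    public int x;
    public int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }
}
```

Nothing in the language forces anything more ceremonious than this.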

shagie · 4 years ago
> Getters and Setters? You probably don't need them. Classes can have public member variables.

The getter / setter debate and public fields is about exposing the internal implementation of the object (and thus making it unchangeable without breaking other code) and being able to reason about where the internals are used (if you do need to change them).

For example, if I've got a java.util.Date exposed as a public field and someone uses it, I can't change that later to a java.time.LocalDateTime unless I change all of the things using it. If this is a library that others are using it may mean a cascade of unknown changes.

If, on the other hand, the Date was exposed as a public Date getDate() { return (Date) date.clone(); } then I don't have to worry about it. When the internals are changed to LocalDateTime, then Date getDate() { return Date.from(local.atZone(ZoneId.systemDefault()).toInstant()); } and everything will still work.
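That kind of internal change might look something like this (the Event class and its field are made up for illustration; the java.time conversion APIs are the real ones):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

// The field was once a java.util.Date; it is now a LocalDateTime.
// Because callers only ever saw the getter, the signature can stay put.
class Event {
    private LocalDateTime when = LocalDateTime.of(2020, 1, 1, 12, 0);

    // Callers still receive a java.util.Date; the internal change is invisible.
    public Date getDate() {
        return Date.from(when.atZone(ZoneId.systemDefault()).toInstant());
    }
}
```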

I don't have an issue with the infrequently used "default package private" level of field visibility where only classes in the same package can see that field as that limits the range of the changes to the classes within single package (in a single project).

Jtsummers · 4 years ago
Getters and Setters also permit you to specify that something can only be retrieved or assigned, but not the other. This is non-trivial in most programming languages so having this as a common pattern is useful. Though I like C#'s properties versus seeing a bunch of getFoo and setFoo methods running around, if Java had provided the same or a similar capability there would probably be no controversy. It's the noisiness of the Java solution that seems to irk people more than the concept.
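The read-only case is simple to sketch in Java (names are illustrative):

```java
// A getter with no corresponding setter: the value can be retrieved
// but never reassigned from outside the class.
class Temperature {
    private final double celsius;

    public Temperature(double celsius) {
        this.celsius = celsius;
    }

    public double getCelsius() {
        return celsius;
    }
    // No setCelsius(), so the field is effectively read-only.
}
```

The C# property equivalent would be roughly `public double Celsius { get; }`, which carries far less visual noise for the same idea.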
avgcorrection · 4 years ago
Apparently your field was used both for some internal shenanigans as well as a public field through the get/set indirection. Then you changed the field but kept the old public behavior. I wonder whether (1) that dual internal/public role was wise to begin with, and (2) whether that change of the internal logic of the getter might have just papered over a more significant change.

It might be good if a change just breaks client code. The client expects just to get/set something: they might not expect that some update might add arbitrary logging or internal logic (like setting this value will also set another value, but we won’t tell you).

Most of the get/set stuff that I see are for value objects with either no or little business logic. Simple value objects _should_ just expose their implementation.

selfhoster11 · 4 years ago
Thank you. I have programmed Java for years, but it's been frustrating. The clean language I learnt at university, that I love to bits, seems to be used approximately nowhere.

Instead, it's always some monstrosity held together by inheritance, XML/(awful) Gradle and liberal sprinklings of magical annotations that always (always!) bite you in the backside at runtime rather than at compile time, because why would you prefer to have good tooling that takes advantage of static typing to tell you what might go wrong.

I am appalled by what Spring mentality did to my language.

bitwize · 4 years ago
> The language allows you to simply make a class. You're not required to make an interface and then make a class that implements it, and yet, Java programmers do it anyways and then criticize the language for being verbose.

That's the 'D' in SOLID -- dependency inversion. An object's dependencies should be defined in terms of abstract interfaces, not concrete classes. The verbosity is just boilerplate, and it would appear in any statically typed language if you're following SOLID principles.
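A minimal sketch of what that looks like in practice (all names here are invented for illustration):

```java
// The service depends on an abstract Notifier, not any concrete class.
interface Notifier {
    void send(String message);
}

class EmailNotifier implements Notifier {
    public void send(String message) {
        System.out.println("email: " + message);
    }
}

class OrderService {
    private final Notifier notifier;

    // The concrete implementation is injected from outside.
    OrderService(Notifier notifier) {
        this.notifier = notifier;
    }

    void placeOrder(String item) {
        notifier.send("ordered " + item);
    }
}
```

The interface/implementation pair is the boilerplate being complained about, but it is also what lets you swap EmailNotifier for a test double without touching OrderService.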

> Getters and Setters? You probably don't need them. Classes can have public member variables.

Getters and setters are part of the JavaBeans spec for being able to load and configure arbitrary beans. If you're writing JavaBeans, as many EJB or Spring application developers are, you'll use getters and setters.

goto11 · 4 years ago
> That's the 'D' in SOLID -- dependency inversion. An object's dependencies should be defined in terms of abstract interfaces, not concrete classes

DI does not talk about the interface keyword in Java. It talks about interface as the public surface of an object or system.

Programming to a Java interface declaration does not guarantee the code is depending on higher-level abstractions - this depends on how the interface is defined. An interface which just replicates the public surface of a class (a "header interface") is on the exact same abstraction level as the class; it just introduces needless boilerplate.

Too · 4 years ago
Java was for very long missing anonymous functions, so for anything involving callbacks you had to make a gazillion Listener or Factory classes instead, which further forced you to create an interface for each of them, leading to all these infamously verbose design patterns.

It was also for a very long time missing some fundamental Stream, List and String processing features. Just getting the contents of a file or converting a string to bytes, or even just initializing an Array often required 2-3 lines of verbose OutputStream(StreamBuilder().fromArray(Arrays.asList(1,2))).readBytes().add(3), wrapped in a try-catch for all the checked exceptions. For things that in C# would just be a one-liner of new List<int> { 1, 2, 3 } or in Python [1,2,3].

This is mostly solved in current versions of the language, but the reputation and the "best practice" design-patterns remains.
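A rough before/after of the list-building case (the class and method names are invented; Arrays.asList and the Java 9+ List.of are the real APIs):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class Verbosity {
    static List<Integer> oldStyle() {
        // Pre-Java-9: no concise list literal, so wrap and mutate.
        List<Integer> list = new ArrayList<>(Arrays.asList(1, 2));
        list.add(3);
        return list;
    }

    static List<Integer> newStyle() {
        // Java 9+: an immutable list in one line.
        return List.of(1, 2, 3);
    }
}
```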

avgcorrection · 4 years ago
> Getters and Setters? You probably don't need them. Classes can have public member variables.

We do need them if the objects interact with a lot of the libraries that we use. Libraries that we didn’t make. They expect get/set.

I agree that getters/setters are very overrated. I tend to make public-final fields when I can. Most of the time I can’t though.

rajacombinator · 4 years ago
Very true but Java devs tend to have limited perspective and flexibility. If you get rid of the cruft it’s a fine language.

Deleted Comment

taeric · 4 years ago
Amusingly, j2ee did require you to make an interface and an implementation. I seem to recall you had to also have a "stub" class.

Which is to say, early ecosystems in Java certainly needed this criticism. They did adjust, though.

floverfelt · 4 years ago
Yeah, I'm actually a pretty big fan of Java programming. It gets a lot of hate on HN for things that have been mostly solved or can be solved if you implement it a certain way.
fitzn · 4 years ago
Steven Sinofsky gave a talk and said something to the effect of, we've been building roads, bridges and edifices for thousands of years. So, best practices and solved problems abound---and even then we still get it wrong sometimes. Whereas, software engineering is maybe 70 years old (generously)? So, there is much to learn and a lot of "baseline" knowledge that has yet to be established. I think it's a good way to think about things.
habitue · 4 years ago
Eh, I mean we've been building computer chips for approximately the same amount of time as computer software, and it's pretty clear chip engineering is more like civil engineering than software engineering. I would guess many of the best practices in bridge building in the modern day were developed in the last 70 years.

I think it's that engineers of physical things have many more hard constraints they have to wrestle with, and software engineers largely don't. Your code doesn't need to obey the rules of gravity and chemistry and materials science, it just needs to somehow accomplish the task.

And you see those best practices in the places of software engineering where there are hard constraints: cryptography. high performance code. realtime systems.

It's not just a senior engineer's opinion whether you should use ruby or C if you're writing the firmware for your race car. If you use md5 to hash user passwords on a major site, you'll be hung from the rafters.

Drew_ · 4 years ago
> Eh, I mean we've been building computer chips for approximately the same amount of time as computer software, and it's pretty clear chip engineering is more like civil engineering than software engineering.

Is it actually anything like civil engineering? To my knowledge chip engineering revolves around yield. There's no such analogous concept in designing buildings that can only be reasonably constructed correctly 70% of the time and attempting to reuse the bad buildings for other projects.

sidlls · 4 years ago
Civil (and other engineering) got better because there was motivation to improve that came from multiple directions: literal lives at stake, the pride of good craftsmanship, iterative or even grand steps forward in knowledge, etc.

Software engineering as a discipline is dominated by appeals to authority ("Clean Code", "Google does it this way", "Dijkstra said so", etc.) without any (or at least not much) attempt to ask why or whether. I think we'll automate away much of software engineering (likely with very poor, inefficient, and buggy implementations) before it matures enough as an industry to be actual engineering. Engineering (and the science behind it for that matter) advances from curiosity and a healthy skepticism, not the rampant ego-driven self-promotion that runs through SE.

dtech · 4 years ago
This is a symptom of the lack of knowledge, not a cause. Imagine you want to build a building but no one really knows how; then copying successfully completed buildings and established construction engineers and companies is a pretty good idea. That's what's going on in SE.
pjmlp · 4 years ago
What about we apply this to software engineering?

> If a builder constructs a house for a man but does not make it conform to specifications so that a wall then buckles, that builder shall make that wall sound using his own silver.

- Code of Hammurabi, 1755–1750 BC

julianlam · 4 years ago
I feel like I already spend 90% of my day gluing together various disparate APIs. Is this the logical conclusion to software development?

I adore the craft-like parts of software dev, wouldn't trade it for anything.

lou1306 · 4 years ago
Furthermore, there are far fewer physical constraints in software, so the range of possible designs is dramatically wider.

(Actually I dare say that software itself has no physical constraints at all: software artifacts and software executions do.)

JohnWhigham · 4 years ago
Which is why I don't buy the "software has only been around for 70 years so give it time" argument. Software has nothing to be grounded in like other engineers do with physics. It's most likely always going to be endless cargo culting.
afarrell · 4 years ago
Also, there isn't that much in the way of scientific grounding.

Mechanical engineering has physics as a foundation.

Chemical engineering has chemistry as a foundation.

What is the scientific foundation of software engineering?

I suspect it is a mix of cognitive science, linguistics, and anthropology.

xtracto · 4 years ago
> I suspect it is a mix of cognitive science, linguistics, and anthropology.

Computer Science (more abstract) and Computing Science (less abstract) are branches of Science that have given a foundation to Software Engineering.

The problem is that most of the programming and software development that happens nowadays (and what people pay for) doesn't use it.

I compare it to chemistry and alchemy. We are still in the "alchemy" stage of software development. Sure, people who see themselves as "experts" are combining existing stuff to create new things. But only a small set of experts use that scientific expertise to implement systems.

NikolaeVarius · 4 years ago
Math. Which, it seems, many practitioners are proud of not knowing.
shagie · 4 years ago
An old HN post - https://news.ycombinator.com/item?id=20912718 and its corresponding material - https://cse.buffalo.edu/~rapaport/Papers/phics.pdf looks to address some of that.

Section 3 gets into the "what is computer science". 3.10 through 3.13 get into the engineering aspect. Section 6.5 puts it into historical context.

lucumo · 4 years ago
> I suspect it is a mix of cognitive science, linguistics, and anthropology.

I agree. Creating a solution that is clear to navigate for future developers is valuable, as is one that is flexible for future modifications. These things depend much more on how humans understand the code than on how machines understand it.

That said, other engineering disciplines aren't entirely bereft of that kind of human aspect either. Take architecture for example. It's critical for a building to handle the forces put on it. But it's also important that humans can use it. Having a good building layout is much more a matter of understanding humans than of understanding physics.

And then there's the whole style thing that architecture has. A building is not always only functional. It can very much be a work of art. And that style is not just decorative. A style can conjure up certain emotions to put people in a certain mindset when dealing with the building. (E.g. putting a modern art museum in a neo-classical building would be dissonant.)

floverfelt · 4 years ago
> cognitive science, linguistics, and anthropology.

Would love to hear your thoughts on this, esp. the linguistics and anthropology piece.

choeger · 4 years ago
How about math? Complexity theory, all kinds of logic, computational theory, category theory?
edejong · 4 years ago
Well, if you make broad generalizations like that, then software engineering has mathematics as a foundation.

Perhaps you don't want to make broad generalizations?

optymizer · 4 years ago
Computer Science?

Edit: this is a discussion forum. The downvote button is not an agree/disagree button, it's to penalize irrelevant comments. If you'd like to disagree, please reply and state your thinking instead.

MarkLowenstein · 4 years ago
Programming changes practice quickly and often because it's cheap to do so compared to physical engineering which is slowed down by execution time, high materials cost, and sunk costs.

The interesting question is this: would other engineering pursuits (say civil) have just as much chaos and lack of authoritative practices, if changing practices would be equally fast and cheap for them?

Dead Comment

gambler · 4 years ago
Best practices in civil engineering are connected to outcomes. You know that something is a good practice if not doing it causes things to collapse or catch on fire.

"Best practices" in software engineering are usually about internal development processes and don't have any verifiable connection to outcomes.

In other words, we have two entirely different things labeled with the same name. People who commonly use the phrase "best practices" for software are literally trying to confuse you and generally are not worth listening to.

That said, some things in software are worth analyzing to have a better process, but those things need to be examined within a specific context. If someone claims that you need to, say, create an interface for every class, they should be able to explain why and how it is relevant to your work. If people make claims they cannot explain by connecting them to meaningful outcomes, those people are, again, not worth listening to. They might be mindlessly parroting something they have heard without having any clue as to the meaning of the practice. Unfortunately, our field is full of "professionals" that do exactly that.

Software is a pretty messed up domain that is frequently a self-licking ice-cream cone. (You write code to solve problems created by other code.) Because of that, it's often socially mediated, just like non-engineering fields. To establish anything for real in this self-referential environment you need to be able to have conversations about costs, tradeoffs and outcomes - within a specific context.