taylorallred · 5 months ago
I really appreciate how this article explains why certain design patterns became a thing. Usually, it was to address some very practical problem or limitation. And yet, a lot of younger programmers treat these patterns like a religious dogma that they must follow and don't question if they really make sense for the specific situation they are in.
layer8 · 5 months ago
The main motivation for the concept of design patterns is to give unique names to existing programming patterns, so that when someone says “Strategy pattern”, everyone knows what pattern that refers to, and vice versa that the same pattern isn’t called ten different things. It’s to make communication about program design efficient, by defining a vocabulary of patterns that tend to reoccur, and a common structure for describing them. Not all patterns were successful in that way, but it’s the main idea behind design patterns.

The question of when using a given pattern is appropriate is orthogonal to that. The fact that a named pattern has been defined doesn’t imply a recommendation to use it across the board. It depends on the context and on design forces, and those change with time and circumstances. Anti-patterns are patterns as well.

It’s a pity that the idea of design patterns ended up (after the pattern language craze faded) being almost exclusively associated with the specific patterns named and described in the GoF book.

dragonwriter · 5 months ago
> The main motivation for the concept of design patterns is to give unique names to existing programming patterns

No, naming them is not the main purpose; it's preserving and transmitting knowledge of what they are and what they are useful for, so that people aren't forced to rediscover solutions to the same problems over and over again. [0] Naming is obviously important for that purpose, but it isn't the main goal, just a means of supporting it.

[0] If this sounds like a subset of the purpose of a reusable code library, it is. That's why, in languages with abstraction facilities sufficient to make the generic implementation of a pattern reusable, well-documented code libraries (the documentation covering the “where and when to use this” piece) replace documents that pair the explanation with implementation recipes one modifies for one's particular use.

dkarl · 5 months ago
> It’s a pity that the idea of design patterns ended up (after the pattern language craze faded) being almost exclusively associated with the specific patterns named and described in the GoF book

That book is the closest we came to establishing a common language. I remember brushing up on the names of design patterns whenever I had an interview. Ultimately, though, it didn't yield any benefit that the industry is missing now.

Like you said, the fundamental idea behind the book was that consciously naming, cataloging, and studying design patterns would improve communication among programmers. There was also an idea that studying design patterns would give beginning programmers a richer repertoire of programming techniques faster than if they had to figure them out themselves.

Looking back with decades of hindsight, my belief is that awareness and intentional use of design patterns made no difference whatsoever. Some names stuck, and would have anyway. Others didn't, and years of status as official "design patterns" in a book widely studied across the industry couldn't make them. The younger programmers I work with who had no exposure to the GoF book, and for whom "design patterns" is something that dusty old farts used to talk about, use patterns like Flyweight, Proxy, Command, Facade, Strategy, Chain of Responsibility, Decorator, etc. without knowing or needing a name for them, and they communicate amongst themselves just as efficiently as my generation did at the height of the design pattern craze.

In the final analysis, I have never looked at the less experienced programmers around me and thought, "This situation would go faster and smoother if they had studied design patterns." The generation that learned to program after design patterns had faded as an idea learned just as quickly and communicates just as well as the generation that studied them assiduously as junior programmers like I did.

seadan83 · 5 months ago
> a lot of younger programmers treat these patterns like a religious dogma

First you learn what the pattern is. Then you learn when to use it. Then you learn when not to use it.

The gap between the first and third step can be many years.

arcanemachiner · 5 months ago
> The gap between the first and third step can be many years.

I admire your optimism!

AceJohnny2 · 5 months ago
The explanations are great! The condescension, not so much.

> Simple: we just use the language like it was meant to be used.

> Use Default Arguments Like a Normal Human

etc

revskill · 5 months ago
Yes, programmers are super human
zelphirkalt · 5 months ago
Probably also a problem that exists because of how programmers are taught: using Java, and being presented with the patterns as solutions to the way Java works.
motorest · 5 months ago
> I really appreciate how this article explains why certain design patterns became a thing. Usually, it was to address some very practical problem or limitation.

I don't agree at all. I feel that those who criticise design patterns as solutions to practical problems are completely missing the point of design patterns, and the whole reason they have the name they have: design patterns. I'll explain.

Design patterns are solutions to common design problems, but "problems" here isn't the kind of problem you might think. It's "problems" in the sense that there are requirements to be met. A state pattern is a way to implement a state machine, but you still have a state machine, and a state pattern, even if your state classes don't handle state transitions.

More to the point, look at singletons. It's irrelevant whether they are implemented with a class or a closure or a module. What makes a singleton a singleton is the assurance that there will be a single instance of an object. Does an implementation that allows multiple instances, or doesn't return the same instance, qualify as a singleton? Obviously not.

Design patterns are recurring solutions to recurring problems. They are so recurring that they get their name and represent a high level concept. A message queue is a design pattern. An exception is a design pattern. Lazy loading is a design pattern. Retries and exponential backoffs are design patterns. Etc. Is anyone arguing that Python has none of it?

So many people try to criticise the GoF, but they don't even bother to be informed or to form an educated opinion.

hinkley · 5 months ago
I see you’ve been downvoted.

Design patterns aren’t solutions to common design problems. They’re after-the-fact descriptions of solutions to design problems. That’s the issue. That’s the beef. Everyone thought of that book as a cookbook instead of a naturalist’s musings on an ecosystem, which is what they are.

Those of us who designed before people discovered that stupid book were constantly asked what the differences were between this pattern and that. And the book just isn’t thick enough, and Erich Gamma was just trying to complete a thesis, not write a book, so despite having at least 8 years in industry before completing his masters, he cocked it up. And ruined Java in the process.

We had a contemporary of Vlissides teach a class at my last company and he crystallized all of my suspicions about the GoF book and added a whole lot more.

My advice for at least fifteen years is, if you think you want to read GoF, read Refactoring instead. If you’ve read Refactoring and still want to read GoF, read Refactoring a second time because it didn’t all sink in.

Refactoring is ten times the value of GoF for teaching you how to do this trade and how to break up architectural brambles.

happymellon · 5 months ago
I also completely disagree with the builder description.

It sounds like they've not had to use it for any meaningful work, and basically described a constructor. Yeah, named parameters are great, and I miss them when I don't have them, but if you think a builder only works like this

Object.setX().setY().build()

Then it tells me that you haven't built anything meaningful. It's a way of building up state and running an operation at the end, in a contained manner. If your build method doesn't run some sort of validation, then it's probably a waste of time; you might as well just call setters, return this, and call it a day if you want to be verbose and chain commands.

orwin · 5 months ago
I mean, people usually call those 'features' when they're built in. I would never call 'lazy evaluation' in Haskell a design pattern, because it's part of the language.

If I have to implement something similar myself in C++ however, I'll use a niche design pattern.

kccqzy · 5 months ago
I don't, however, appreciate that the author doesn't know Java or C++ well enough, and ends up spewing falsehoods about them. Saying things like "There’s no clean way to say 'this is private to this file' (in C++)" is just bonkers: anonymous namespaces (and file-scope `static`) do exactly that. The author is well intentioned, but coming up with the wrong reason is worse than not offering any reason.
dgfitz · 5 months ago
I had that thought too, and thought I must have misunderstood something. I generally assume I’m the dummy. :)
hinkley · 5 months ago
I am an ex-Java developer. Enterprise Fizz Buzz is highly entertaining. That stupid masters thesis pretending to be a design book landed right at an inflection point and ruined half a generation of developers.

What isn’t entertaining is using OpenTelemetry, which takes me right back to Java for over-engineering. Moving to OTEL from StatsD cost us about 3% CPU per core, which on 32-core machines is an entire CPU lost to telemetry. Or more accurately, an entire second CPU lost to telemetry. That is not right.

Prometheus doesn’t have these problems. And isn’t trying to fix quite as many problems I’ve never had.

acedTrex · 5 months ago
I don't agree with the take on the builder pattern. For basic objects, yes, it's a bit silly.

But the ACTUAL value of the builder pattern is when you want to variadically construct an object. Create the base object, then loop (or use other control flow) over other state to optionally add things to the base object.

Then, additionally, the final "build" call can run validations over the complete object. This is useful in cases where an intermediate state could be invalid but a subsequent update will bring it back to a valid state, so you don't want to validate on every update.
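
Roughly what that looks like in Python (a hand-rolled sketch; `Server`, `ServerBuilder`, and the fields are invented for illustration):

  from dataclasses import dataclass

  @dataclass
  class Server:
      host: str
      routes: list
      tls_cert: str | None

  class ServerBuilder:
      def __init__(self, host):
          self._host = host
          self._routes = []       # may be "invalid" mid-construction
          self._tls_cert = None

      def add_route(self, path, handler):
          self._routes.append((path, handler))
          return self             # allow chaining

      def tls(self, cert):
          self._tls_cert = cert
          return self

      def build(self):
          # Validation runs once, over the complete state,
          # not after every intermediate update.
          if not self._routes:
              raise ValueError("a server needs at least one route")
          return Server(self._host, self._routes, self._tls_cert)

  builder = ServerBuilder("0.0.0.0").add_route("/health", lambda req: "ok")
  use_tls = True                  # stand-in for real control flow
  if use_tls:
      builder.tls("cert.pem")
  server = builder.build()        # intermediate states were never validated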

zem · 5 months ago
I've used the builder pattern in Python when I wanted the mutable and immutable versions of a class to be different types. You do a bunch of construction on the mutable version, then call "freeze", which uses the final data to construct the "immutable" class.
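
Something like this, I assume (sketch; `Point` and `PointBuilder` are stand-ins):

  from dataclasses import dataclass

  @dataclass(frozen=True)
  class Point:               # the immutable type
      x: float
      y: float

  class PointBuilder:        # the mutable type, a separate class
      def __init__(self):
          self.x = 0.0
          self.y = 0.0

      def freeze(self) -> Point:
          # hand the accumulated data to the immutable class
          return Point(self.x, self.y)

  b = PointBuilder()
  b.x, b.y = 3.0, 4.0
  p = b.freeze()             # p.x = 5.0 would now raise FrozenInstanceError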
ptx · 5 months ago
Couldn't you build up a dictionary of keyword arguments instead and do all the validation in the __init__ method? E.g.

  kwargs = {}
  if is_full_moon() and wind_direction == EAST:
    kwargs["baz"] = 42
  
  thing = Thing(foo=3.14, bar=1, **kwargs)

chuckadams · 5 months ago
Builders have some neat properties like partial evaluation, which becomes especially handy when you use stateless builders that return new instances. They can also be subclassed, allowing behavior to be overridden at individual-method granularity, and even allowing a builder to build a different subclass.

Obviously don't reach for a builder if you don't have these use cases though.
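
A sketch of the stateless flavor in Python (names invented; `dataclasses.replace` does the copy-on-write):

  from dataclasses import dataclass, replace

  @dataclass(frozen=True)
  class RequestBuilder:      # stateless: every step returns a new builder
      method: str = "GET"
      headers: tuple = ()

      def with_method(self, method):
          return replace(self, method=method)

      def with_header(self, key, value):
          return replace(self, headers=self.headers + ((key, value),))

      def build(self):
          return (self.method, dict(self.headers))

  # Partial evaluation: a half-configured builder can be shared safely.
  base = RequestBuilder().with_header("Accept", "application/json")
  get = base.build()
  post = base.with_method("POST").build()   # base itself never changed

(Subclassing works too, since replace() constructs type(self), though the subclass needs a compatible field set.)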

xigoi · 5 months ago
> Create the base object, then loop or otherwise control flow over other state to optionally add stuff to the base object.

That’s what list comprehensions are for.

> Then, additionally, the final "build" call can run validations over the set of the complete object.

The constructor function can do that too.

acedTrex · 5 months ago
The constructor cannot do it, because the constructor does not have all the data. This is lazy evaluation.
kuschku · 5 months ago
In languages with currying, you could avoid the builder pattern by currying the constructor.
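
Python doesn't curry natively, but functools.partial gets close enough to make the point (sketch; `Connection` is made up):

  from functools import partial

  class Connection:
      def __init__(self, host, port, timeout):
          self.host, self.port, self.timeout = host, port, timeout

  # Pin some constructor arguments now, supply the rest later.
  local_db = partial(Connection, host="localhost", port=5432)

  fast = local_db(timeout=1.0)
  slow = local_db(timeout=30.0)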
eduardofcgo · 5 months ago
Could you not just use dicts and some schema validation logic for this?
tptacek · 5 months ago
Peter Norvig has a well-known piece that goes into more depth on why the GoF-style patterns don't make much sense in high-level languages:

https://www.norvig.com/design-patterns/

shawn_w · 5 months ago
Those slides would be a lot more useful with a transcript of the talk that went along with them. Or a video of it. Wonder if anything like that still exists.
AtlasBarfed · 5 months ago
Builder patterns are seriously useful for high-complexity state construction, with the ability to enforce rules to prevent degenerate state.

A good example from my experience might be connecting to a Cassandra cluster or other type of database that can have extremely complex distributed settings and behaviors: timeouts, consistency levels, failure modes, retry behavior, seed connector sets.

Javaland definitely had a problem with overuse of patterns, but the patterns are legitimate tools even outside of OOP.

I haven't done much research into testing frameworks in other languages, but the Spock testing framework in Groovy/Javaland is a serious piece of good software engineering that needs singletons and other "non-hard-coded/not-global" approaches to work well.

Spring gets a ton of hate outside of Javaland, and I get it: they tried to subsume every aspect of programming and APIs, especially web, into their framework. But the core Spring framework solved complex object graph construction in a very effective way.

Oh, you hate "objects" but have thousand-line struct graph construction code?

It's kind of sad that Groovy never took off. It offered all the good parts of Java plus a ton of the good parts of Python, Ruby, and other languages, with the solid foundation of the JVM for high-speed execution.

But it's effectively dead. Kind of like Cassandra is effectively dead. The tech treadmill will eventually leave you behind.

mystifyingpoi · 5 months ago
In my experience everyone will hate on Spring, showing with tiny unrealistic examples how much easier other frameworks are, until they hit a really hard architectural challenge (imagine reimplementing @Transactional in pure Java), and that's where Spring shines.

Yeah, it's sad, I like Groovy a lot. It got relegated to a second-class citizen role on Jenkins, for the most part.

chuckadams · 5 months ago
I'd say Kotlin took most good parts of Groovy syntax and put it into a decent type system, then Clojure peeled off the folks who still preferred a more dynamic language. Languages can't all live forever, otherwise there'd be no room for new growth.
zahlman · 5 months ago
On a closer read, TFA shows strong evidence of being AI-generated, at least in parts. Overall it's just super padded and not especially insightful, and it has this quirky writing style that seems... rather familiar. But I especially want to complain about:

> Okay, maybe you want to delay creating the object until it’s actually needed — lazy initialization. Still no need for Singleton patterns.

> Use a simple function with a closure and an internal variable to store the instance:

The given example does not actually defer instantiation, which would be clear to anyone who actually tried testing the code before publishing it (for example, by providing a definition for the class being instantiated and `print`ing a message from its `__init__`) or just understands Python well enough.

But also, using closures in this way actually is an attempt to implement the pattern. It just doesn't work very well, since... well, it trivially allows client code to end up with multiple separate instances. In fact, it actually expects you to create distinct ordinary instances in order to call the "setter" (instead of supplying new "construction" arguments).

So actually it's effectively useless, and will just complicate the client code for no reason.
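
For contrast, a closure that actually defers instantiation has to delay the constructor call until first use. A minimal sketch:

  def make_singleton(cls, *args, **kwargs):
      instance = None

      def get_instance():
          nonlocal instance
          if instance is None:
              instance = cls(*args, **kwargs)   # deferred to first call
          return instance

      return get_instance

  class Config:
      def __init__(self):
          print("expensive init")   # runs once, and only on first access

  get_config = make_singleton(Config)   # nothing printed yet
  assert get_config() is get_config()   # "expensive init" printed once

Even this only centralizes access; nothing stops client code from calling Config() directly, which is the deeper objection above.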

mekoka · 5 months ago
If there's one design pattern I would just LOVE to see disappear from Python, it's the need for super(). Don't get me wrong, super() is a clever piece of engineering, but if your code actually needs what it's useful for (C3 linearization, MRO, etc.), then you've made things too complicated. I deplore the proliferation of libraries that have embraced the seductive, but ultimately deceptive, ways of the mixin because they saw all the big boys reaching for it. The devil gave multiple inheritance a cooler name, some new outfits, and sunglasses to confuse the Pythonistas, and they embraced it with open arms.

Refactor to favor composition over inheritance. But if you really must inherit, prefer single over multiple, and shallow over deep. Eventually your code will need super() less and less, and it'll become pointless to use it over the more explicit mechanism, which incidentally makes everything cognitively lighter.

ddejohn · 5 months ago
Completely agreed. The codebase I work on really badly abused multiple inheritance all over the place. Some of our classes are 5+ layers of inheritance deep.

All the code I've written since joining has used `typing.Protocol` over ABCs, simple dependency injection (i.e., no DI framework), no inheritance anywhere, and of course extensive type annotations... and our average test coverage has gone from around 6% to around 45%.

It's honestly baffling to see how insanely over-complicated most of the Python is that I see out in the wild, especially when you consider that like 90% of the apps out there are just CRUD apps.
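
For anyone who hasn't seen that style, a minimal sketch (the names are invented):

  from typing import Protocol

  class Mailer(Protocol):            # structural interface; nothing subclasses it
      def send(self, to: str, body: str) -> None: ...

  class SmtpMailer:                  # satisfies Mailer just by shape
      def send(self, to: str, body: str) -> None:
          print(f"smtp -> {to}: {body}")

  class Signup:
      def __init__(self, mailer: Mailer):   # dependency injected by hand
          self._mailer = mailer

      def register(self, email: str) -> None:
          self._mailer.send(email, "welcome!")

  # Tests pass any object with a matching send(); no ABCs, no framework.
  Signup(SmtpMailer()).register("a@example.com")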

ameliaquining · 5 months ago
Isn't super() also commonly used in languages that have only single inheritance?
mekoka · 5 months ago
If the language is limited to single and shallow inheritance, then super() is a syntactic convenience that saves everyone the burden of spelling out the inherited class. But in Python, even if your code emulates these constraints, you lose clarity by using super(), because someone reading your source has to wonder if or why it was specifically needed, since its main purpose is to resolve the kinds of conflicts that arise in complex inheritance scenarios (diamonds, problematic cycles, and such). So, to need it is to make your code complicated; to not need it while using it is to lose clarity.
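
That is, the explicit spelling:

  class Base:
      def __init__(self, x):
          self.x = x

  class Child(Base):
      def __init__(self, x, y):
          Base.__init__(self, x)   # reader sees exactly which class is called
          self.y = y

In a single, shallow hierarchy this behaves the same as super().__init__(x), without making the reader wonder whether the MRO machinery was needed.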
ayhanfuat · 5 months ago
Brandon Rhodes has a series of talks on this topic. Here's the most up to date one:

Classic Design Patterns: Where Are They Now - Brandon Rhodes (https://www.youtube.com/watch?v=pGq7Cr2ekVM)

aswerty · 5 months ago
The Zen of Python: there should be one obvious way to do things.

Python in practice: there are more ways of doing it than in any other programming language.

Oh Python, how I love and hate you.

dkarl · 5 months ago
I don't think any of the examples in the article contradict the Zen of Python. Even if there's one simplest and clearest way to do it in Python, there's nothing stopping people from using a more complicated solution that they got used to while working in a different language. They might not know to look for a simpler way, because they're used to working in a language where their way is the simplest.
nhumrich · 5 months ago
People misunderstand the target audience and code base for the zen of python
ddejohn · 5 months ago
Who's it for, then?
shlomo_z · 5 months ago
> than in any other programming language

After reading the article, I couldn't believe anyone designs their systems like that. His "solutions" seemed to be the obvious way to do things.
zahlman · 5 months ago
The actual text:

  $ python -c 'import this' | grep way
  There should be one-- and preferably only one --obvious way to do it.
  Although that way may not be obvious at first unless you're Dutch.

There are many layers to this, but the most important thing to point out is that having only one obvious way is just a preference (or ideal). In practice, any deliberate attempt to prevent something logical from working is counter-productive, and there is really no way to control what other people think is or isn't "obvious". And we all sometimes just expect things to work very differently than they actually do, even in ways that might seem bizarre in retrospect. We can't all "be Dutch" all the time.

But let me dig into just one more layer. Pay attention to the hyphens used to simulate em-dashes, and how they're spaced, versus what you might think of as "obvious" ways to use them. I'm assured that this is a deliberate joke. And of course, now that we have reasonably widespread Unicode support (even in terminals), surely using an actual emdash character is the "obvious" way. Or is it? People still have reasons for clinging to ASCII in places where it suffices.

Then consider that this was written in 2004. What was your environment like at that point? How old was Unicode at that point? What other options did you have (and which ones did you have to worry about) for representing non-ASCII characters? (You can say that all those "code pages" and such were all really Unicode encodings, but how long did it take until people actually thought of them that way?) On the other hand, Python had a real `unicode` type since 2.0, released in 2000. But who do you know who used it?

On yet another hand, an emdash in a terminal will typically only be one column wide (just as an 'm' character is), and barely visually distinct from U+002D HYPHEN-MINUS. (And hackers will freely identify "dash" with this character, while Unicode recognizes 25 characters as dashes: https://www.compart.com/en/unicode/category/Pd)

Reasonable people can disagree on exactly when it should have become sensible to use actual emdashes, or even whether it is now. Or on whether proper typography is valuable here anyway.