I'm reading Petzold's Code [1], and it dawned on me that I didn't understand logic gates intuitively until now. I took a Computer Architecture course back in college, and I understood what logic gates meant in boolean algebra but not empirically. Petzold clarified this for me by going from the empirical to the theoretical using a lightbulb, a battery, wires, and relays (which he introduces when he talks about the telegraph as a way to amplify a signal).
Another concept is the relationship between current, voltage, and resistance. For example, I always failed to understand why longer wires mean more resistance while thicker wires mean less resistance.
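(For what it's worth, the textbook formula makes both facts fall out directly: the resistance of a uniform wire is R = ρ·L/A, where ρ is the material's resistivity, L the length, and A the cross-sectional area. Double the length and the resistance doubles; make the wire thicker, i.e. increase A, and the resistance drops.)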
This manifests in all sorts of ways - from people not being there when you need them the most, to friends dying off as soon as proximity changes, to how and why people get promoted in jobs. This isn't necessarily bad, but if you don't know how to navigate it, it can be quite painful and confusing.
2. Representation matters.
I knew this for a long time, but it didn't fully click until years had gone by and I realized I had unconsciously held myself back from pursuing a wide range of things because I just didn't see anyone like me there.
3. Rules in life are just constructs that we as humans have created.
Starting a business helped the most on this one. That's when I started to see that "rules" or "procedures" are all made up and exceptions can always be made.
(Edit: typos)
For those with anxiety, one of the best pieces of advice I ever received (that also took me years to internalize!) was the corollary to this: nobody is thinking about you in a critical way, at the level that you are criticizing yourself, because they are their own main character. Which is incredibly freeing, because the anxious person’s assumption that one embarrassing moment will turn into other people obsessing over your failure is absolutely nonsensical: the only person they are obsessing about is the main character to them, themself.
I think there are shades of this, which is to say the level of judgement can vary.
I feel that in a smaller city my social anxiety is not as strong as when I’m in a larger city.
In larger cities, I feel much more evaluated and judged, i.e. eyes on me, looking at my clothes, checking me out, etc., despite it being a much more detached and impersonal environment than smaller cities, where I just feel ignored. Maybe it’s just having more people around that puts me more on edge.
> the only person they are obsessing about is the main character to them, themself
seems obvious in retrospect.
“The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.”
That's why "rules are just constructs we humans created" rings stupid/horrifying. Obviously they were created by humans but that is also why they matter. I prefer "rules were written by humans, mostly in blood". It took years for OP to get the first part, here's to them getting the second part faster.
One family doing something wrong to another would get them into an infinite loop of mutual revenge--until we got around to creating rules and putting up a trusted authority precisely to ensure justice without the need for vengeance. Rules are how our civilization functions, and its only hope.
Those are some of the oldest and most important ones
Anyway, I'd be interested to hear more about the psychology of this.
I remember when I was growing up in the 90s and 00s in California, people talked about race way less than they do today. When ethnic representation became a common topic of conversation, I had a hard time believing it at first, because it seemed so self-evidently obvious to me that race wasn't a particularly important characteristic of a person. I actually had the experience of thinking back to my time in jr high/high school and thinking "wow, that friend of mine had dark skin, and they weren't from India... I guess they were Black, huh".
I'm not trying to claim that I didn't have subconscious biases related to race as a kid. I'm sure I did. But I do suspect they have become a lot more severe as a result of people talking about race so much -- it has become a much more salient characteristic. (I'm also more aware of trying to mitigate my biases and avoid microaggressions and so on, of course.)
So yeah, I'm curious to compare notes with other 90s kids in this regard. I'm white, but if I were Black, I imagine that I'd be way more self-conscious about it now than I was when I was growing up. (Like, if I'm the only white person in a group, I feel self-conscious about it now in a way that I didn't feel when I was a kid.)
There's plenty of evangelicals of every race, but overall the communities are very segregated. It was clear no matter their professed faith, in practice the community I grew up in was hostile to PoC.
This makes me curious. Would you mind answering, from your point of view, why do you think that is? Is it a specific scenario or type of scenario you wish to avoid or is there a generalized concern that comes with it? Is that due to uncertainty or past experience?
Although it's a small sample size, despite their ambitions the BIPOC people I know haven’t been able to reap any professional benefits from it, whether in access to executive roles, getting taken seriously by venture capital firms, or their attempts to join venture capital firms. There is a level of discretion in these team-forming situations that is not extended to them; whether or not it has anything to do with their race, it’s pretty clear the upward mobility is not coming from this credential.
For people with their own capital and leverage, it amplifies their ambition if they want. BIPOC don't really have this.
The “average salary” of MBA alumni is not what is interesting about getting one, for me or for them.
Some, or more, examples to the contrary would make it seem less like a total waste of time.
It is hypothesised that the reason behind this is because chess is a boys’ club, so to speak, and thus there is not a lot of representation.
Absence of representation means that it may seem that you are the only one doing XYZ, which in and of itself can be terrifying because we often feel that the odds are stacked against us (which is a self-fulfilling prophecy), or that we are held to much higher standards than others.
Personally, I enjoy seeing diverse representation even if I am not represented. I want people to not be afraid to pursue their dreams and goals, and I don’t want implicit prejudices due to lack of representation either.
> Starting a business helped the most on this one. That's when I started to see that "rules" or "procedure" are all made up and exceptions can always be made.
This is a big one that I learned through the same experience. Everything is arbitrary, the rules are made up and the points don't matter.
It made me appreciate those who recognize this, and in turn treat others well, even when shit hits the fan metaphorically, and I have absolutely zero tolerance for bullshit hoop jumping and assholes. I know you don't have to play by that book, so I won't, and I'm happier for it.
Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centredness because it’s so socially repulsive. But it’s pretty much the same for all of us. It is our default setting, hard-wired into our boards at birth. Think about it: there is no experience you have had that you are not the absolute centre of. The world as you experience it is there in front of YOU or behind YOU, to the left or right of YOU, on YOUR TV or YOUR monitor. And so on. Other people’s thoughts and feelings have to be communicated to you somehow, but your own are so immediate, urgent, real.
David Foster Wallace
A corollary to this is that human rules aren't like programming rules: the words that make up the rules get interpreted by a human.
One thing that this means is that you can't "hack" a human rule by picking the semantic meaning of a word which works best for you. You have to actually convince the arbiter of that rule that they agree with your meaning. If they don't, and they have a hundred years of legal ruling behind them that they've read and you haven't, then you're screwed.
And good human rules usually do have exceptions to them ("yelling fire in a crowded theater" being the most well understood). This is also why "the exception that proves the rule" is not a stupid saying.
And this is a feature and not a bug. The worst rules we generate are usually the ones that require the human arbiter to be rigid and mechanical. That tends to produce injustices like "three strikes you're out" and "mandatory minimum sentencing" (and any attempt to make the handball rule in soccer/football objective just winds up making it worse).
So when you explain something, 10 people hear 10 slightly different things as their own experiences and biases, and even hopes, interpret your statement.
That’s why being able to communicate accurately, clearly, and concisely is a very difficult and important thing. If you can do it tailored to specific groups and with humour, bonus points.
> But for example, seeing that a philosophy major can have a successful computer science experience
Representation is literally about people seeing themselves. If you can't nail down what "people like me" look like when it comes to representation, it might be that representation isn't quite the right way to frame the problem. It's only one aspect of diversity.
We are more likely to pursue careers or interests if we know someone (and are friends with them or related to them) that is interested in the same thing or has had experience in the same thing.
Your comment reminds me a bit of something I heard years ago from a manager at Facebook claiming that the way to solve imposter syndrome was to have people select which teams and projects they worked on because they would be more motivated to work at it. Totally off from what my experience had been, but likely applicable to some people.
However, I'm not saying you are wrong and I'm right. I think what's surfacing here is that what I posted doesn't apply to everyone and what you are saying doesn't apply to everyone either. It likely helps different types of people in different situations.
To me, that would be the other way around: a chance to stand out and be one of a kind, act in a way nobody else had acted before, and reap the benefits of being unique, or the first at least.
(Seeing this as exciting or frightening probably depends on your level of sociopathy.)
This is a great attitude to have. I know that not everyone has this ability (yet) to function this way, but it really should be the goal of how to operate.
For me, it took years to get there. Variety of reasons, but I realized much later in life that I was still carrying around traumas of being targeted by random strangers at a young age just by being out in public (yes, just walking down the street). That sort of stuff can erode the psychological safety that we bring with us, and unknown situations can cause us to react rather than respond. This is one of the reasons I think representation matters. Not just that it creates a mental model of what's possible, but proof (hopefully) that it's also safe and allowed.
Strange, I had the opposite experience. I realized the only thing keeping me from doing things was myself. There are definitely real barriers (hiring quotas, affirmative action, etc.) but without artificial constraints the only thing stopping you is you. You might feel a little uncomfortable but that's something easily overcome - and almost like a superpower when you realize you can overcome an external locus of control.
With that said, some form of representation helped me greatly with it. It doesn't need to be an exact match, but for me it needed to be enough to make me break my assumptions and see whatever weird walls I had put up in my thinking.
Being a white male but simply being unique in coming from a poor community in a poor city was enough for me to inflict a lot of unnecessary pain on myself through undergrad by seeing myself as different from the majority of elites in my program.
To be a plumber, yes.
To be a Doctor, kind of.
To be anything really competitive, not really, no.
There is a reason startup and regular CEOs are way over-represented among people from upper-middle-class families, and not the ultra poor.
If you don't grow up playing Golf (expensive), it's highly unlikely you're going to the PGA.
For the poor kids to even fathom they could do something, they need to be exposed to the concept in a material way, on the whole. Obviously it's not always the case but representation 'is a thing'. And of course it can be way overstated in importance in many cases.
It clicked for me when I let my publicist go wild and she got me on listicles of BIPOC founders, before, we just had lots of quotes and interviews. Only people already interested in the project on its own merits were following along. After, there were lots of people that are interested in the representation in that kind of niche who otherwise just wouldn't know how to find that representation. Or just wouldn't be able to tell by founder names alone.
and of course we got the amplified engagement from people arguing about “why does race matter” in the LinkedIn comments. so shoutout to the useful idiots, publicists expect that to exist and calculate it.
But for example, seeing that a philosophy major can have a successful programming career, can encourage others in the humanities to see themselves do it too.
Or seeing someone with ADHD run a business successfully and overcome executive function challenges, can also help others with ADHD.
These are great call outs. The definition for representation can be highly nuanced and very personal. This is not to say that issues like race and gender aren't important and don't need support. Just that there are so many things that people bring to the table that can make them feel like "the other" which holds them back. It's surprising how lack of role models can lead some people to think it's not feasible or not even possible.
As a well represented person I can tell you this has nothing to do with representation; it's just that the vast majority of humans in the modern world have close to zero agency and/or don't think they can actually change things.
As a well-represented person, you're probably making assumptions about the effects of representation that are less informed than you realize.
I was a total STEM math nerd in school. I used to frequently complain that I didn't get the point of it, or that it was a waste of time and I was learning nothing. I still think the emphasis of school was off, but I get the point of it now.
Stories are like code for humans. You can't tell someone what it means to be good or bad, or give them a course in philosophy, and expect they will become good people. But you can tell them a good story, one that engages them emotionally, and it will change their perception. And history shows that the stories being told and repeated aren't just an interesting minor curiosity: they have shaped the direction of humanity and they are driving it. A single person with a single story can change history in such a way that it would be completely different without it. And some stories about stories need to be told as a warning so that people will not fall for those kinds of stories again.
If they instead taught students easier books in schools, students would never develop the reading skills to tackle the classics, and a much smaller percentage of adults would ever bother rereading the classics or even acknowledge their power.
Of course, the teaching has to be improved so that students never hate it.
What's the point? Math has a clear purpose. I do enjoy some books. I just didn't get the point of studying them beyond reading. What are they babbling about? Is there any use for this information, or is it just torture by memorization? Why would I ever care about all these abstract literature terms?
Ironically they missed telling the meta story: why would anyone care about stories.
If stories are just something you read for fun, why would anyone care to teach me how to analyze them?
EDIT: To be clear, these were the questions that weren't answered back then.
Only now, years later and with life experience, is the purpose clear.
I also remember reading 1984 on my own time in high school; it wasn't in the curriculum the year I might otherwise have read it for class. It blew me away. But if I'd been forced to read it, I probably would have been bored.
Learning is one thing, being tested on facts and trivia is totally something else.
Then one day the teacher didn’t want to teach and instead showed an episode of Connections, and I was blown away. Learning about how and why our science and technology became what it is was something I could relate to, and it seemed actually useful. I still don’t care for military history, though.
https://archive.org/details/ConnectionsByJamesBurke
Is that actually true? Do we have good reason to believe that people who study history/literature behave more ethically?
Really depends on HOW you learn it IME. If it's just regurgitating dates/names/whatever it isn't helpful at all, at least for me. If you establish that event x led to y because of z, it just clicks and suddenly makes sense.
For example, let's take Hitler's rise to power: "He became the chancellor of Germany in 1933." That is just about useless. "Hitler rose to power with the help of the Nazi Party, which was partially formed in response to the Treaty of Versailles' excessively harsh terms, which led to an extreme amount of inflation and a harsh drop in industry. This set the stage for Hitler arguing for war as a means of getting rid of the penalties of Versailles and bringing Germany out of the slump."
For me, in school I was mostly taught the first variation.
One thing which held me back for a very long time was not following up with people who didn't show much interest initially.
I wasted so many good leads thinking it was impolite to follow up with people after contacting them once. My whole life changed once I understood the power of follow-ups and understood that most people are so busy that it takes at least 6 reminders before they will take any substantial action.
The reverse is also true. People say a lot of things, and most of the time you never reach the bridge, let alone cross it. Nowadays I rarely argue about anything and don't act on stuff until a person reminds me once or twice. This small filter can be like a miracle for saving your time and energy.
Most people are only willing to do something after repeated attempts because they're polite and want you to stop. If a website asks you whether you want cookies, you only click "yes" because the alternative is more work. If you constantly pester and threaten someone long enough, you'll always get a "yes" at some point. But that's not consent, which is why it's illegal.
Even if lawyers cost money, if I'm just expensive and stressful enough to stop people like you from sending even a single unsolicited message, it's worth it. Time is the most valuable resource we humans have, you've got no right to waste other people's.
(The point of the parable, as I understand it, is not "One Weird Social Engineering Trick That Will Always Get You What You Want!!!11" but rather that we should persevere in prayer and petitions to God because, if even the unjust judge was eventually moved, how much more likely is our heavenly Father to grant us our heart's desire because he isn't unjust like the judge. This is an example of an "a fortiori" argument, https://en.wikipedia.org/wiki/Argumentum_a_fortiori , which are fairly common in scripture.)
The principal element in sales is asking for the order. Everything, and I mean everything, follows from that. If you had been trained this way, following up would be second nature.
My grandmother was a terrible narcissist. I loved her dearly and she had a lot of wonderful qualities, but the quality that stood out the most, sadly, was narcissism.
My mother was also a narcissist, to a somewhat lesser degree. It didn't occur to me that I too was a narcissist until I was about 35 years old. It took waking up in the corner of the living room in my friend's one-bedroom apartment early one morning to see it.
I had pushed away my wife and kids because in my mind all of my problems were their fault. I had blamed others for everything that had ever happened to me and every feeling that I had felt. And in that moment I realized:
It's ME.
Everything changed in that instant. It was no longer just about me anymore. I stopped seeing the people closest to me as opponents and started seeing them as what they were, family. My support system. The love of my life.
As the years have gone by since then I have seen more of my past through that light and things have become so much more clear.
Understanding that my grandmother was a very damaged person who turned to narcissism to deal with it, and then raised my mother similarly, helped me understand two things. The first was that the things I blamed myself for in the past weren't my fault. Secondly, it helped me forgive them for some of the awful things that happened. I'm not saying it's okay to be a narcissist. But recognizing that their narcissism affected my life, and that it was something I could shed from my own personality, was a serious life changer. And the funny part is that after I realized all of this, my debilitating depression essentially went away. And that was a big deal.
I also learned not even 2 years ago that I have ADHD which was like a light bulb moment for me as well because it explained so much of my life.
it seems more likely that there is a genetic/biological basis for this personality trait
Importantly, the skills can be learned as an adult; it's just hard to do. I come from a long lineage of family members who did not know how to swim, but I don’t think that there’s a genetic basis for this.
Random theory: Children of narcissists like IT because it's a world that is very predictable with rational explanations.
Thank you for sharing.
Please continue to contribute here on HN, I really appreciated your comment and experience.
I suggest you look up the difference between the common sense of the word narcissism and the psychiatric diagnosis (which has a more rigorous and specific definition).
Things like Repositories, Aggregates, Bounded Contexts, and so on are going to be a net drag on your system if you only have a few hundred kloc of code in a monolith. But they really start to shine as you grow beyond that. Bounded Contexts in particular are a gem of an idea, so good that Uber re-discovered them in their microservices design: https://www.uber.com/blog/microservice-architecture/.
(Edited to clarify the book author)
Both books are great. Read whichever one aligns with your practice best.
I recommend Wlaschin's book for anyone curious about FP, without hesitation. He's great at explaining things from first principles, without veering off into "monad tutorial" territory.
https://pragprog.com/titles/swdddf/domain-modeling-made-func...
An easier introduction I recommend is InfoQ’s “Domain Driven Design Quickly”, available in print or a free PDF ebook: https://www.infoq.com/minibooks/domain-driven-design-quickly...
- Aggregates are too heavy. You need to make the decision about what is or is not an aggregate way too early in the design process. Boundaries are fuzzy.
- Actual concepts don't exist in nicely packaged bounded contexts. Concepts overlap a lot. You need to make the decision about which concept fits into which bounded context too early in the design process. Boundaries are fuzzy. Things are kinda like other things. The definition of "Employee" is not the same in the Scheduling context as the HR context as the Payroll context, yet they do overlap a lot, and you can't just treat them as completely separate things. If you break everything down into tiny contexts to deal with this, you just make Contexts and Aggregates the same.
- Repositories are not original to DDD and I think are very likely to foster absolutely horrific SELECT N+1 or even SELECT N^2 or N^3 performance. You simply can't let one bounded context do all its expensive operations in a vacuum; not when you have lots of contexts and lots of operations. In a complex system, most parts need to be planning, not doing. The results of most operations should be a plan that you can compose with other plans, analyze, and possibly even have an "optimization pass" if you need one.
- Ubiquitous language is the right idea. If you take nothing else from DDD, take this.
I've not found that Aggregates need to be designed at the beginning; I've found it works fine to define an Aggregate after you start seeing performance issues with query patterns (i.e. define an Aggregate and forbid direct access to sub-objects when you see pathological access patterns/deadlocking).
Personally I've found Repositories to be a good way of enforcing that N+1 queries DO NOT happen. For example, in Django you can have the repository run select_related/prefetch_related and `django-seal` on the ORM query to forbid unexpected queries. This somewhat neuters the flexibility of the ORM, which can be a big cost, but lets you build much more restrictive queries that are guaranteed to perform well. It's a trade-off and I don't think it's a clear win for every use-case, but particularly when dealing with Aggregates I think having a limited number of ways to query is beneficial. (This might mean you're running some sub-optimal SQL queries, over-fetching etc., but for most line of business applications, that's actually a viable trade in exchange for simpler domain interfaces and protection against N+1 queries.)
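To make that concrete, here is a rough sketch of the shape being described, in Python/Django. The Order/Customer models and the app module are invented for illustration (they are not from the thread), and django-seal is only mentioned in a comment since its API isn't shown here:

    # Sketch only: assumes a Django project with hypothetical Order/Customer models
    # (Order has a "customer" foreign key and a "lines" reverse relation).
    from dataclasses import dataclass

    from myapp.models import Order  # hypothetical app and model

    @dataclass(frozen=True)
    class OrderSummary:
        id: int
        customer_name: str
        line_count: int

    class OrderRepository:
        # The only place allowed to build Order queries, so the eager loading is
        # baked in and N+1 access patterns can't be written in domain code.
        # (django-seal could additionally be used to raise on lazy access.)
        def list_summaries(self):
            qs = (
                Order.objects
                .select_related("customer")    # one JOIN instead of a query per order
                .prefetch_related("lines")     # one extra query for all the lines
            )
            return [
                OrderSummary(o.id, o.customer.name, len(o.lines.all()))
                for o in qs
            ]

Domain code only ever sees OrderSummary values, which is the trade described above: less ORM flexibility in exchange for predictable query counts.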
Regarding "planning" vs. "doing", that seems to fit quite well with doing most of your work in POJO/POPO domain models, then only rendering those into DB writes at the edges. I think Repos can help with that. (IME you get N+1 selects when you use ORM models that abstract away the SQL queries and have application code interacting directly with ORM models; if you remove the ORM from your core domain logic and force it to live at the edges in Repos, this is not possible.)
I've had the exact same complaint. I think there's a lot of great stuff to take away from DDD -- I also hear myself frequently making the same point about its ubiquitous language -- but going all-in on some of its concepts, especially early in a project's life, is probably always a mistake that may well end in disaster.
I remember what made it click: I was designing an animation system, which had a bunch of different interdependent moving parts. Once I started treating each part like an object and letting it manage its own state it all just clicked. I started with this massively complex functional-like system that managed four or five different motions, but once it was broken into objects most of the code just fell off and it became a nice clean system.
I was super proud of it at the time, but it's pretty bad by my current standards.
You could build a chatbot that supports Discord, Slack, and IRC dynamically at runtime, or a web app that can use multiple different database engines, or a social network with multiple types of posts that can all be rendered in a main feed, or a bunch of other things. In all of these cases you can also take advantage of this kind of dynamic dispatch to inject mock objects for testing, as well as theoretically have an easier time swapping out a layer if you want to change a dependency.
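As a rough illustration of that kind of dynamic dispatch (class names and the print-instead-of-API bodies are invented, not from any real bot):

    from abc import ABC, abstractmethod

    class ChatConnection(ABC):
        @abstractmethod
        def send(self, channel: str, text: str) -> None: ...

    class SlackConnection(ChatConnection):
        def send(self, channel: str, text: str) -> None:
            print(f"[slack] #{channel}: {text}")   # a real bot would call the Slack API

    class IrcConnection(ChatConnection):
        def send(self, channel: str, text: str) -> None:
            print(f"[irc] {channel} :{text}")

    class FakeConnection(ChatConnection):
        def __init__(self):
            self.sent = []
        def send(self, channel: str, text: str) -> None:
            self.sent.append((channel, text))      # lets tests assert on what was "sent"

    class Bot:
        def __init__(self, conn: ChatConnection):
            self.conn = conn                       # injected, not created inside the bot
        def greet(self, channel: str) -> None:
            self.conn.send(channel, "hello!")

    backend = {"slack": SlackConnection, "irc": IrcConnection}["slack"]()  # picked at runtime
    Bot(backend).greet("general")

The Bot never knows which backend it has, which is exactly what makes swapping in FakeConnection for tests trivial.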
What really frustrates me is that almost none of the OOP instruction I've encountered ever showed these kinds of real, practical examples. They always over-emphasize harmful inheritance-based approaches and never really explain composition. In college we learned about OOP by building a little text-based RPG thing with different types of monsters implemented as subclasses of an abstract base class, which left me feeling like there wasn't much practical use for it outside of game development.
It wasn't until my first internship that I saw a real-world use of OOP, in the form of a giant Spring Boot monolith with tons of dependency injection. Eventually, after staring at that for a few months, OOP finally clicked for me, but I still find it annoying that nobody ever tried explaining this using practical, small-scale examples.
So true. The canonical example used by OO pundits is often a "Person". A Person can be either a Contractor or an Employee (inheritance).
Having spent a lifetime in HR systems, there really couldn't be a worse example. It turns out that people may become contractors or employees, and then change back again, leaving the company and then returning. Some people may even hold the role of contractor and employee contemporaneously. Any crazy shit is true for humans. Instead of using simple OO classes to model people, IRL you end up with many, many tables that capture their lifecycle.
Good OO candidates/examples are usually non-real-world things. Window managers and windows. Programming concepts. That kind of thing, where the rules are simple and fixed at a point in time.
> You could build a chatbot that supports Discord, Slack, and IRC dynamically at runtime...
As a counter-point, OOP isn't necessary for dynamic dispatch.
None of these dynamic features you describe are necessarily any harder in non-OOP languages — or easier in OOP languages.
You can do the same kind of dynamic dispatch in Elixir, Go, etc.
OOP really is a preference, not a differentiator.
Source: I cut my teeth on OOP and used it for many years. Been using Elixir for the last half a decade, and have zero loss of ability to do things like this.
The issue was that anything they taught had to be dumbed down to the point that it fit in a classroom, and they basically stopped there. We also never wrote code that would be looked at by more than one person. I didn't "get it" until I spent years in a real job, with systems complex enough for this to matter, and practical examples like Dependency Inversion etc. as you say. There was also a complete lack of Composition and (at the time I did it) of multiple inheritance, which is a bit more controversial these days.
Our traditional Comp Sci course also worked on a variety of languages but nearly zero UI or web programming back then, which are both natural fits for OOP that we largely ignored.
A class Animal is inherited by a Dog class that barks. That's about the total opposite of any practical example, and it leaves people confused about how this is even useful.
Never forget who is in charge. Deferring to a style dodges responsibility for your choices.
At small code sizes, OO does not really have any apparent advantages to doing everything imperatively or in a hacky way. In fact I’d argue it tends to overly complicate and obfuscate things. But there is a point at which you really start wanting OO instead because you can no longer reason about the binary as a whole, and need to start thinking in terms of interfaces with separation of concerns. Even if you did understand the whole state, there are too many people working on it to keep up with changes to the whole state in a way that lets you reason about the thing as a whole.
At my first job (and in college) code never reached such levels of complexity, so OO always seemed like some dumb fad to make things more complicated than necessary. I still think that is often the case, but now I absolutely see the benefits when they present themselves.
It's common for a lot of online resources (I don't know about university) to provide introductions to programming using OOP, which I think is a terrible mistake. It fills beginners' heads with all sorts of incorrect, fanciful ideas about how computers work and what happens when a program is executed. It's also difficult to know how to use OOP concepts and structures correctly without knowing why they're convenient. I strongly believe you need a good understanding of the technical motivations for such constructs before you can actually use them with good effect--otherwise you're simply relying on imitating patterns or taking certain things as foundational when in fact they are not. I think functional languages and strictly imperative languages are far more appropriate for a first programming language.
If I were to teach programming, I'd start with straight up assembly. Give people a taste of what the cpu is actually doing under all your abstractions. Probably a few weeks, enough for hello world, fibonacci numbers, and an intro to branching. Then introduce C, get used to pointers and basic high-level language concepts. Really hammer in the idea of thinking like the CPU and being mindful of the resources you're using.
From there, guide them towards building a real application in imperative C or C++. After that the overarching theme is rewriting the application in an object oriented language, with attention to why and when you should and shouldn't use these new tools.
IMO, understanding the fundamentals of how a cpu works is absolutely essential for writing good code at any level of abstraction, and it seems a lot of new programmers are missing that.
It "clicked" much later, in two stages, using languages that were "kind of like" OOP, even if not rigorously so: LabVIEW, HyperCard, Visual Basic. I think VB had a decent strategy for introducing OO to the rest of us. Out of the box, it was "object based," meaning that you could use classes that had been created by someone else. For a bit more money you could buy the version that let you do full OO, but I never did that. But by being a user of objects, it gave you an idea of what you'd want if you could create them for yourself.
Nowadays of course people range from being bullish to bearish on OO, and I've had the experience of doing it badly and making a mess of things, when a procedural or functional model would probably be better.
Kind of a lesser issue is that I finally grasped how to work with a modern OS after laying my hands on the first couple volumes of the Win32 programmer's manuals, which I think were vastly less forbidding than Inside Macintosh.
What books exactly are you referring to here?
Of course, if you want to go the whole polymorphic route, I'd suggest using something like C++, but the key ideas are really structures, and functions that operate on those structures.
The hardest bit is knowing when to stop; aiming for the RAII sweet spot in C++ is the goal, not AbstractBeanFactoryFactoryBuilder().
I still frequently see code examples online that are written using classes that don’t need to be. I imagine this is the reverse problem: people from languages like Java thinking everything has to be in a class.
It feels like something used an awful lot in ways that don't really add any value. But again, it must just be my lack of understanding. Maybe one of these days it'll click in.
Moving the beam endpoint required an animation to bring the width to 0, change the endpoint, then ramp the beam width back up. The sprites each have their own animation in different conditions, and moving them along the beam is another animation.
The real key for me was the sprites. I built a class which takes in the start and end points of the beam and an update method. Once you start the sprite up, you just have to poke it every frame with the time delta and it manages all of its own animations.
Likewise, the laser itself was an object that managed its own animations and the sprites related to it.
This resulted in a bunch of objects with just a little code in them. But because the concerns are separated, it results in less code overall than it would take to manage everything all at once.
Is this the best, or even a good way to do this? Probably not, but it was a very beautiful solution in the moment
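Something roughly like this shape, perhaps (a sketch with invented names and numbers, not the actual project code):

    class BeamSprite:
        # Owns its own animation state; the game loop only "pokes" it each frame.
        def __init__(self, start, end):
            self.start = start
            self.end = end
            self.t = 0.0                         # progress along the beam, 0..1

        def update(self, dt):
            self.t = (self.t + dt * 0.5) % 1.0   # advance internal animation

        def position(self):
            (x0, y0), (x1, y1) = self.start, self.end
            return (x0 + (x1 - x0) * self.t, y0 + (y1 - y0) * self.t)

    sprites = [BeamSprite((0, 0), (100, 0))]
    dt = 1 / 60
    for _ in range(3):                           # stand-in for the game loop
        for s in sprites:
            s.update(dt)
            print(s.position())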
I'm not sure if it was years, but it wasn't immediate. I just didn't understand why dependency injection was good at first, and not just someone's weird personal code style choice.
I thought it was just people being "Enterprisey" which I'd encountered many times over the years.
Once I committed to unit testing, I realized how necessary it is.
Unfortunately I still encounter customers who haven't bought into it and as a result have untestable code. It's so hard to go back and retrofit testing.
In many cases mocks are now overused where previously, say in 2008, they were important. Especially now, with languages that support functions as objects, better generics, and other features which weren't common a while back. Likewise, frameworks and languages are generally way more testable now, which means you're doing fewer backflips like statically injecting wrappers for DateTime.now into libraries to make tests work. This further allows more contract testing and less implementation-specific testing.
As with most things there is a lot of nuance/art to doing it well and smoothly
When I code review, I try to make sure I call out "fake tests".
But that's a long way from glorious full DI containers where you never call 'new' in your code anywhere and all object creation can be dictated by config. I suspect that must be the kind of thing only needed by people who maintain 1,000,000-line codebases at the center of massive bureaucracies.
Just vanilla C++ classes, and virtual interfaces if we need to mock things for unit tests.
No automatic wiring of the hierarchy.
The idea that I take advantage of good abstractions and I send those objects into my classes that need to perform actions via those abstractions made a lot of sense. Helps enable good polymorphism, as well as unit testing and other things.
I don't think I'm doing it justice, but it took a good while to understand the reasons behind the idea. Some books that helped me grok it were
- Patterns of Enterprise Application Architecture
- Clean Architecture
- Architecture Patterns with Python
along with running into problems that could be easily solved with a decent abstraction at work and learning to apply it directly.
I always hated testing and I still do, but every time I commit to doing it right I catch so many errors before QA.
The term is unfamiliar to me -- is it related to "fault injection"?
Say class A creates an instance of class B internally. But this makes testing A in isolation difficult. When testing A, you want to mock out B with an instance the test can manipulate.
So we want A to not create B; instead we want B to be "injected" into A. The general strategy of having B passed into A is called dependency injection.
If you've ever written a constructor for a class that has arguments (within the constructor signature) that are used by the instance of the class when instantiated, then you have done dependency injection, or, put more simply, "passing stuff in", as was eloquently stated in another comment on this thread.
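A tiny sketch of that A/B shape (only the class letters come from the comment above; the methods are made up):

    class B:
        def fetch(self):
            return "real data"

    class FakeB:
        def fetch(self):
            return "canned test data"

    class A:
        def __init__(self, b):
            self.b = b                  # B is passed in ("injected"), not created inside A

        def run(self):
            return self.b.fetch().upper()

    assert A(FakeB()).run() == "CANNED TEST DATA"   # the test controls the dependency
    print(A(B()).run())                             # production wiring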
it makes the code cleaner and testable.
This way, your code isn't bound to a specific implementation.
Some DI frameworks even go so far and define all resources in a config file. This way you can switch out the implementation without a recompilation.
https://hasura.io/blog/build-fullstack-apps-nestjs-hasura-gr...
Dependency Injection solves the problem of when you want to create something, but THAT something also needs OTHER somethings, and so on.
In this example, think about a car.
A car might need many separate parts: an engine, wheels, and so on.
We can manually construct a car by constructing each part ourselves and passing it in, but this is tedious and fragile, and it makes it hard to be modular. With dependency injection, you register a sort of "automatic" system for constructing an instance of "new Foo()" that continues down the chain and fetches each piece.
And then "class Car" would have an "@Inject" in its "constructor", and so on down the chain. When you write tests, you can swap out which instance of the "@Injected" class is provided (the "dependency") much more easily.
The abstract concept of OOP (messages between complex objects, as defined by Alan Kay) is an attempt at mimicking biological systems. Most modern languages implement data abstraction, but call it OOP, where they encapsulate some functionality with the data it operates on. It really helped with varying data formats in the Air Force in the 60s, apparently. There isn't anything wrong with this abstract concept either - it's a way of structuring a solution, with trade-offs.
Support for unit testing and mocking has little to do with OOP, and everything to do with the underlying platform. Both C++ and Java, for example, do not have a special runtime mode where arbitrary replacement of code or data could occur. This is necessary for mocking functionality that is considered an implementation detail and hidden by design. The hidden part is great for production code, not great for testing.
For example, if an object in java has a field like 'private final HttpClient client = new CurlBasedHttpClient();' this code is essentially untestable because there is no way in Java to tell the JVM "during testing, when this class instantiates an HttpClient, use my MockHttpClient".
Kotlin fixed some of that with MockK, which can mock the constructor of a Kotlin object, and you can return your mock implementation when the constructor is invoked.
Clearly, it's a platform issue. There could be a world where you could replace any object in the stdlib or any method or field with a mock version. JavaScript is much more flexible in that regard, which is why unit testing js code is much easier.
The root of it all stems from the fact that unit tests need to change some implementation details of the world around the object, but production code should not be able to, in order to get all the benefits of encapsulation.
If you get rid of modern OOP, you are swinging the pendulum in the opposite direction, where your tests are easy to write on any platform, because everything is open and easily accessible, but your code will suffer from issues that creep up when structures are open and easily accessible, such as increased coupling and reduced cohesion.
Using a totally non-OOP functional style, you can either instantiate state within a function or pass it in from the calling context, which is the same trade-off that dependency injection targets.
However you structure it, there will always be "glue code" that ties the nice immutable code to the outside-interacting bits, and if you want to unit test those, dependency injection (with functions or state, not classes or instances) is still the way to go.
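A small sketch of the same idea without classes (names and data invented): the "dependency" is just a function, and the test passes a different one.

    from typing import Callable

    def make_report(load_orders: Callable[[], list]) -> str:
        orders = load_orders()                      # the injected, effectful bit
        total = sum(o["amount"] for o in orders)    # the pure logic stays pure
        return f"{len(orders)} orders, total {total}"

    # production: load_orders would hit a database; in a test, a stub list is enough
    print(make_report(lambda: [{"amount": 5}, {"amount": 7}]))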
The few times I had to use Java afterwards I felt the same - all OOP features were unnecessary or at least didn't feel like the most straightforward approach. Nowadays I never use classes in Python, JS etc., it's just not needed - and in the case of Python it makes JSON insanely cumbersome.
It made the saying that science advances one funeral at a time click for me. It is easier to rally people of similar thought than to change people of opposite opinion. Not impossible, just more difficult. It explains a lot of things, in my opinion.
1. It is easier to start a startup than to convince your boss to take a certain product direction, e.g. not to pursue a certain pursuit, as outlined by John Carmack's departure from Meta. The ultimate judgement will be whether YOUR idea survives rather than whether your boss buys your idea. And I prefer bootstrapping, at least for now, for that reason.
2. Never attempt to change your spouse. Find the common ground instead.
3. Empathy is mostly about experience sharing. You can't have people feel something they have never experienced before. If you can empathize, it means you have experiences to draw similarities between. Imagine teaching an 18-year-old to be a father; that's what preaching to people to be empathetic feels like.
Imo, ideas form within the spectrum of indoctrination and "epiphany".
Indoctrination is something that you hear over and over and sort of take for granted. Parent to children, school to students, religion to devotees, government to citizens, social media to the general public, Hacker News to us developers: all impart their flavors to our minds and we act accordingly. Take sleeping, for example: sleep well, sleep long enough, sleep early shouldn't elicit much debate; we treat it as truth more or less. On the other hand, dietary cholesterol consumption, which has been considered, advertised, and indoctrinated as bad for more than 5 decades, has now garnered some attention to consider otherwise [1][2][3].
Epiphany is circumstantial, event driven. Either you have a lightbulb moment, or reality decides to slap you in the face. Having a close friend/relative die at a young age will either make you treasure life a lot more, or send you into deep depression. But it will change you forever for sure. On the other hand, again taking sleeping as an example: I had been a night person since my teenage years, habitually sleeping at 3am or 4am, feeling drowsy in the morning and studying/working late in the evening. "I prefer working in silence" was the excuse I gave my then self. One day it dawned on me that there are only 24 hours in a day; if I can get things done at 1am before sleep, I can equally get things done at 6am after sleep. And if I can rest well, I can get things done quicker. No new information, just one day I viscerally felt that shifting my biological clock earlier makes more sense.
1. https://en.wikipedia.org/wiki/Ancel_Keys
2. https://en.wikipedia.org/wiki/Seven_Countries_Study
3. https://en.wikipedia.org/wiki/Robert_Lustig
You can have an impact there. But you can't cause immediate change.