I tend to advocate for people to study design patterns. Not because you will necessarily ever use most (or even any) of these exact patterns in your software, but because doing so strengthens your mental muscle for software design in general. I encounter lots of engineers who simply aren't able to think "outside the box" when building something new.
Personally, I prefer to learn FP patterns, which tend to be backed by nice mathematical properties.
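To give one concrete example of a mathematical property doing real work (a minimal Java sketch of my own, not a pattern from any particular catalog): associativity plus an identity element, i.e. a monoid, is exactly the guarantee that lets a fold split up its work and recombine it in any order, including in parallel.

    import java.util.List;

    public class MonoidDemo {
        public static void main(String[] args) {
            List<String> words = List.of("map", "filter", "fold");

            // String concatenation is a monoid: "" is the identity and
            // concat is associative. That law is what makes it legal for
            // reduce() to group the work however it likes, even in parallel.
            String sequential = words.stream().reduce("", String::concat);
            String parallel   = words.parallelStream().reduce("", String::concat);

            System.out.println(sequential.equals(parallel)); // true, guaranteed by the law
        }
    }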
I appreciate that Go tends to avoid making limiting assumptions about what I might want to do with it (such as assuming I don't want to return a value whenever I return a non-nil error). I like that Go has simple, flexible primitives that I can assemble how I want.
And all of those programmers are either using specialized languages (and suffering when they want to turn their program into a shitty web app, for example) or committing crimes against syntax like
rotation_matrix.matmul(vectorized_cat)
You don't even need constructs like that in most native applications, embedded systems, or OS kernel development.
What would happen from time to time was that an important reason did come up, but the team was now many releases behind. Whoever was unlucky enough to sign up for the project that needed the updated dependency now had to do all those updates of the dependency, including figuring out how they affected a bunch of software that they weren't otherwise going to work on. (e.g., for one code path, I need a bugfix that was shipped three years ago, but pulling that into my component affects many other code paths.) They now had to go figure out what would break, figure out how to test it, etc. Besides being awful for them, it creates bad incentives (don't sign up for those projects; put in hacks to avoid having to do the update), and it's also just plain bad for the business because it means almost any project, however simple it seems, might wind up running into this pit.
I now think of it this way: either you're on the dependency's release train or you jump off. If you're on the train, you may as well stay pretty up to date. It doesn't need to be every release the minute it comes out, but nor should it be "I'll skip months of work and several major releases until something important comes out". So if you decline to update to a particular release, you've got to ask: am I jumping off forever, or am I just deferring work? If you think you're just deferring the decision until you know if there's a release worth updating to, you're really rolling the dice.
(edit: The above experience was in Node.js. Every change in a dynamically typed language introduces a lot of risk. I'm now on a team that uses Rust, where knowing that the program compiles and passes all tests gives us a lot of confidence in the update. So although there's a lot of noise with regular dependency updates, it's not actually that much work.)
Meanwhile, my recent legacy Java project migration from JDK 8 to 21, along with a ton of dependency upgrades, has been a pretty smooth experience so far.
My listing of similar games:
- Grim Omens 3/5
- Revengate 4/5
- Dungeon Crawl Soup (on PC) 4/5
- The Greedy Cave 5/5
- Pathos 4/5
It's a bit snarky of me, but whenever I see some web developer or product person with a strong opinion about AI and its future, I like to ask "but can you at least tell me how gradient descent works?"
I'd like to see a future where more developers have a basic understanding of ML even if they never go on to do much of it. I think we would all benefit from being a bit more ML-literate.
> I'd like to see a future where more developers have a basic understanding of ML even if they never go on to do much of it. I think we would all benefit from being a bit more ML-literate.
Why "ML-literate" specifically? Also, there are some people against calculus and statistic in CS curriculum because it's not "useful" or "practical", why does ML get special treatment here?Plus, I don't think a "gotcha" question like "what is gradient descent" will give you a good signal about someone if it get popularized. It probably will lead to the present-day OOP cargo cult, where everyone just memorizes whatever their lecturer/bootcamp/etc and repeats it to you without actually understanding what it does, why it's the preferred method over other strategies, etc.
What about something like `gamma`? Lorentz factor? Luminance multiplier? Factorial generalization?
Why not just spell it out in full rather than assigning it to an arbitrary name/symbol `gamma` and leaving it dependent on the context?
And it's not that hard to add an inline comment to dispel the confusion:

const tau = 2*pi; // Alternate name for 2pi is "tau"

I like the declaration side. I think part of where it misses the mark is the syntax on the caller side.
I feel like standard conditionals are enough to handle user errors, while the heavy machinery of try-catch feels appropriately reserved for unexpected errors.
And more importantly, I don't think there's any JEP trying to improve checked exception handling.
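To make the user-error vs. unexpected-error split concrete, a minimal Java sketch (the validation rule and file path are invented for illustration): user input problems flow through ordinary conditionals and return values, while genuinely unexpected failures stay on the exception path.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class ProfileForm {
        // User error: plain control flow, no try-catch involved.
        static String validateEmail(String email) {
            if (email == null || !email.contains("@")) {
                return "Please enter a valid email address.";
            }
            return null; // no complaint
        }

        // Unexpected error: a failing disk is exceptional, so letting the
        // checked IOException propagate (or be wrapped) seems appropriate.
        static void persist(String email) throws IOException {
            Files.writeString(Path.of("profile.txt"), email);
        }

        public static void main(String[] args) throws IOException {
            String problem = validateEmail("not-an-email");
            if (problem != null) {
                System.out.println(problem); // handled with an ordinary conditional
                return;
            }
            persist("user@example.com");
        }
    }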