Having worked professionally in C, Java, Rust, Ruby, Perl, PHP I strongly prefer lock files. They make it so much nicer to manage dependencies.
vs
"You can use make to ape the job of dependency managers"
wat?
Maven, by default, does not check your transitive dependencies for version conflicts. To do that, you need a frustrating plugin that produces much worse error messages than NPM does: https://ourcraft.wordpress.com/2016/08/22/how-to-read-maven-....
How does Maven resolve dependencies when two libraries pull in different versions? It does something insane. https://maven.apache.org/guides/introduction/introduction-to....
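That "something insane" is Maven's nearest-wins rule: when two libraries pull in different versions of the same artifact, the version declared closest to the root of the dependency tree wins, regardless of which is newer. A minimal Python sketch of that rule (toy dependency tree and artifact names, not Maven's actual implementation):

```python
from collections import deque

# Hypothetical dependency tree: A and B both (transitively) pull in lib C.
TREE = {
    ("A", "1.0"): [("C", "1.0")],
    ("B", "1.0"): [("D", "1.0")],
    ("D", "1.0"): [("C", "2.0")],
}

def resolve(root_deps):
    """Pick one version per artifact, nearest-wins: a breadth-first walk
    means the shallowest declaration of each artifact is seen first and
    sticks; ties at the same depth go to whichever was declared first."""
    chosen = {}
    queue = deque(root_deps)
    while queue:
        name, version = queue.popleft()
        if name in chosen:  # a nearer (or earlier) declaration already won
            continue
        chosen[name] = version
        queue.extend(TREE.get((name, version), []))
    return chosen

print(resolve([("A", "1.0"), ("B", "1.0")]))
# C resolves to 1.0 (depth 2 via A beats depth 3 via B -> D),
# even though 2.0 is the newer version someone asked for.
```

Which version you get can therefore change just by reordering declarations in your pom, which is exactly why people reach for the enforcer plugin.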
Do not pretend, for even half a second, that dependency resolution is not hell in Maven (though I do like that packages are namespaced by creators; npm shoulda stolen that).
The point is, "You don't need lockfiles."
And that much is true.
(Miss you on twitter btw. Come back!)
I really liked those charts, I wonder how you can generate them, whether there's a tool out there that you can just feed a Git repo into or something.
Well that's just slander as far as I'm concerned. Of course we other non-clojure programmers believe in backwards compatibility. What a crazy thing to suggest that we don't.
I previously worked in PHP, Perl CGI, Java, and Python: web tools, mostly based on MySQL and other SQL database flavours.
I worked in a Clojure-only shop for a while, and they taught me the ways; after that you don't go back. Everything quickly clicks into place. It's daunting to start, the learning curve is shallow, and it takes a long time to get anywhere, but it began as a fun curiosity. Then I started to hate how everything else was done, and now I've sold my soul to the Clojure devil.
I find this to be extremely "it depends": this is a normal interaction amongst software engineers, if you have mature engineers.
It's only not normal in those ecosystems (everyone knows which ones those are) which appear not to have had adults in the room during the design phase.
If you've only been developing for the last decade or so, you may think it's normal that stuff breaks all the time when software is upgraded. If you've been developing since the 90s you'd know that "upgrades bring breakage" is a new phenomenon, and the field (prior to 2014 or thereabouts) was filled with engineering compromises made purely to avoid breakage.
It's why, if I have an application from 10 years ago in React, I won't even try to build it again, but if I have an application from 1995 in C, it'll build just fine, dependencies included[1] (with pages of warnings, of course).
[1] C dependency graphs are small and shallow, because of the high friction in adding third-party libraries.
This seems a little nuts, to be honest. It feels like you're just pushing failures from easy-to-diagnose issues with the function signature to hard-to-diagnose issues with the function body. Basically the function body now has to support any combination of hash parameters that the function has ever taken over its entire history. Is this information documented? Do you have test coverage for every possible parameter combo?
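A minimal Python sketch of the pattern being objected to (hypothetical function and option names): the signature never "breaks", so every option the function has ever accepted must keep working, and all the compatibility logic piles up in the body where no caller can see it.

```python
def render(text, **opts):
    """Signature has 'never broken', so the body carries the history.
    (Hypothetical options: v1 took only `bold`; v2 added `color`;
    v3 renamed it to `fg` but kept `color` working.)"""
    # The rename lives here, invisible in the signature.
    fg = opts.get("fg", opts.get("color", "default"))
    prefix = "*" if opts.get("bold") else ""
    return f"{prefix}[{fg}]{text}"

print(render("hi", bold=True, color="red"))  # old-style caller: *[red]hi
print(render("hi", fg="red"))                # new-style caller: [red]hi
```

Every such layered option multiplies the combinations the body (and the test suite) has to cover, which is exactly the diagnosability concern above.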
Of course, normal rules apply like, "Don't pollute your program with a proliferation of booleans."
As an escape hatch, you end up doing a lot of exclusions and overrides, basically creating a lockfile smeared over your pom.
P.S. Sadly, I think enough people have left Twitter that it's never going to be what it was again.
There's a very strong argument that manually managing deps > auto updating, regardless of the ergonomics.
P.S. You're right, but also it's where the greatest remnant remains. :(