TheDong · a year ago
I think it's a bad addition since it pushes people towards a worse solution to a common problem.

Using "go tool" forces you to have a bunch of dependencies in your go.mod that can conflict with your software's real dependency requirements, when there's zero reason those matter. You shouldn't have to care if one of your developer tools depends on a different version of a library than you.

It makes it so the tools themselves also are being run with a version of software they weren't tested with.

If, for example, you used "shell.nix" or a dockerfile with the tool built from source, the tool's dependencies would match its go.mod.

Now they have to merge with your go.mod...

And then, of course, you _still_ need something like shell.nix or a flox environment (https://flox.dev/) since you need to control the version of go, control the version of non-go tools like "protoc", and so you already have a better solution to downloading and executing a known version of a program in most non-trivial repos.

jchw · a year ago
Yep, unfortunately this concern was mostly shrugged off by the Go team when it was brought up (because it would've required a lot of work to fix IIRC, which I think is a bad excuse for such a problem). IMO, a `go tool` dependency should've worked exactly the same way that doing `go install ...` works with a specific @tag: it should resolve the dependencies for that tool completely independently. Because it doesn't, you really, really shouldn't use this mechanism for things like golangci-lint, unfortunately. In fact, I honestly just recommend not using it at all...
arccy · a year ago
i think it's more an argument to not use the dumpster fire that is golangci-lint
mseepgood · a year ago
> Using "go tool" forces you to have a bunch of dependencies in your go.mod

No, it doesn't. You can use "go tool -modfile=tools.mod mytool" if you want them separate.
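A sketch of what that separate modfile could look like (module path, tool, and version here are illustrative, not from the thread):

```go
// tools.mod -- sits next to go.mod but is only consulted when the
// -modfile flag points at it, so tool requirements never mix with
// the application's own dependency graph.
module example.com/myproject/tools

go 1.24

// The Go 1.24 `tool` directive records the tool's main package.
tool golang.org/x/tools/cmd/stringer

require golang.org/x/tools v0.29.0
```

Tools would then be added with something like `go get -tool -modfile=tools.mod golang.org/x/tools/cmd/stringer@latest` and run with `go tool -modfile=tools.mod stringer`.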

jcmfernandes · a year ago
I built a simple tool to generate binstubs and work around this problem: https://github.com/jcmfernandes/go-tools-binstubs

However, having multiple tools share a single mod file still proves problematic occasionally, due to incompatible dependencies.

jen20 · a year ago
This should almost certainly be the default recommendation.
nijave · a year ago
That seems like it should have been the default
potamic · a year ago
Why do ecosystems continue to screw up dependency management in 2025? You would think the single most widely used feature by programmers everywhere would be a solved problem by now.
LinXitoW · a year ago
Go launched without any dependency management. Go people believe that anything non-trivial is either a useless abstraction, or too difficult for the average developer. So their solution is to simply not add it, and tell anyone claiming to need it that their mind has been poisoned by other languages...

Until that time when they realize everyone else was right, and they add an over simplified, bad solution to their "great" language.

herval · a year ago
Go has been a masterclass in disparaging industry learnings as "unnecessary" (dep management, generics, null safety), then gradually bolting them into the language. It's kinda hilarious to watch.
devmor · a year ago
Because everyone has their own opinion about it, like most other standards.

Personally, I think the way PHP handles dependencies is vastly preferable to every other ecosystem I've developed in, for most types of development - but I know its somewhat inflexible approach would be a headache for some small percentage of developers too.

Defletter · a year ago
> Using "go tool" forces you to have a bunch of dependencies in your go.mod that can conflict with your software's real dependency requirements, when there's zero reason those matter. You shouldn't have to care if one of your developer tools depends on a different version of a library than you.

Heh, were the people who made 'go tool' the same people who made Maven? Would make sense :P

larusso · a year ago
At least build time dependencies are separated from runtime dependencies.
nixosbestos · a year ago
I ask this out of curiosity, not accusation: do you work for Flox? Can't say I've ever seen it mentioned "in the wild".
TheDong · a year ago
I do not, just aware of it as a hip tool in the general space of "I want to install and version tools per-project, but I don't want to learn the nix programming language"
carlthome · a year ago
Hehe, I was also wondering this.
peterldowns · a year ago
I agree — tools should be shared artifacts that the team downloads and can guarantee are the same for everyone. I usually set up a flake.nix for everyone but flox, earthly, devenv, jetify, are all great alternatives. Ideally your tools are declaratively configured and shared between your team and your CI/CD environment, too.

The `go tool` stuff has always seemed like a junior engineering hack — literally the wrong tool for the job, yeah sure it gets it working, but other than that it's gross.

arccy · a year ago
well now it leans into go's reproducible tooling, so it's declaratively configured and shared between your team and your CI/CD environment too.

arccy · a year ago
control the go version with the `toolchain` directive, replace protoc with buf.build (which is way better anyway).
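For reference, pinning the Go version in go.mod looks like this (module path and versions illustrative):

```go
// go.mod -- the `toolchain` directive (Go 1.21+) names the exact
// toolchain to build with; a newer `go` command will fetch and run
// that version automatically if the installed one is older.
module example.com/myproject

go 1.24.0

toolchain go1.24.0
```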
amukher1 · a year ago
Found this a lot easier to follow. https://blog.howardjohn.info/posts/go-tools-command/

And didn't quite understand the euphoria.

mook · a year ago
There's also the (draft) release notes: https://go.dev/doc/go1.24 and the docs: https://go.dev/doc/modules/managing-dependencies#tools

I've done the blank import thing before, it was kinda awkward but not _that_ bad.
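For context, the pre-1.24 blank-import convention is a file like this (the specific tool packages here are chosen for illustration):

```go
//go:build tools

// tools.go: blank-import each tool's main package so that
// `go mod tidy` keeps the tool (and its version) in go.mod.
// The `tools` build tag is never enabled, so none of this is
// ever actually compiled into the application.
package tools

import (
	_ "golang.org/x/tools/cmd/stringer"
	_ "google.golang.org/protobuf/cmd/protoc-gen-go"
)
```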

gr4vityWall · a year ago
The feature itself seems reasonable and useful. Especially if most of your tooling is written in Go as well. But this part caught my attention:

> user defined tools are compiled each time they are used

Why compile them each time they are used? Assuming you're compiling them from source, shouldn't they be compiled once, and then have the 'go tool' command reuse the same binaries? I don't see why it compiles them at the time you run the tool, rather than when you're installing dependencies. The benchmarks show a significant latency increase. The author also provided a different approach which doesn't seem to have any obvious downsides, besides not sharing dependency versions (which may or may not be a good thing - that's a separate discussion IMO).

kbolino · a year ago
There is a cache and they aren't re-compiled unless they change or the cache is cleared.
Groxx · a year ago
"Shared dependency state" was my very first thought when I heard about how it was built.

Yeah I want none of that. I'll stick with my makefiles and a dedicated "internal/tools" module. Tools routinely force upgrades that break other things, and allowing that is a feature.

nikolayasdf123 · a year ago
same. tools are not part of the codebase, nor dependencies.

you got to have isolation of artefact and tools around to work with it.

it is bonkers to start versioning the tools used to build a project mixed in with the artefact's own dependencies. should we include the version of VSCode used to type the code? how about transitive dependencies of VSCode? how about the OS itself used to edit the files? how about the version of the LLM model that generated some of this code? where does this stop?

0x696C6961 · a year ago
The way it's implemented is the way that almost everyone already does it. It's just more convenient now.
pragma_x · a year ago
Ah, okay. So this achieves yet more feature parity with npm/yarn? Sweet.
0x696C6961 · a year ago
Yeah it's a nice QOL improvement. Not some game changer ...

bjackman · a year ago
I always think it's a shame that these features end up getting built into ecosystem-specific build tools. Why do we need separate build systems for every language? It seems entirely possible to have a build system that can do all this stuff for every language at once.

From my experience at Google I _know_ this is possible in a Megamonorepo. I have briefly fiddled with Bazel and it seems there's quite a barrier to entry, I dunno if that's just lack of experience but it didn't quite seem ready for small projects.

Maybe Nix is the solution but that has barrier to entry more at the human level - it just seems like a Way of Life that you have to dive all the way into.

Nonetheless, maybe I should try diving into one or both of those tools at some point.

sunshowers · a year ago
(I worked on source control at FB for many years.)

The main argument for not overly genericizing things is that you can deliver a better user experience through domain-specific code.

For Bazel and buck2 specifically, they require a total commitment to it, which implies ongoing maintenance work. I also think the fact that they don't have open governance is a hindrance. Google's and Meta's internal monorepos make certain tradeoffs that don't quite work in a more distributed model.

Bazel is also in Java I believe, which is a bit unfortunate due to process startup times. On my machine, `time bazelisk --help` takes over 0.75 seconds to run, compared to `time go --help` which is 0.003 seconds and `time cargo --help` which is 0.02 seconds. (This doesn't apply to buck2, which is in Rust.)

jeffbee · a year ago
This is likely because you are running it in some random PWD that doesn't represent a bazel workspace. When running in a workspace the bazel daemon persists. Inside my workspace the bazelisk --help invocation needs just 30ms real time.

Running bazel outside of a bazel workspace is not a major use-case that needs to be fixed.

spockz · a year ago
GraalVM’s native image has been a thing for a while now. This could partially overcome the daemon issue. The daemon does more ofc, as it keeps some state in memory. But at least binary start time is a solved problem in Java land.
pornel · a year ago
> Why do we need separate build systems for every language?

Because being cross-language makes them inherit all of the complexity of the worst languages they support.

The infinite flexibility required to accommodate everyone keeps costing you at every step.

You need to learn a tool that is more powerful than your language requires, and pay the cost of more abstraction layers than you need.

Then you have to work with snowflake projects that are all different in arbitrary ways, because the everything-agnostic tool didn't impose any conventions or constraints.

The vague do-it-all build systems make everything more complicated than necessary. Their "simple" components are either a mere execution primitive that make handling different platforms/versions/configurations your problem, or are macros/magic/plugins that are a fractal of a build system written inside a build system, with more custom complexity underneath.

OTOH a language-specific build system knows exactly what that language needs, and doesn't need to support more. It can include specific solutions and workarounds for its target environments, out of the box, because it knows what it's building and what platforms it supports. It can use conventions and defaults of its language to do most things without configuration. General build tools need build scripts written, debugged, and tweaked endlessly.

A single-language build tool can support just one standard project structure and have all projects and dependencies follow it. That makes it easier to work on other projects, and easier to write tooling that works with all of them. All because focused build system doesn't accommodate all the custom legacy projects of all languages.

You don't realize how much of a skill-and-effort black hole build scripts are until you use a language where a build command just builds it.

bjackman · a year ago
But this just doesn't match my experience with Blaze at all. For my internal usage with C++ & Go it's perfect. For the weird niche use case of building and packaging BPF programs (with no support from the central tooling teams, we had to write our own macros) it still just works. For Python where it's a poor fit for the language norms it's a minor inconvenience but still mostly stays out of the way. I hear Java is similar.

For vendored open source projects that build with random other tools (CMake, Nix, custom Makefile) it's a pain but the fact that it's generally possible to get them building with Blaze at all says something...

Yes, the monorepo makes all of this dramatically easier. I can consider "one-build-tool-to-rule-them-all isn't really practical outside of a monorepo" as a valid argument, although it remains to be proven. But "you fundamentally need a build tool per language" doesn't hold any water for me.

> That makes it easier to work on other projects, and easier to write tooling that works with all of them.

But... this is my whole point. Only if those projects are in the same language as yours! I can see how maybe that's valid in some domains where there's probably a lot of people who can just do almost everything on JS/TS, maybe Java has a similar domain. But for most of us switching between Go/Cargo/CMake etc is a huge pain.

Oh btw, there's also Meson. That's very cross-language while also seeming extremely simple to use. But it doesn't seem to deliver a very full-featured experience.

munificent · a year ago
I think the problem is basically because the build system has to be implemented using some ecosystem, and no other ecosystem wants to depend on that one.

If your "one build system to rule them all" was built in, say, Ruby, the Python ecosystem won't want to use it. No Python evangelist wants to tell users that step 1 of getting up and running with Python is "Install Ruby".

So you tend to get a lot of wheel reinvention across ecosystems.

I don't necessarily think it's a bad thing. Yes, it's a lot of redundant work. But it's also an opportunity to shed historical baggage and learn from previous mistakes. Compare, for example, how beloved Rust's cargo ecosystem is compared the ongoing mess that is package management in Python.

A fresh start can be valuable, and not having a monoculture can be helpful for rapid evolution.

pansa2 · a year ago
> No Python evangelist wants to tell users that step 1 of getting up and running with Python is "Install Ruby".

True, but the Python community does seem to be coalescing around tools like UV and Ruff, written in Rust. Presumably that’s more acceptable because it’s a compiled language, so they tell users to “install UV” not “install Rust”.

6keZbCECT2uB · a year ago
Partly in jest, you can often find a Perl / bash available where you can't find a Python, Ruby, or Cargo.
marwis · a year ago
Sounds like the only way out of this is to design language agnostic tooling protocols that anybody can implement.
jlarsen · a year ago
I've had exactly the same thought, after hitting walls repeatedly with limitations in single-language ecosystems. And likewise, I've had the same concerns around the complexity that comes with Bazel/Buck/Nix.

It's been such a frustration for me that I started writing my own as a side project a year or two ago, based on using a standardized filesystem structure for packages instead of a manifest or configuration language. By leaning into the filesystem heavily, you can avoid a lot of language lock-in and complexity that comes with other tools. And with fingerprint-based addressing for packages and files, it's quite fast. Incremental rebuild checks for my projects with hundreds of packages take only 200-300ms on my low-end laptop with an Intel N200 and mid-tier SSD.

It's an early stage project and the documentation needs some work, but if you're interested: https://github.com/somesocks/dryad (docs at https://somesocks.github.io/dryad/)

One other alternative I know of that's multi-language is Pants (https://www.pantsbuild.org/), which has support for packages in several languages, and an "ad-hoc" mode which lets you build packages with a custom tool if it isn't officially supported. They've added support for quite a few new tools/languages lately, and seem to be very much an active project.

8n4vidtmkvmk · a year ago
Not loving the cutesy names (https://somesocks.github.io/dryad/docs/02-concepts/01-the-ga...). I want my build tool to be boring.
6keZbCECT2uB · a year ago
I agree. In my opinion, if you can keep the experience of Bazel limited to build targets, there is a low barrier to entry even if it is tedious. Major issues show up with Bazel once you start having to write rules, tool chains, or if your workspace file talks to the Internet.

I think you can fix these issues by using a package manager around Bazel. Conda is my preferred choice because it is in the top tier for adoption, cross platform support, and supported more locked down use cases like going through mirrors, not having root, not controlling file paths, etc. What Bazel gets from this is a generic solution for package management with better version solving for build rules, source dependencies and binary dependencies. By sourcing binary deps from conda forge, you get a midpoint between deep investment into Bazel and binaries with unknown provenance which allows you to incrementally move to source as appropriate.

Additional notes: some requirements limit utility and approach being partial support of a platform. If you require root on Linux, wsl on Windows, have frequent compilation breakage on darwin, or neglect Windows file paths, your cross platform support is partial in my book.

Use of Java for Bazel and Python for conda might be regrettable, but not bad enough to warrant moving down the list of adoption and in my experience there is vastly more Bazel out there than Buck or other competitors. Similarly, you want to see some adoption from Haskell, Rust, Julia, Golang, Python, C++, etc.

JavaScript is thorny. You really don't want to have to deal with multiple versions of the same library with compiled languages, but you have to with JavaScript. I haven't seen too much demand for JavaScript bindings to C++ wrappers around a Rust core that uses C core libraries, but I do see that for Python bindings.

sunshowers · a year ago
> You really don't want to have to deal with multiple versions of the same library with compiled languages, but you have to with JavaScript.

Rust handles this fine by unifying up to semver compatibility -- diamond dependency hell is an artifact of the lack of namespacing in many older languages.

lihaoyi · a year ago
My experience with Bazel is it does everything you need, and works incredibly well once set up, but is ferociously complex and hard to learn and get started with. Buck and Pants are easier in some ways, but fundamentally they still look and feel mostly like Bazel, warts and all

I've been working on an alternate build tool, Mill (https://www.mill-build.org), that tries to provide the 90% of Bazel that people need at 10% the complexity cost. From a greenfield perspective it's a lot of work to try and catch up to Bazel's cross-language support and community. I think we can eventually get there, but it will be a long slog

morepedantic · a year ago
Brazil performs dependency resolution in a language-agnostic way.

https://gist.github.com/terabyte/15a2d3d407285b8b5a0a7964dd6...

imiric · a year ago
This seems handy, but often the tools run by `go generate` are outside of the Go ecosystem, or need to be binaries.

So I think a general solution would work better, and not be limited to Go. There are plenty of tools in this space to choose from: mise, devenv, Nix, Hermit, etc.

TheCondor · a year ago
Mise is right on the edge of being pretty killer. I’m bullish on it. It also includes a lot of nice to haves that you can declare, like k9s, which isn’t exactly a dev tool but becomes expected
arccy · a year ago
better motivation for rewriting it in go...

but are there really that many tools you need in a go project not written in go?

rednafi · a year ago
I like that Go decided to natively support this. But since it’s keeping the dev dependencies in the same go.mod, won’t it make the binary larger?

In Python’s uv, the pyproject.toml has separate sections for dev and prod dependencies. Then uv generates a single lock file where you can specify whether to install dev or prod deps.

But what happens if I run ‘go run’ or ‘go build’? Will the tools get into the final artifact?

I know Python still doesn’t solve the issue where a tool can depend on a different version of a library than the main project. But this approach in Go doesn’t seem to fix it either. If your tool needs an older version of a library, the single go.mod file forces the entire project to use the older version, even if the project needs—or can only support—a newer version of the dependency.

catlifeonmars · a year ago
> won’t it make the binary larger?

No. The binary size is related to the number of dependencies you use in each main package (and the dependencies they use, etc). It does not matter how many dependencies you have in your go.mod.

rednafi · a year ago
Ah, thanks. This isn't much of an upgrade from the `tools.go` convention, where the tools are underscore-imported. All it does is provide an indication in the `go.mod` file that some dependencies come from tools.

Plus, `go tool <tool-name>` is slower than `./bin/<tool-name>`. Not to mention, it doesn’t resolve the issue where tools might use a different version of a dependency than the app.

pqb · a year ago
> In Python’s uv, the pyproject.toml has separate sections for dev and prod dependencies.

If you want, you can have multiple ".mod" files and set "-modfile=dev-env.mod" every time you run the "go" binary with the "run" or "build" command. For example, you can take what @mseepgood mentioned:

> go tool -modfile=tools.mod mytool

Plus, in recent versions of Go we have workspaces [0][1]. It is yet another way to easily switch between various environments or have isolated modules in a monorepo.

[0]: https://go.dev/blog/get-familiar-with-workspaces

[1]: https://go.dev/doc/tutorial/workspaces
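A minimal go.work for that kind of isolation might look like this (the two-module layout is assumed, not from the comment):

```go
// go.work -- ties together an app module and a separate tools
// module, each with its own go.mod and therefore its own
// independently resolved dependency versions.
go 1.24

use (
	./app
	./tools
)
```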

remram · a year ago
So it's just dev-dependencies?
nikolayasdf123 · a year ago
a bit worse. it is all mixed up. to keep a separate dependency tree for tools you still need the old approach with a separate go.mod. it is actually even worse now.
nikolayasdf123 · a year ago
UPD:

1. it is single tree

2. BUT tools will not propagate through the dependency tree downstream due to go module pruning

check this comment: https://github.com/golang/go/issues/48429#issuecomment-26184...

official docs: https://tip.golang.org/doc/modules/managing-dependencies#too...

> Due to module pruning, when you depend on a module that itself has a tool dependency, requirements that exist just to satisfy that tool dependency do not usually become requirements of your module.

silverwind · a year ago
Yes, except it does not support version ranges.
verdverm · a year ago
Go uses Minimum Version Selection (MVS) instead of a SAT solver. There are no ranges in any go dependency specifications. It's actually a very simple and elegant algorithm for dependency version selection

https://research.swtch.com/vgo-mvs
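The core of MVS is small enough to sketch. This toy version (module names and requirement graph invented for illustration; real MVS compares semver, not strings) walks the requirement graph and keeps, for each module, the highest version anyone asks for -- which, because requirements are lower bounds rather than ranges, is also the minimum version that satisfies everyone:

```go
package main

import (
	"fmt"
	"strings"
)

// Toy requirement graph: each "module@version" lists what it requires.
var reqs = map[string][]string{
	"A@1": {"B@1", "C@2"},
	"B@1": {"D@1"},
	"C@2": {"D@2"},
}

// mvs selects, for each module, the maximum version reachable from
// root. Since every requirement is a lower bound, this maximum is the
// minimum version satisfying all requirements simultaneously.
func mvs(root string) map[string]string {
	selected := map[string]string{}
	var visit func(mv string)
	visit = func(mv string) {
		parts := strings.SplitN(mv, "@", 2)
		m, v := parts[0], parts[1]
		// Skip if we already selected this version or a higher one.
		if cur, ok := selected[m]; ok && cur >= v {
			return
		}
		selected[m] = v
		for _, dep := range reqs[mv] {
			visit(dep)
		}
	}
	visit(root)
	return selected
}

func main() {
	build := mvs("A@1")
	// D is required at both 1 and 2; MVS keeps 2.
	fmt.Printf("B@%s C@%s D@%s\n", build["B"], build["C"], build["D"])
	// prints: B@1 C@2 D@2
}
```

No SAT solving is needed because there is exactly one candidate answer per module, which is what makes the algorithm deterministic and the builds reproducible.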

jamietanna · a year ago
Yep, that's the intent
the_gipsy · a year ago
lol yea
WuxiFingerHold · a year ago
I don't understand why it's a good idea to couple tooling or configuration or infrastructure (e.g. Aspire.NET, which I'm also not convinced is a good idea) so tightly with the application. An application should not need to be aware of how its tools are implemented or how configuration or infrastructure is managed. The tooling should point to the application as a dependency. The application should not have any dependency on tooling.
globular-toast · a year ago
A note for the author in case they are reading: "i.e." means "that is", "e.g." means "for example". You should be able to substitute these meanings and find the sentence makes sense. In all cases here you wanted "e.g.".