Don't be discouraged by all the people in this thread saying you're using make wrong. One of the things that makes make a great tool is how deceptively simple it is. Yes not using .PHONY can potentially get you in trouble. But for a small project that's the sort of trap you'll fall into a year later, if at all, and even then you'll only be scratching your head for an hour. 99% of the time you don't have to care about doing things the proper way. Make lets you just hit the ground running and only imposes as much complexity as you need to keep the thing from falling apart.
> One of the things that makes make a great tool is how deceptively simple it is.
One of the worst things of Make is how deceptively simple it looks.
Make does exactly one thing: it takes input files, some dependencies and generates _exactly_one_ output file.
To have rules which don't generate output (like `install` or `all` or `clean` or all targets in the article) we need to resort to a hack, a special magic target like `.PHONY` (which hasn't been part of POSIX up to the 2017 version - IEEE Std 1003.1-2017 - https://pubs.opengroup.org/onlinepubs/9699919799/utilities/m..., only the current one - IEEE Std 1003.1-2024 - https://pubs.opengroup.org/onlinepubs/9799919799/utilities/m... includes `.PHONY`). If you want to generate more than one file (like an object file and a module or a precompiled header or ...) you are on your own to build some brittle hack to get that working. Don't forget that not every Make is GNU Make, BSD and other nix like Solaris/Illumos still exist.
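For example, the usual workarounds for a compiler that emits two files per invocation, sketched here for a Fortran-style compiler producing an object file and a module file (file names are hypothetical):

```make
# Ordinary multi-target rule: make treats this as two separate rules
# sharing a recipe, so the recipe can run twice (usually a bug).
foo.o foo.mod: foo.f90
	$(FC) -c $<

# Multi-target *pattern* rule: GNU make runs the recipe once
# for both targets, which is why this idiom gets abused.
%.o %.mod: %.f90
	$(FC) -c $<

# Grouped targets (&:) make it explicit, but need GNU make 4.3+.
foo.o foo.mod &: foo.f90
	$(FC) -c $<
```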
Don't get me wrong: Make has its uses for sufficiently complex projects which aren't yet complex enough to need some "better" build system. Problem is that such projects may get too complex when more code is added, and they inevitably gain some sort of scripts/programs to generate Makefiles or parts of Makefiles (so, an ad hoc meta build system is created).
And the problem isn't that they use it, but that they are proposing it as a solution to "everybody". And that their Makefile stops working as soon as there is a directory (or file) `build` (or `dev` or ...) in the project root.
> Make does exactly one thing: it takes input files, some dependencies and generates _exactly_one_ output file.
Not true. Your dependency graph might culminate on a single final target, but nothing prevents you from adding as many targets that generate as many output files as you feel like adding and set them as dependencies of your final target.
Think about it for a second. If Make was only able to output a single file, how in the world do you think it's used extensively to compile all source files of a project, generate multiple libraries, link all libraries, generate executables, and even output installers and push them to a remote repository?
> To have rules which don't generate output (like `install` or `all` or `clean` or all targets in the article) we need to resort to a hack, a special magic target like `.PHONY`
I don't understand what point you thought you were making. So a feature that boils down to syntactic sugar was added many years ago. So what? As you showed some gross misconceptions on what the tool does and how to use it, this point seems terribly odd.
> And the problem isn't that they use it, but that they are proposing it as a solution to "everybody".
I think you're making stuff up. No one wants Make to rule the world. I don't know where you got that from.
I think the whole point is that Make excels at a very specific use case: implementing workflows composed of interdependent steps that can be resumed and incrementally updated. Being oblivious of Make leads many among us to reinvent the wheel poorly, using scripting languages to do much of the same thing but requiring far more work. If you can do this with a dozen lines of code in a Makefile, why on earth would you be churning out hundreds of lines of any random scripting language?
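A minimal sketch of that use case (the file names and commands are made up): each target reruns only when its prerequisites are newer than its output, so an interrupted or failed run resumes where it left off:

```make
data/raw.csv:
	curl -o $@ https://example.com/export.csv

data/clean.csv: data/raw.csv clean.py
	python clean.py data/raw.csv > $@

report.html: data/clean.csv report.py
	python report.py data/clean.csv > $@

.PHONY: all
all: report.html
```

Touch clean.py and only the last two steps rerun; the download is not repeated.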
> If you want to generate more than one file (like an object file and a module or a precompiled header or ...)
He's not using C, though :-)
> And the problem isn't that they use it, but that they are proposing it as a solution to "everybody".
He's proposing it for the same reason I'm starting to like it, after many years in the industry: as a simple build wrapper.
> And that their Makefile stops working as soon as there is a directory (or file) `build` (or `dev` or ...) in the project root.
And they can fix that problem in 5 minutes, big deal :-)
> Don't forget that not every Make is GNU Make, BSD and other nix like Solaris/Illumos still exist.
This is a very bad reason in this day and age. 99.999999% of *NIX usage these days, probably 99.9999999999999999% for the average person, since most people won't ever get to those environments where BSD and Solaris are still used, is Linux.
And even for BSD and Solaris, guess what... you add an extra step in the build instructions asking them to... install GNU Make.
Heck, even back in 2005 (I think?) for Solaris one of the first things you'd do was to install the GNU userland wherever allowed because the Solaris one was so forlorn I swear I heard wooden planks creak and dust pouring down every time I had to use their version of ps.
And regarding POSIX, meh. If you're a C developer (C++, Rust, I guess), knock yourself out. Most of the stuff devs use are so far removed from POSIX... Actually, not removed, but has so many non-POSIX layers on top (I mean not standardized). Ruby bundler is not standardized like awk. Python pip is not standardized like make. Etc, etc. That's the reality we're in. POSIX is very useful but only as a very low level base most people don't need to chain themselves directly to. I'd definitely not avoid a tool because it's not in the latest POSIX standard (or only in the latest POSIX standard).
I still wouldn’t say it’s that complicated - you do need to know your way around the syntax a bit but it’s less challenging than getting all the other tooling working in the first place. :)
I've been a happy make user for 20+ years across many, many projects and many languages. I've never had issues with the .PHONY task that seems to bother people so much.
It's simple, readable, editable, composable and already installed everywhere.
It does what it says on the tin and not much else.
FWIW, I also wrap up whatever fad (or nightmare) build system people use in other projects when I need to deal with them.
I'll eat crow if wrong, but I'm guessing I know more about GNU make than you do. It is none of the four things you claim. Also, people who say "on the tin" need a good ass-kicking.
> Don't be discouraged by all the people in this thread saying you're using make wrong.
Fully agree, and I would add that it's far better to adopt the right tool for the job, even if you are not an expert, than to dodge the criticisms of perfectionists by adopting the wrong tool for the job.
Everyone needs to start from somewhere, and once the ball is rolling then incremental changes are easy to add.
People who want to call me out would be a lot more productive pointing me to some guides instead of chastising me over an ancient framework whose best documentation has been lost to time. And whose best practices are locked behind proprietary codebases.
Little tips here and there are nice, but that doesn't teach me the mentality of how to architect a makefile
Every makefile recipe should produce exactly one output: $@. The makefile as a whole produces an arbitrary number of outputs since rules can depend on other rules.
This leads us to a neat rule of thumb for phony targets: any recipe that does not touch $@ and only $@ should have $@ marked as phony.
I find that keeping track of phony targets with a list makes things much easier.
phonies :=

phonies += something
something:
	./do-something

phonies += something-else
something-else: something
	./do-something-else

# touches $@ and thus does not need to be phony
create-file:
	./generate-some-output > $@

.PHONY: $(phonies)
Makefiles are terrible tech. The problem is that they're slightly less bad than most other build systems we've come up with, which makes them "useful" in a masochistic way.
Build systems tend to commit one or more of the following sins:
* Too basic: Once you try to build anything beyond a toy, it quickly becomes chaos.
* Too complicated: The upfront required knowledge, bureaucracy, synchronization and boilerplate is ridiculous. The build system itself takes an order of magnitude more data and memory than the build target.
* No standard library (or a substandard one that does things poorly or not at all): You must define everything yourself, leading to 10000 different incompatible implementations of the same build patterns. So now no one can just dive in and know what they're doing.
* Too constricting: The interface wasn't built as a simple layer upon an expert layer. So now as soon as your needs evolve, you have to migrate away.
* Too much magic: The hallmark of a poorly designed system. It doesn't have to be turtles all the way down, but it should be relatively close with few exceptions.
* Cryptic or inconsistent syntax.
My 2c: Makefiles are excellent tech; it's just that a lot of people haven't learned to use them properly, as they were intended. I'm sure I'll get pushback, that's ok.
- Too basic: At least half of the software I use just uses plain makefiles and maybe a configure script. No autotools. I optionally run ./configure, and then make and make install, and it just works. I definitely wouldn't consider my setup to be a toy by any stretch of the imagination. It's built out of smaller programs that do one thing and one thing well.
- Too complicated: I don't know, I think make and how it works is really easy to understand to me at least. I guess everyone's had different experiences. Not necessarily your case, but I think usually it's because they had bad experiences that they probably blamed make for, when they were trying to build some complex project that either had a bad build setup itself (not make's fault), or without the requisite knowledge.
- No standard library: It's supposed to be tooling agnostic, which is what makes it universally applicable for a very wide range of tools, languages, and use cases. It's viewed as a feature, not a bug.
- Too constricting: I'm not sure what you mean here, it's designed to do one thing and one thing well. The simple layer is the dependency tracking.
- Too much magic / Cryptic or inconsistent syntax: See 'Too complicated'
The worst build systems are the ones centered on a particular programming language. Since there's N>>1 programming languages that's N>>1 build systems -- this does not scale, as the cognitive load is prohibitive.
The only general-purpose build system that spans all these languages is `make` or systems that target `make` (e.g., CMake). And this sucks because `make` sucks. And `make` sucks because:
- it's really difficult to use right
(think recursive vs. non-recursive make)
- so many incompatible variations:
- Unix/POSIX make
- BSD make
- GNU make
- `nmake` (Windows)
- it's rather ugly
But `make` used right is quite good. We're really lucky to have `make` for the lowest common denominator.
Gradle and Bazel are absolutely general purpose build tools that are very widely used in industry.
As a smaller contender, my personal favorite is the Mill build tool (written in Scala), that is basically what a build tool should be, it’s as close to a theoretical perfect as possible. I really advise reading the blog post by its author Li Haoyi: https://www.lihaoyi.com/post/SoWhatsSoSpecialAboutTheMillSca...
Xmake https://xmake.io/ for C and C++ (I haven't used that for anything serious yet) and Buck 2 https://buck2.build/ if you need a really complex build system. Both of these do caching of build artifacts and can do distributed builds (with more or less complex setup).
Not OP, but it's not just that C/C++ lacks modules. I think that is missing the real issue. Any complicated program probably needs a custom-developed tool to build it. As a simple example, imagine a program that uses a database: you want to keep the sources as SQL and generate classes from them. That's a custom build step.
It's just that in some languages and build systems (Node, Maven), we have abstracted this away by calling them plugins, and they probably come from the same group that made the library you need.
No such plugin system exists, as far as I am aware, for makefiles.
There are projects that generate files, depend on multiple languages, etc. If you push the job of a build tool to the compiler infrastructure, then why even have a “build tool” in the first place? Make is simply anemic for anything remotely complex, and there are countless better tools that actually solve the problem.
Good luck writing Makefiles for Fortran, OCaml or (whenever they will really, actually work) C++ modules.
There aren't many widely used build systems that can handle such dynamic dependencies without some special "magic" for these; the only one that I know of with a significant number of users (so not Shake) is Buck 2 (Bazel and all C++ build systems use "special magic" that you can't write in user rules).
One or more? OK, that of course leaves lots of room. I would estimate:
- Too basic: Makefiles are not.
- Too complicated: They can be; it depends on what you make them to be.
- No standard library: Well, there is one; there are some builtin functions you can use in the makefile.
- Too constricting: Haven't noticed that, so I would say no.
- Too much magic: Hmm, I don't see it. It is very clear what is a target and a dependency and so on. Not so magical.
- Cryptic syntax: Yeah, definitely could be better. Even a plain JSON file would be better here.
How to compile and run this? We need a build system! Download and install GNU Make.
When that step is complete:
Type in

    make hello

and it's done. Now, run via ./hello
See, Too much magic (didn't even have a makefile or Makefile), no standard library, Too constricting, cryptic, too basic. And, because you had to install Make, too complicated. Hits every one of your objections.
I adore Make. I've written one (or more) for every single task or project I've touched in the last 20 years.
No smarts. It's just a collection of snippets with a few variables. "make run", "make test", "make lint", that kind of thing.
"make recent" = lint then run the most recently modified script.
You could do the same thing with Bash or other shells, but then you get stuck into Developer Land. Things are so much more complicated, without giving extra value. Make is just a DSL saying "files like this, are made into files like that, by running this command or two". That's it.
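The whole DSL boils down to the pattern rule. A sketch (assuming pandoc, purely as an example converter):

```make
# "Files like this (%.md) are made into files like that (%.html)
# by running this command or two."
%.html: %.md
	pandoc -o $@ $<
```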
> Make is just a DSL saying "files like this, are made into files like that, by running this command or two".
Nicely put.
Decades ago I wrote a testing framework in Java where you could specify your tests and their dependent classes using make-like syntax. So you could have a set of test classes which define the "baseline suite", then another layer of test classes which is dependent on the above and only runs if the above is successful, and so on.
I really do not understand why folks today make everything so complicated. My advice has always been: stick to standard Unix tools and their way of doing things (tested and proven over time) unless you run into something which could absolutely not be done that way. Time is finite/limited and I prefer to spend it on System/Program Design/Modeling/Structure/Patterns etc., which are what is central to problem-solving; everything else is ancillary.
That's just the rose-tinted glasses most of the historical users are wearing. It takes time and nerve to admit that you have decades of experience with a footgun that isn't even trivial to use beyond tutorial/builtin use cases.
> Make is just a DSL saying "files like this, are made into files like that, by running this command or two". That's it.
The problem with make isn't make; it's that what make is calling usually doesn't do that anymore. On my last project we had a makefile that had 4 main commands: build, test, frontend, deploy. Build and test called through to Maven, frontend called npm, and deploy called docker + aws.
All of those tools do their own internal state tracking, caching and incrementality, and don't report what they've done, so it's not possible to write a rule that says “only deploy if build has been updated”, because maven/cargo/dotnet/npm/go don't expose that information.
The author is not even using the mtime-based dependency tracking. Also the targets are supposed to be PHONY but are not marked as such. The author could have replaced it with a shell script that read $1 and matched on it to determine what to do.
Or just with a simple command which is guaranteed to be on most Linux systems already - make.
Maybe his Makefiles aren't complex, nor do they seem to follow all the best practices invented by code gurus in sandals, but it works and, what's important, it works for him.
The strengths of make, in this context where it's been coaxed into serving as a task runner for small projects, are:
1) It's already installed practically everywhere
2) It reduces your cognitive load for all sorts of tasks down to just remembering one verb which you can reuse across multiple projects, even if the implementation ends up differing a bit
3) In conjunction with the similarly ubiquitous SSH and git, you have everything you need to apply the basic principles of DevOps automation and IaC
There's something special about waking up one day with an idea, and being able to create a fresh git repository where the first commit is the Makefile you've had in your back pocket for years that scripts everything from environment setup to deployment to test automation to code reviews.
There's zero effort beyond just copying your single file "cookbook" into that new repo.
Please help me understand why this thing exists. Like, no snark, I like using the proper tool for a job. When would I look at a project and think "this is something that is better done with the `just` tool", instead of a readme.txt and a folder with scripts?
why do you need a "command runner"? Have you heard of bash functions? Or... make? The thing is too simple to justify installing another tool, however nifty it is.
Here’s a one-line horror story for you (from a real project I’m working on):
    .PHONY: $(MAKECMDGOALS)
> The author could have replaced it with a shell script that read $1
Sure, but `./build.sh dev` is a bit less obvious than `make dev`.
Another reason to use Make even if you don’t have any non-phony steps is that you can add those later if needed. (I agree that the author should mark {dev,build,deploy} as phony though.)
Why is this a horror story? Under certain assumptions of how the author intends to use this, this sounds like a sensible way to define a dynamic list of phony targets to me, without having to specify them by hand.
There are many reasonable scenarios why you might want to do this: determining at the point of calling make which targets to force or deactivate for safety, projects with nested or external makefiles not directly under your control, reuse of MAKECMDGOALS throughout the makefile (including propagation to submakefiles), ...
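Concretely (a sketch; the targets and commands are made up), the line declares every command-line goal phony, which is handy when the goals are task names and surprising when a goal happens to be a real file:

```make
.PHONY: $(MAKECMDGOALS)

# `make build`: a task name, so phony is exactly what you want.
build:
	./do-build

# `make output.txt`: now also phony, so the recipe reruns
# even when output.txt is already up to date.
output.txt:
	./generate > $@
```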
While you're technically correct, what I gathered from their experience is the consistency of usage, between not only their own projects but third-party projects too.
They could make technical improvements to their own Makefiles, sure. But it's more about being able to enter a project and have a consistent experience in "getting started".
We all were beginners at one time or another. And if you want to learn a tool, it helps to actually use it, even if your greenhorn usage is less than perfect. You can make incremental improvements as you learn, like we all do.
That's the beauty of make and shell: it follows the UNIX principle of being simple and doing one thing and one thing well. People want it to do many other things, like be a scripting language, a dependency tracker, etc., so they're willing to pull in bloatware. New isn't necessarily better. Autoconf and automake aren't make.
None of them are simple; they are chock full of hacks upon hacks, “fixing” their own idiocies, and by extension none of them are doing their one thing well. Especially bash scripts; they should be left behind.
Yes, the UNIX principle of being simple and doing one thing and one thing well.
Make does dependency tracking relatively well (for 1976). But if you just want to run some commands, your shell already does that just as well, without any of the caveats that apply to make.
There isn’t even a need for a shell script. The author is already invoking three separate tools, each of which has a mechanism for invoking custom commands.
Technically all of these make targets look for files by the names of the targets. Each one should really be defined as .PHONY.
That said, I used to write makefiles like this all the time, but have since switched to just and justfiles in recent years which make this the default behavior, and is generally simpler to use. Things like parameters are simpler.
I kinda like these make-ish systems, but they all have one problem: Make is already on any Linux and Mac, and is pretty easy to get on Windows as well. (It’s a real pity they don’t include it in the Git Bash!) Just using the lowest common denominator is a big argument for Make IMO.
You have to handle dependencies either way to build a project - what’s one more tiny executable?
This criticism might make sense for some non-vim editor, because you might have to ssh into a remote location where you can't install stuff. But if you're able to build a project, and thus install its required dependencies, then you might as well add one additional word to the install command.
On Windows if you don't use WSL, Cygwin gets you 95% of the way there. I've been using it for decades to develop CLI tools and backends in Python and a few other languages. You learn the quirks in about 1 month, add some tooling like apt-cyg and map C: to /c and you're off to the races.
A big mistake Make has is mixing phony and file targets in the same namespace. They should be distinguishable by name, e.g. phony targets start with a : or something.
yeah just is really cool but it's not really commonly installed so that's kind of annoying.
i feel like we're due for some kind of newfangled coreutils distribution that packages up all the most common and useful newfangled utilities (just, ripgrep, and friends) and gets them everywhere you'd want them.
But I want please, ag and friends!
The "problem" with this kind of package is that everybody wants something else. And the chances that they get a part of the default MacOS or Windows install (or even part of the XCode command line tools or Plattform SDK (or whatever that is called now)) is quite small.
I like `asdf` a lot for this, but I actually don't use it for either of those examples (though it does have plugins for them). Ripgrep is in most package repos by now and all my dev machines have a Rust toolchain installed so I can build and install `just` from source with a quick command.
It's funny that make evokes such fierce arguments, almost like the semi-religious vi-vs-emacs wars of old.
I agree fully with the OP, in particular I find it smart that he wraps anything in a top-level makefile, even if other, more sophisticated build tools are used. The advantage is standardization, not having to remember anything and to know that if you wrote it, you will just be able to type "make" and it will work.
Let's say a C person wants to compile a Rust project, they would not have to look up how cargo works, but could simply type "make" (or "gmake"; I don't use GNU specifics, but try to be POSIX compliant, even if it is certainly true that almost 100% of makes are gmakes).
Thanks for proposing the use of the timeless "make" as a sort of top-level build system driver; this will probably still work in 250 years.
It's funny such a simple title inspired a flamewar. The article itself is an insanely simple use case for make (that uses gulp in 2024?) that clearly no one read.
cargo is a bad example as it's universally `cargo build`.
Make on its own is great but most of the time I've worked with C projects it's been cmake/autotools + global pkg installs, which you Do have to frequently look up.
It's an interesting phenomenon. ChatGPT and other LLMs have really opened up previously "archaic" tooling like Make and Bash. I've "written" more Bash in the last year than my entire career previously, because LLMs are such good copilots for that.
Agreed but my favorite thing is to take a Makefile and throw it into ChatGPT and have it give me a justfile and seeing it remove all the weird makefile patterns.
Yes, Make is awesome. I use it for so many things. It's a great way to automate tasks. For example my personal website is built using a Makefile that calls bash scripts to rebuild the updated web pages, and I deploy it using a git push to my server and a git hook there that calls Make. However there are files that I don't want to put into the Git repository because they are blobs that may change often like PDFs of my teaching materials. It's okay, I have an "uploads" target in my Makefile that will upload only the modified PDFs to my server and this target is a dependency of the "deploy" target which does the git push so I don't even have to think about it.
Also the updated PDFs for my courses materials are automatically put into my websites source tree by another Makefile that I use to manage and build my teaching materials and which let me either build the PDFs I use from my LaTeX sources or build from the same sources alternate versions of the materials for my students (without solutions to the lab sessions exercises for example) and automatically publish those to my local website version to be uploaded whenever I want to deploy the updated website.
It's kind of Makefiles all the way down. I like Makefiles! =)
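A sketch of that uploads/deploy chain (paths and host are hypothetical; a stamp file stands in for rsync's lack of a local output, and $? expands to only the prerequisites newer than the target, i.e. just the modified PDFs):

```make
PDFS := $(wildcard materials/*.pdf)

# rsync leaves no local output, so record a stamp file;
# $? limits the upload to PDFs changed since the last run.
.uploaded: $(PDFS)
	rsync -av $? me@server:www/materials/
	touch $@

.PHONY: uploads deploy
uploads: .uploaded

deploy: uploads
	git push server main
```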
I've definitely been using .PHONY on various Linux and MacOS computers long before 2017.
Maybe it's just me, but I've never much cared for whether or not something is specified if it happens to be present everywhere I go.
A pattern like
is pretty common. What's the issue?
Great job!
Little tips here and there are nice, but that doesn't teach me the mentality of how to architect a makefile.
This leads us to a neat rule of thumb for phony targets: any recipe that does not touch exactly $@ (the file named by the target, and nothing else) should have $@ marked as phony.
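In Makefile terms that rule looks something like this (target and file names made up):

```make
# `clean` never creates a file named `clean`, so it must be phony.
.PHONY: clean
clean:
	rm -rf build

# `app` writes to $@ and nothing else, so it stays a real file target.
app: main.o util.o
	$(CC) -o $@ $^
```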
I find that keeping track of phony targets with a list makes things much easier.
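A minimal sketch of that pattern (the variable name is my choice):

```make
# Declare all phony targets in one list, then mark them in one place.
PHONY_TARGETS := all clean test lint help
.PHONY: $(PHONY_TARGETS)

help:
	@echo "Phony targets: $(PHONY_TARGETS)"
```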
What does it get you other than the ability to print the list of all phonys?
Build systems tend to commit one or more of the following sins:
* Too basic: Once you try to build anything beyond a toy, it quickly becomes chaos.
* Too complicated: The upfront required knowledge, bureaucracy, synchronization and boilerplate is ridiculous. The build system itself takes an order of magnitude more disk space and memory than the thing being built.
* No standard library (or a substandard one that does things poorly or not at all): You must define everything yourself, leading to 10000 different incompatible implementations of the same build patterns. So now no one can just dive in and know what they're doing.
* Too constricting: The interface wasn't built as a simple layer upon an expert layer. So now as soon as your needs evolve, you have to migrate away.
* Too much magic: The hallmark of a poorly designed system. It doesn't have to be turtles all the way down, but it should be relatively close with few exceptions.
* Cryptic or inconsistent syntax.
- Too basic: At least half of the software I use just uses plain makefiles and maybe a configure script. No autotools. I optionally run ./configure, and then make and make install, and it just works. I definitely wouldn't consider my setup to be a toy by any stretch of the imagination. It's built out of smaller programs that do one thing and one thing well.
- Too complicated: I don't know, I think make and how it works is really easy to understand, at least to me. I guess everyone's had different experiences. Not necessarily your case, but I think it's usually because people had bad experiences they blamed make for, while trying to build some complex project that either had a bad build setup itself (not make's fault) or that they attempted without the requisite knowledge.
- No standard library: It's supposed to be tooling agnostic, which is what makes it universally applicable for a very wide range of tools, languages, and use cases. It's viewed as a feature, not a bug.
- Too constricting: I'm not sure what you mean here, it's designed to do one thing and one thing well. The simple layer is the dependency tracking.
- Too much magic: Cryptic or inconsistent syntax: See 'Too complicated'
The only general-purpose build system that spans all these languages is `make` or systems that target `make` (e.g., CMake). And this sucks because `make` sucks. And `make` sucks because:
But `make` used right is quite good. We're really lucky to have `make` for the lowest common denominator. https://pubs.opengroup.org/onlinepubs/9699919799/utilities/m...
As a smaller contender, my personal favorite is the Mill build tool (written in Scala), that is basically what a build tool should be, it’s as close to a theoretical perfect as possible. I really advise reading the blog post by its author Li Haoyi: https://www.lihaoyi.com/post/SoWhatsSoSpecialAboutTheMillSca...
The fact that Make can't even do subdirectories sanely is kind of ridiculous.
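The standard workaround is recursive make, which is exactly the kind of brittle arrangement people complain about (directory names here are hypothetical):

```make
SUBDIRS := lib app

.PHONY: all $(SUBDIRS)
all: $(SUBDIRS)

# app links against lib, so order the sub-makes explicitly
app: lib

$(SUBDIRS):
	$(MAKE) -C $@
```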
Does anyone know of anything better than Make? There's Ninja but it's not designed to be written by hand.
[1] https://github.com/casey/just
Xmake https://xmake.io/ for C and C++ (I haven't used it for anything serious yet), and Buck 2 https://buck2.build/ if you need a really complex build system. Both of these cache build artifacts and can do distributed builds (with more or less complex setup).
Of course the chaos is not caused by, "very hypothetically" let's say, a compiler or maybe a language without modules.
How would you estimate that? 20%, 40%, or 70%?
It's just that in some languages and build systems (Node, Maven), we have abstracted this away by calling them plugins, and they probably come from the same group that made the library you need.
No such plugin system exists, as far as I am aware, for makefiles.
There aren't many widely used build systems that can handle such dynamic dependencies without some special "magic" for them; the only one I know of with a significant number of users (so not Shake) is Buck 2. (Bazel and all the C++ build systems use special magic that you can't write in user rules.)
- Too basic: Makefiles are not.
- Too complicated: They can be, depends on what you make them to be.
- No standard library: Well, there is one: there are some built-in functions you can use in the makefile.
- Too constricting: Haven't noticed that, so I would say no.
- Too much magic: Hmm, I don't see it. It is very clear what is a target and a dependency and so on. Not so magical.
- Cryptic or inconsistent syntax: Yeah, definitely could be better. Even a plain JSON file would be better here.
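For instance, a couple of those built-in functions in action (GNU Make; paths are made up):

```make
# Collect all .c files and map them to .o files with built-in functions
SRCS := $(wildcard src/*.c)
OBJS := $(patsubst src/%.c,build/%.o,$(SRCS))

build/%.o: src/%.c
	$(CC) $(CFLAGS) -c $< -o $@
```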
I will show how Make hits every one of your complaints:
(sarcasm on)
in file hello.c:
How to compile and run this? We need a build system! Download and install GNU Make. When that step is complete:
Type in
make hello
and it's done. Now, run via ./hello
See: too much magic (didn't even have a makefile or Makefile), no standard library, too constricting, cryptic, too basic. And, because you had to install Make, too complicated. Hits every one of your objections.
(sarcasm off)
Deleted Comment
No smarts. It's just a collection of snippets with a few variables. "make run", "make test", "make lint", that kind of thing.
"make recent" = lint then run the most recently modified script.
You could do the same thing with Bash or other shells, but then you get stuck into Developer Land. Things are so much more complicated, without giving extra value. Make is just a DSL saying "files like this, are made into files like that, by running this command or two". That's it.
This is incredibly powerful!
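As a sketch of that "files like this into files like that" idea (pandoc here is just a stand-in for any converter):

```make
# "files like build/%.pdf are made from files like %.md, by running pandoc"
build/%.pdf: %.md
	pandoc $< -o $@
```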
Nicely put.
Decades ago I wrote a testing framework in Java where you could specify your tests and their dependent classes using make-like syntax. So you could have a set of test classes which define the "baseline suite", then another layer of test classes which depends on the above and only runs if the above is successful, and so on.
I really do not understand why folks today make everything so complicated. My advice has always been: stick to standard Unix tools and their way of doing things (tested and proven over time), unless you run into something which absolutely could not be done that way. Time is finite/limited and I prefer to spend it on system/program design, modeling, structure, patterns etc., which are what is central to problem-solving; everything else is ancillary.
Running `make test` and knowing it will work, regardless of the stack, language, repo is a huge lifesaver.
The problem with make isn't make; it's that whatever make is calling usually doesn't work that way anymore. On my last project we had a makefile with 4 main commands: build, test, frontend, deploy. Build and test called through to maven, frontend called npm, and deploy called docker + aws.
All of those tools do their own internal state tracking, caching and incrementality, and don't report what they've done, so it's not possible to write a rule that says "only deploy if build has been updated", because maven/cargo/dotnet/npm/go don't expose that information.
Or just with a simple command runner like just.
https://just.systems/
Maybe his Makefiles aren't complex, nor do they seem to follow all the best practices invented by code gurus in sandals, but it works and, what's important, it works for him.
1) It's already installed practically everywhere
2) It reduces your cognitive load for all sorts of tasks down to just remembering one verb which you can reuse across multiple projects, even if the implementation ends up differing a bit
3) In conjunction with the similarly ubiquitous SSH and git, you have everything you need to apply the basic principles of DevOps automation and IaC
There's something special about waking up one day with an idea, and being able to create a fresh git repository where the first commit is the Makefile you've had in your back pocket for years that scripts everything from environment setup to deployment to test automation to code reviews.
There's zero effort beyond just copying your single file "cookbook" into that new repo.
* All commands in one place, view them all with `just --list`
* Stupid-simple format
* Small standalone binary
* Configurable (with arguments, environment variables etc) but not _too_ configurable
When I see a git repo with a Makefile, I'm filled with dread. When I see a repo with a Justfile, I get warm fuzzies.
Some people say it just doesn't do enough to justify existing. These people are just wrong.
Sure, but `./build.sh dev` is a bit less obvious than `make dev`.
Another reason to use Make even if you don’t have any non-phony steps is that you can add those later if needed. (I agree that the author should mark {dev,build,deploy} as phony though.)
There are many reasonable scenarios why you might want to do this: determining at the point of calling make which targets to force or deactivate for safety, projects with nested or external makefiles not directly under your control, reuse of MAKECMDGOALS throughout the makefile (including propagation to submakefiles), ...
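For example, MAKECMDGOALS lets the makefile react to what was asked for on the command line (a sketch; names are my choice):

```make
# React to the goals the user requested:
ifneq (,$(filter deploy,$(MAKECMDGOALS)))
CFLAGS += -DNDEBUG          # e.g. force release flags when deploying
endif

# Propagate the same goals into a sub-makefile:
.PHONY: subproject
subproject:
	$(MAKE) -C subproject $(filter-out subproject,$(MAKECMDGOALS))
```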
They could make technical improvements to their own Makefiles, sure. But it's more about being able to enter a project and have a consistent experience in "getting started".
I'd say putting the Makefile content in `package.json` would be more consistent, especially as they are already using Gulp as the build system.
Make does dependency tracking relatively well (for 1976). But if you just want to run some commands, your shell already does that just as well, without any of the caveats that apply to make.
If anything, it's an argument for making better use of make's own features for configuration in the first place.
That said, I used to write makefiles like this all the time, but have since switched to just and justfiles in recent years which make this the default behavior, and is generally simpler to use. Things like parameters are simpler.
https://github.com/casey/just
This criticism might make sense for some non-vim editor because you might have to ssh into a remote location where you can’t install stuff. But if you should be able to build a project and thus install its required dependencies, then you might as well add one additional word to the install command.
Too late of course.
I feel like we're due for some kind of newfangled coreutils distribution that packages up all the most common and useful newfangled utilities (just, ripgrep, and friends) and gets them everywhere you'd want them.
I agree fully with the OP, in particular I find it smart that he wraps anything in a top-level makefile, even if other, more sophisticated build tools are used. The advantage is standardization, not having to remember anything and to know that if you wrote it, you will just be able to type "make" and it will work.
Let's say a C person wants to compile a Rust project, they would not have to look up how cargo works, but could simply type "make" (or "gmake"; I don't use GNU specifics, but try to be POSIX compliant, even if it is certainly true that almost 100% of makes are gmakes).
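Such a top-level wrapper can be a few lines (a sketch; the target names are my choice):

```make
.PHONY: all test
all:
	cargo build --release
test:
	cargo test
```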
Thanks for proposing the use of the timeless "make" as a sort of top-level build system driver; this will probably still work in 250 years.
Make on its own is great, but most of the time I've worked with C projects it's been cmake/autotools plus global pkg installs, which you *do* have to frequently look up.
Except if you want to use some specific feature. Or specific log level. Or build a specific crate in a workspace. Or...
https://github.com/casey/just
Avoids lots of weird makefileisims
Deleted Comment
Also, the updated PDFs for my course materials are automatically put into my website's source tree by another Makefile, the one I use to manage and build my teaching materials. It lets me either build the PDFs I use from my LaTeX sources, or build alternate versions of the materials for my students from the same sources (without solutions to the lab session exercises, for example), and automatically publish those to my local copy of the website, to be uploaded whenever I want to deploy the updated site.
It's kind of Makefiles all the way down. I like Makefiles! =)