I wish we had a culture that expected every new C/C++ build system to handle a non-trivial project like Boost or Qt (and their dependencies, like ICU and OpenSSL) before being pitched as the new best thing. It's trivial to make a build system that elegantly handles your own toy projects that follow your preferred style and structure. But the real world of C/C++ is a harsh place with a lot of variability.
This is wisdom right here. It's a tragedy of the commons.
I find that the real problem is no one wants to properly learn how their build system works. I don't care if it's make, cmake or bazel -- whatever it is, you need to _learn_ it. I've worked with folks that have 20 years of experience, fantastic C/C++ developers, that look at a makefile and say "Ugh what is this complicated mess, can you do it in cmake or bazel or something" and expect a silver bullet wherein the makefile build will somehow transform itself into a self-describing, intuitive build system by virtue of some sort of hype-osmosis.
> I've worked with folks that have 20 years of experience that look at a makefile and say "Ugh what is this complicated mess, can you do it in cmake or bazel or something"
This is so true; it has happened to me more than once.
A couple of projects ago, we had a complicated build process (7-8 manual build steps, each depending on files generated by the previous ones) for an embedded system.
I wrote a little makefile, deleted all those 7-8 shell scripts, and was asked to redo it in cmake. I was like, wtf. Each clearly defined step in the makefile would turn into multiple unreadable function calls in cmake. Why would anyone want to do that?
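A minimal sketch of the shape of it, with made-up file and tool names; each generated file is just a rule that depends on the previous step's output:

    gen/config.h: board.yaml tools/genconfig.py
    	@mkdir -p $(@D)
    	python tools/genconfig.py board.yaml > $@

    build/app.o: src/app.c gen/config.h
    	@mkdir -p $(@D)
    	$(CC) -Igen -c -o $@ src/app.c

    firmware.elf: build/app.o
    	$(CC) -o $@ $^

Each step is one rule, make figures out the order, and a change to board.yaml rebuilds exactly what depends on it.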
Not that Makefiles are perfect, but sometimes the right tool for the job isn't the shiniest one. Make does the job of being "good enough" for a lot of little tasks.
Like many C++ devs, I hacked around in cmake for years without really understanding what I was doing or how to structure cmake projects. This was made worse as newer ways of doing things came into being.
If this sounds like you, do yourself a favor and go through these slides (or watch the talk they came from):
https://github.com/boostcon/cppnow_presentations_2017/blob/m...
It really clarified things for me, while avoiding going into too much detail. You will definitely need more info as you go along, but you can look those things up in the docs. This presentation does a good job of showing the core essentials that you can build your knowledge on later.
The problem, in my experience, is that you invest time learning build system A, then a year later build system B comes out, and not only do you need to relearn a bunch of stuff, but build system B often does some of what system A did but not all of it, plus new stuff you've never encountered before. Then this cycle repeats, endlessly, and every new team you join has adopted the newest build system.
Granted, some ecosystems are worse than others here. In the JavaScript world it went something like: make/grunt/bower, gulp/webpack, esbuild, parcel, vite, rollup, and on and on it goes.
Even in the conservative Java ecosystem we've been through ant, maven, groovy/gradle...
Most of these tools offer incremental improvements at a huge learning cost. It's a nightmare.
I think there's a good reason people are moving to Rust: aside from the merits of each language, the tooling for using packages is just so much better in Rust.
This is sort of unrelated, but it reminds me that one of my biggest issues with learning C++ was how I was expected to deal with libraries (particularly on Linux, where conventions differ even between distros) and with building the project. Most guides or what have you sort of teach you how to compile a file or two, but you quickly run into issues that are difficult for a complete beginner to solve without a direct source of feedback.
Every time I've tried to dabble in C++ I've had the same horrible experience.
I end up "Randomly" stabbing at things until it works just well enough to get that particular thing done then dropping it all because it was such a painful experience.
Compared to something like cargo, which works really well, C++ and its build tools just feel flaky.
It may be that I'm just missing a mental model to get to grips with it, but no other major programming language is like that, in my experience.
In behavioral science this phenomenon is called "learned helplessness": the rabbit will not flee the cage even with the door propped open and no one around.
Theon Greyjoy in "Game of Thrones" exhibited this condition.
It is a thing to overcome.
I wish somebody would write a book. Even an online book. An all-encompassing book just about how to link/build C++ projects and all the different solutions people use and how they work in practice. Common ways to organize and configure C++ project builds. How Linux distros each differ in where libraries are stored and how to find and link them. Common issues and how to fix them. But it also needs to convey how all these things work, and help you build a mental model that lets you find your own solutions.
You can specify the library to link in a header file. That way, if you include the header file, the library mentioned will automatically get linked, as long as it is somewhere in the library search path.
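For context, this is MSVC's #pragma comment(lib, ...) mechanism. A minimal sketch with hypothetical names (mylib.h / mylib.lib):

    /* mylib.h -- hypothetical header for a library called mylib */
    #pragma once
    #pragma comment(lib, "mylib.lib") /* MSVC: ask the linker to pull in mylib.lib */

    int mylib_do_something(int x);

Any translation unit that includes mylib.h then links against mylib.lib without the project's linker settings ever mentioning it.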
I have found myself wishing that GCC would also get something like this.
I remember back when I was programming in Delphi, I could link directly against a .dll: just take a function prototype from the .h file and translate it into a function declaration like this one:

    function I2C_GetNumChannels(out numChannels: Longword): FT_Result; stdcall; external 'libmpsse.dll';
and that was it; but to do this in MSVC you needed not only the .h header and the .dll itself, you also needed that stupid .lib file that, AFAYCT, had literally nothing inside it except symbol entries saying "no, load it dynamically from this .dll on startup, please". So it was a rather common source of amusement for Delphi programmers that, paradoxically, it was harder to link a program written in C against a DLL written in C than it was to link a program written in Delphi against a DLL written in C.
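The workaround on the C side, if you didn't have the .lib, was to load the DLL by hand with the Win32 LoadLibrary/GetProcAddress calls. A rough sketch in C; the signature is my guess, mirroring the Delphi declaration above:

    #include <windows.h>
    #include <stdio.h>

    /* Guessed C equivalent of the Delphi declaration above */
    typedef unsigned long (__stdcall *I2C_GetNumChannels_t)(unsigned long *numChannels);

    int main(void)
    {
        HMODULE dll = LoadLibraryA("libmpsse.dll"); /* load the DLL at runtime */
        if (!dll) return 1;

        I2C_GetNumChannels_t I2C_GetNumChannels =
            (I2C_GetNumChannels_t)GetProcAddress(dll, "I2C_GetNumChannels");
        if (!I2C_GetNumChannels) return 1;

        unsigned long numChannels = 0;
        I2C_GetNumChannels(&numChannels);
        printf("%lu channels\n", numChannels);

        FreeLibrary(dll);
        return 0;
    }

Which works, but it's a lot of ceremony compared to the one-line Delphi declaration.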
I wish that would work for all build settings, and be standardized across compilers. Even building complex projects with platform-specific build settings could then be reduced to a simple:
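Presumably something like this hypothetical one-liner, with every flag and library picked up from pragmas in the headers:

    cl main.cpp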
Oh, wow, I’ve been thinking for a while about implementing something similar for myself to be used with GCC/clang on Linux.
I suspected that someone might have done it before, but didn’t know of any implementation. I’ll take a closer look at Visual C++ (used it in the last millennium for work) before deciding how mine should work.
Has everyone forgotten about deps files? Run gcc -MD and it will create .d files that record the dependencies between your source (and header) files. You can then use an include directive in your Makefile to pull that information in for make to use. There are a couple of variations on the theme; some people recommend putting the .d files alongside your source files, others recommend a specific “deps” directory for them, etc. See the man page for details, with particular reference to options like -M, -MM, -MF, -MD, and -MMD.
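A minimal sketch of the pattern in GNU make, assuming plain C sources in the current directory (recipe lines must begin with a literal tab):

    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)
    DEPS := $(OBJS:.o=.d)

    app: $(OBJS)
    	$(CC) -o $@ $^

    %.o: %.c
    	$(CC) -MMD -MF $*.d -c -o $@ $<

    # Pull in the recorded header dependencies; '-' ignores .d files
    # that don't exist yet on the first build.
    -include $(DEPS)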
Of course, the other alternative is to simply #include _every_ file in your project into a single source file, then compile that. It’ll probably be faster than anything else you do, and eliminates several other foot-guns as well. And it means that your build script can just be a shell script with a single line that runs the compiler on that one file.
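That's the "unity build" approach. A tiny sketch, with hypothetical file names:

    /* unity.c -- the only file the compiler ever sees. */
    /* The whole build script: cc -O2 -o app unity.c    */
    #include "parser.c"
    #include "codegen.c"
    #include "main.c"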
But these days I greatly prefer Rust, where building is always just “cargo build”. Doesn’t get much easier than that.
This sounds like it would be fine for code that you write yourself. But if you're only compiling code you wrote then C++ build systems are pretty trivial. The hard bit is dependencies.
One big lesson that newer languages like Go and Rust seem to have learned is that the tooling, building, and dependency management need to be dictated as part of the language ecosystem. Dealing with tons of other C++ projects written by other people (even in the same company) - how to specify dependencies, where their build artifacts can be found, etc. - is a HUGE pain in the ass and a consumer of my time.
They all get the tooling wrong though because none can stand the idea that you might want to mix languages, or add their new language to an existing project with existing tooling.
Java, JavaScript, and Ruby didn't do this, and yet they all have solid build and dependency management stories. Java and JavaScript have even managed to have multiple build and dependency management tools existing at once without there being fragmentation and ruin. So clearly, having those tools dictated by the language is not essential.
I'm not sure why C and C++ have such a bad story here. Some combination of greater intrinsic complexity (separate headers, underspecified source-object relationships, architecture dependency, zillions of build flags, etc), a longer history of idiosyncratic libraries which people still need to use, the oppressive presence of distro package managers, and C programmers just being gluttons for punishment, probably.
Dependencies fit into this model, too. Presumably each dependency builds wherever it came from; do that, package the output, and put it somewhere your project can use it in the suggested manner.
The point is that the tool is opinionated and demands that this be the case for projects that work with it. Not that the author believes all .cc / .h files work that way.
Your use case would be served by C23's #embed [1]. The same thing has been proposed for C++ but repeatedly kicked down the road, because the standardisation committee wanted to make it more general even though no one had any demand for that, so they didn't know what it would look like. (C++ standardisation in a nutshell.)
[1] https://thephd.dev/finally-embed-in-c23
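For illustration, a minimal C23 sketch, assuming a data file named default.yaml:

    /* C23: embed the bytes of default.yaml straight into the binary */
    static const unsigned char default_yaml[] = {
    #embed "default.yaml"
    };
    static const unsigned int default_yaml_len = sizeof default_yaml;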
So, you do things in a way that would not support this approach. She’s not saying “this is always how it’s done”, she’s explaining what practices you would need to commit to in order for her approach to be viable:
> “If you want something like this to work, you have to commit to a certain amount of consistency in your code base. You might have to throw out a few really nasty hacks that you've done in the past. It's entirely likely that most people are fully unwilling or unable to do this, and so they will continue to suffer. That's on them.”
I end up "Randomly" stabbing at things until it works just well enough to get that particular thing done then dropping it all because it was such a painful experience.
Compared to something like cargo which works really well, C++ and it's build tools just feel flaky.
It may be that I'm just missing a mental model to get to grips with it, but no other major programming language is like that from my experience.
Looking for something that is still alive in 2023
What on earth.
FD: CMake developer
> Of course, the other alternative is to simply #include _every_ file in your project into a single source file, then compile that.
Yeah, no... recompiling the entire project whenever any file is touched is way too slow for any non-trivial project.
Not always the case; I have a project with ... and a default.h containing this, used in the main code to print the contents of the yaml file to stdout.
https://bpt.pizza/