soapdog · 7 years ago
A developer posts a well-rounded tool with documentation, built with a lot of care. The comment thread is mostly about other tools, or people dismissing the work because another available option is more popular or common.

When did "Show HN" threads became shark tank? Can people at least check and post about the tool itself instead of discussing CMake vs Ninja vs Meson?

fnord123 · 7 years ago
>When did "Show HN" threads became shark tank?

From the 2007 announcement of Dropbox:

https://news.ycombinator.com/item?id=8863

""" 1. For a Linux user, you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software. """

DataWorker · 7 years ago
Does anybody actually use Dropbox anymore? Feel like it’s a dead product.
ranit · 7 years ago
You may accomplish the behavior shift you desire by posting about the tool yourself. Then people will start supporting, disagreeing, posting questions, etc. Bashing the behavior might not work well.

People tend to stick to familiar topics ... as shown by how the next Meson comment turned into Python2-vs-Python3 compatibility on a whim.

waruqi · 7 years ago
Yes, I agree very much.
soapdog · 7 years ago
I really want to use your tool in a future project. I know Lua well enough, and being encumbered at work with a legacy project that requires a crazy build pipeline using gyp and friends makes me eager to try something else.

Thanks for trying to improve the ecosystem around native code building tools. I appreciate it a lot.

Fnoord · 7 years ago
If you post on Show HN you can expect feedback and discussion about related tools (e.g. in recent threads about Ghidra, comparisons with IDA Pro and Radare were made). It is up to the readers to value and moderate the discussion(s).
2bluesc · 7 years ago
Meson[0] has been gaining in popularity and has migration tools for cmake projects.

Large projects such as systemd and GNOME[1] have migrated or have been migrating for years.

[0] https://mesonbuild.com/

[1] https://wiki.gnome.org/Initiatives/GnomeGoals/MesonPorting

waruqi · 7 years ago
But the Meson installation has several dependencies and is not very lightweight; it needs Python and Ninja. xmake has a built-in LuaJIT and no third-party dependencies, so it is more lightweight.
shaklee3 · 7 years ago
Python is preinstalled on any major Linux distro. Ninja is tiny, so those don't seem that heavyweight.
jandrewrogers · 7 years ago
I can't think of a context in which Meson ever felt heavyweight to me. Python is already installed on every system and ninja is quite small.
flohofwoe · 7 years ago
AFAIK Meson requires a python installation, which is a non-trivial dependency and on some platforms requires additional manual setup steps.

Having everything in a single standalone executable is vastly preferable IMHO.

alexeiz · 7 years ago
Depending on Python is actually much better than depending on other runtimes, such as Java. Firstly, Python is everywhere now. And secondly, you can just 'pip install --user' it and use it even without root privileges (which is a great thing if you're in a restricted corporate environment).
philpem · 7 years ago
It's pretty easy to install Python on Mac (use Brew) or Windows (use WinPython), and most Linux distros include it by default... running an installer is hardly an onerous task.
rienbdj · 7 years ago
The CMake migration tools are not very good.

Meson has no glob support.

Meson does not support any form of remote caching.

blattimwind · 7 years ago
Meson doesn't seem to be a significant improvement over CMake beyond syntactic sugar. It uses the exact same, unreliable models as CMake, just with a slightly nicer-seeming syntax and about 242 more dependencies.
NikkiA · 7 years ago
I really like Meson, but I've found that the corner cases where the simple syntax doesn't work get really hairy really fast, e.g. producing VST plugins, or linking against something that doesn't have deep integration with pkgtool or such.
pfultz2 · 7 years ago
Meson looks nice, but it still lacks a way to tell it where your dependencies are installed (like CMake's CMAKE_PREFIX_PATH). You can try to get by by setting the pkg-config path, but that doesn't help for dependencies that don't support pkg-config.
waruqi · 7 years ago
You can try xmake's dependency package management.

    add_requires("libuv master", "ffmpeg", "zlib 1.20.*")
    add_requires("tbox >1.6.1", {optional = true, debug = true})
    target("test")
        set_kind("shared")
        add_files("src/*.c")
        add_packages("libuv", "ffmpeg", "tbox", "zlib")

geezerjay · 7 years ago
> a way to tell it where your dependencies are installed(like cmake’s CMAKE_PREFIX_PATH).

That's not how dependencies are discovered in CMake. Dependencies are added with calls to find_package, and if you have to include dependencies that don't install their CMake or even pkg-config module, then you add your own Find<dependency>.cmake file to the project to search for them, set up targets, and perform sanity checks.
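
For illustration, a minimal sketch of that pattern, assuming a hypothetical dependency "Foo" that ships neither a CMake package config nor a pkg-config file ("Foo" and "myapp" are placeholder names):

    # cmake/FindFoo.cmake -- hypothetical find module for a library "Foo"
    find_path(Foo_INCLUDE_DIR NAMES foo.h)
    find_library(Foo_LIBRARY NAMES foo)

    include(FindPackageHandleStandardArgs)
    find_package_handle_standard_args(Foo DEFAULT_MSG Foo_LIBRARY Foo_INCLUDE_DIR)

    # expose the result as an imported target
    if(Foo_FOUND AND NOT TARGET Foo::Foo)
        add_library(Foo::Foo UNKNOWN IMPORTED)
        set_target_properties(Foo::Foo PROPERTIES
            IMPORTED_LOCATION "${Foo_LIBRARY}"
            INTERFACE_INCLUDE_DIRECTORIES "${Foo_INCLUDE_DIR}")
    endif()

    # CMakeLists.txt: make the module visible, then use it like any other package
    list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")
    find_package(Foo REQUIRED)
    target_link_libraries(myapp PRIVATE Foo::Foo)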

ridiculous_fish · 7 years ago
Why migrate from CMake to meson?
kstenerud · 7 years ago
CMake feels a lot like C++89: Lots of things you can do, but there are problems:

* No standardization or opinionated design, so you can't share your work easily.

* No sane defaults, so your build system is always fragile, difficult to maintain, and done wrong.

* No best practices, so people keep making the same mistakes over and over.

* Misguided attempt to remain compatible with the steaming pile of legacy they've accumulated over the years.

* Bad documentation, so there's no way to learn how to do things better.

* Steep learning curve with limited payoff, so most people don't bother.

Meson does some of these things better. It's still not pretty, but it's nicer to use than CMake.

IshKebab · 7 years ago
Nobody who has used CMake would ask that, so I assume you haven't!

The answer is that CMake is mad and full of gotchas. Think of it like the PHP of build systems. Here is a classic example:

https://cmake.org/cmake/help/latest/command/if.html#variable...
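
Roughly the kind of surprise that page documents -- a sketch with made-up variable names, showing the old (pre-CMP0054) behaviour:

    # implicit variable expansion inside if()
    set(MODE "Release")
    set(Release "0")    # an unrelated variable that happens to be named "Release"

    # Before policy CMP0054, the quoted string "Release" on the right-hand side
    # is itself dereferenced as a variable, so the test effectively becomes
    # "Release" STREQUAL "0" and evaluates to FALSE -- not what it looks like.
    if(MODE STREQUAL "Release")
        message(STATUS "release build")
    endif()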

jandrewrogers · 7 years ago
In my experience the primary reason people migrate is that it is significantly simpler for ordinary developers to maintain and configure builds in Meson. They both target ninja but the learning curve for CMake is definitely steeper.
TheCabin · 7 years ago
Oh, the number of hours of my life I spent debugging CMake files of third party libraries.

You are very fortunate if you import libraries that just work. This is also true for "modern" CMake.

sandGorgon · 7 years ago
What about Bazel? https://bazel.build/
IshKebab · 7 years ago
Bazel is very slow and heavy and only Google uses it.
claudius · 7 years ago
What I miss about these tools is some "relatively" straightforward dependency detection and generation.

That is, I have a bunch of .cpp files which need to be compiled into individual executables in a folder bin/. I also have a folder inc/ which contains some headers (.h) and those headers possibly also have some associated TU (.cpp).

Now g++ can already generate a dependency graph of headers for an executable. It is then (with a standard Makefile and some supporting bash scripts) quite straightforward to mangle that graph into a list of translation units (namely those files whose name matches a depended-on header) which must be compiled and linked into the executable.

That is, I can simply create a new "executable file" .cpp file in bin/, include some headers therein and when I say make, the Makefile automagically figures out which (internal) dependencies it needs to link in when linking the executable.

Now that I have these "relatively straightforward" scripts and the corresponding Makefile, the incentive to move to another (nicer) build system which would require me to rebuild this infrastructure to fit into this other build system's view of the world is quite low – unless there is some way to do this directly?

Xmake as shown here (and also Meson, linked in a sister comment) appears to still require manual selection of dependencies.

JdeBP · 7 years ago
> Now g++ can already generate a dependency graph of headers for an executable.

Actually, it cannot; and this should be well known. In practice it emits less than half of the information that it learns from path lookup and that a build system really needs to know.

* https://news.ycombinator.com/item?id=15060146

* https://news.ycombinator.com/item?id=15044438

waruqi · 7 years ago
Xmake can only simplify the maintenance and management of dependencies and improve usability and maintainability; it cannot fully achieve what you describe.
humanrebar · 7 years ago
Your workflow is one way to build and link, but not the only way. I might want to build several of those TUs into a static library, link against it, and ship it alongside a few executables.

And when it comes to creating a library, it's difficult to infer which TUs should be pulled in or left out, since you'd need to see at least representative samples of how the library is used.

flohofwoe · 7 years ago
How would a tool know, from a header dependency, in which source file the implementation for that header lives? C and C++ don't require any relationship between a declaration file and an implementation file. The implementation could be in an entirely differently named source file, spread over various files, mixed with implementation code from other headers, or included right in the header.
stkdump · 7 years ago
In the common two-step compilation model of C and C++, this information is not needed. When generating object files it is not relevant which .c/.cpp file corresponds to which header file, because that correspondence is not an input to the step. Linking has to happen whenever any object file has changed.
adrian_b · 7 years ago
Generating the dependencies automatically is trivial with gcc and GNU make, if you just take care to group files adequately in directories and subdirectories.

I.e. you just have to put all the source files whose object files will go into the same library in a set of directories that does not contain files that will not go there.

Similarly, all the source files for the object files required for an executable, except those that are in libraries, should be in a set of directories.

The sets of directories need not be disjoint; a given set just must not contain files that have to be excluded when linking a certain target, as that would make the build process more complex.

Given these constraints, it is possible to write a universal GNU makefile usable for any project, which will generate all dependencies automatically.

For any executable or library you want to build, it is enough to write an extremely small makefile containing 4 lists (of defined symbols, of source files, of directories with header files, and of libraries), plus the name of the generated file and its type (executable, shared library, static library).

At the end you need to include a makefile that is good for any project targeting a certain CPU + operating system combination.

The per-CPU/OS makefiles must define a few things, e.g. the compiler and other utilities used, option flags, locations of the tools and so on, and then include a single makefile common to all architectures and operating systems.

I started using this method more than twenty years ago and have never needed to write any dependency information manually.

Whenever I see the huge, intricate, impossible-to-maintain makefiles that are so frequently encountered in software projects, I wonder why anyone is willing to waste so much time on a non-essential part of the project.

From my point of view, easily building any large software project is a problem solved a long time ago by gcc & GNU make, but for reasons I cannot understand, most people choose not to do it the right way.

Of course, having to use, in 2019, a programming language that implements modules by no better method than including header files is even more difficult to understand, but I still must use C/C++ in my work, as there is no alternative for most embedded computers.
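
As an illustration of the scheme above, a hypothetical per-target makefile might look something like this (all names and paths are invented, and the shared makefile included at the end is assumed to exist elsewhere in the project):

    # the four lists described above, then the output name and type
    DEFINES      := USE_FEATURE_X
    SOURCE_DIRS  := src src/util
    INCLUDE_DIRS := inc third_party/inc
    LIBRARIES    := m pthread

    TARGET       := myprog
    # one of: executable, shared_library, static_library
    TARGET_TYPE  := executable

    # shared, project-independent rules for the chosen CPU/OS combination
    include ../make/common-rules.mk
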

geezerjay · 7 years ago
There's no way to achieve that with today's standard C++ as it requires metadata to access/infer package version numbers.

This will hopefully change with the introduction of C++ modules in C++20, but until then the best option available to C++ programmers is either manually managing third-party libraries or employing dependency management tools such as Conan.

claudius · 7 years ago
This is about a project's internal dependencies and is indeed outside the scope of the standard, which does not say that if a function is declared in file abc.h then it is defined in file abc.cpp, that this file is compiled into an object file abc.o, and that this object must then be linked into anything that includes abc.h.

However, just because this is (like most build-system questions) outside the scope of the standard does not mean that it's impossible to define project-internal rules about what gets compiled and linked into what, or that the build system cannot apply those rules to take work off the user's hands.

The few external dependencies my project has are installed semiautomatically before any compilation starts.

jhauris · 7 years ago
This is really great work, great documentation. It looks like CMake, but with a full featured scripting language.
waruqi · 7 years ago
thanks!
Game_Ender · 7 years ago
Does it have any distributed build or caching support? That is my minimum bar for a C++ build system. ccache and distcc/icecc are too limited; you want something integrated with your build system directly.
waruqi · 7 years ago
Distributed builds are being planned, but not yet implemented. You can see https://github.com/xmake-io/xmake/issues/274
RcouF1uZ4gsC · 7 years ago
For better or worse though, CMake has won! Many IDEs, including Visual Studio, can work directly with CMake files. In addition, even Google, which is famous for doing things its own way, has now added official, first-class CMake support to its open source C++ library Abseil: https://abseil.io/blog/20190402-cmake-support

If you are writing an open source C++ library, even if you support some other C++ build system, chances are you will have CMake support as well.

While I have no doubt xmake is easier to use than CMake (just having Lua instead of CMake's abomination of a language is a great improvement), the fact that so many libraries and tools already support CMake is going to make adoption an uphill battle.

geezerjay · 7 years ago
CMake won against the incumbent, which was autotools. Still, it's far from being an enjoyable tool, and the experience is made even worse by its god-awful docs.
shstalwart · 7 years ago
Personally, I vastly prefer autotools, both as a user and as a developer. When I got to the point where I needed some kind of build system, I found autotools much easier to learn than CMake.

As a user, I find the experience with autotools to be much nicer as well. For whatever reason, the interface just seems more intuitive. I mean, ./configure --help will tell you basically all you need to know. An underappreciated bonus is that you don't have to install more stuff just to *build* some program you might not even want. I've run into more than one project that required a specific version of CMake which, as luck would have it, was not the version packaged with my distro. That leaves you either building another CMake first or finding a tool/library that isn't so persnickety.

Given the choice between trying a project that uses CMake or autotools, I'll choose the autotools-based project every time.

de_watcher · 7 years ago
I find the docs to be fine...
kevin_thibedeau · 7 years ago
Now we just need a better DSL that can generate CMake files.
usrnm · 7 years ago
And CMake itself started as a simple DSL for generating Makefiles. We've gone full circle.
yoz-y · 7 years ago
I quite like CMake; I find it the least bad of the bunch. With recent additions, I'd say the only problem is that many packages still need you to pull a CMake module from somewhere in order to be found, because they do not offer pkg-config files.
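
For comparison, when a package does ship a pkg-config file, CMake can consume it without a hand-written find module -- a sketch, where "foo" and "myapp" are placeholder names:

    # use pkg-config from CMake instead of a custom Find module
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(foo REQUIRED IMPORTED_TARGET foo)
    target_link_libraries(myapp PRIVATE PkgConfig::foo)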
waruqi · 7 years ago
Here is the official dependency package repository for xmake: https://github.com/xmake-io/xmake-repo