Lots of people whine and complain about the set of build tools we often refer to collectively as ‘autotools’. That term usually covers autoconf, libtool and automake.
I think a certain amount of criticism is warranted against this family of aged tools: they are unix-centric, they have cryptic ways to control them (I think there’s a reason m4 macros are not widely used…) and they are several independent tools with a tricky mix of cross-breeding.
The upsides include them being well tested and fairly well known, there’s a wide range of existing tests written for them, they work fine when cross-compiling and they support out-of-source builds just fine.
But what about the alternatives?
I spend time in projects where the discussion of ditching autoconf comes up every once in a while, as surely as the sun will rise tomorrow. The argument is always that tool Z is much better and easier to deal with and that everything gets shiny if we just switch. Z has been a lot of different tools over the years, including CMake, scons, waf and cDetect.
The problem, as I always see it, and why I almost always argue against Z, is that autoconf is old, trusty, proven and I know it. The Z tool is often much newer and less proven, and fewer people involved in the project know Z, use Z or know how to customize it (since new tests will be needed and some tests will need to be changed etc). So even though Z is sometimes accepted as a testing ground in my projects, a year or two after Z was accepted – unless I myself have accepted it and joined its efforts – Z has lagged behind to the point where it isn’t good anymore, since I don’t know it and most people are rather fixing the traditional autoconf stuff. So we extract the Z support again.
But if we would never accept new tools we would never evolve, and yes indeed autoconf and friends have their share of flaws.
The question is of course when to switch – what kind of project, in what development state, etc – and which alternative is useful for a particular project. Being a developer who primarily works with plain C, mostly on low-level code and libraries, I will no doubt have a different view than those who use other languages, who do more “apps” or perhaps even GUI programming…
Can you help me point out good build system comparisons and overviews? I’ve tried to find good comparisons but failed. Just about all of them are written by the authors of one of these tools.
My ambition is to create some sort of comparison document myself. I think the comparison could include autotools, cmake, waf, scons, cdetect, qmake and ant. Any more?
(I got triggered to write this blog post after my post to the trio mailing list on this topic.)
I personally dislike autoconf and friends; typically they create configure scripts that are broken for cross-compiling, which I do a lot.
My solution is to organize my projects extensively. Every project creates one library. The headers go in an ‘include’ directory. The sources for the library go in the ‘src’ directory. Tool programs and test programs are single source files that are linked with the library and live in directories called ‘tools’ and ‘tests’ respectively.
Once you do this, it is easy to hand-create a Makefile for any unix-like platform. It is easy to create an Xcode project. It is easy to create a Visual C++ project. A 4-line qmake .pro file with wildcards can be used for qmake.
Once you lay out your project’s files in a logical way it is always easy to regenerate your project’s build script.
The reality is that a single tool cannot support all the platforms at once – the typical architectures needed are too diverse: Mac OS X universal binaries, iPhone, Windows 7 and various forms of embedded Linux.
I have seen autoconf/configure scripts break so often in these different configurations that I do not trust them anymore. Portability is hard and there is no silver bullet except to have a continuous integration tool performing compiles on all supported platforms.
–jeffk++
Wise words Jeff,
My experience is actually rather that autoconf is the best of the available options for making build systems that are cross-compile friendly. But admittedly that’s biased by my own skillset.
I think I _really_ got turned off autotools when I actually wanted to learn it and purchased the original book, and the simplest examples did not work! Plus it created configure scripts that had syntax errors on Mac OS X! Perhaps it has settled down by now. For my own projects I decided to depend on GNU Make and bash only, and created a “magic makefile” which does everything I need:
the latest version is at http://opensource.jdkoftinoff.com/project/projects/magicmake/wiki
and the previous version with more info is at:
http://opensource.jdkoftinoff.com/jdks/trac/wiki/MagicMakefileV5
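The wildcard idea such a makefile builds on can be sketched in a few lines of GNU Make – a simplified illustration only, not the actual magic makefile, and it assumes the include/src/tools layout described above:

```make
# Simplified sketch, not the real magic makefile.
LIB_SRCS  := $(wildcard src/*.c)
LIB_OBJS  := $(LIB_SRCS:.c=.o)
TOOL_SRCS := $(wildcard tools/*.c)
TOOLS     := $(TOOL_SRCS:.c=)
CFLAGS    += -Iinclude

all: $(TOOLS)

lib.a: $(LIB_OBJS)
	ar rcs $@ $^

tools/%: tools/%.c lib.a
	$(CC) $(CFLAGS) $^ -o $@
```

Because the source lists come from $(wildcard), adding a new source file needs no edits to the makefile at all.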
By the way, I recently had to do some Java work and ended up using Maven2 – a big departure from normal build systems. Unfortunately it is very tied to Java, but the kinds of things it does are very worthwhile: managing both direct and transitive dependencies, source code layout by convention, no need to modify any files when you add source files, etc. Plus it integrates amazingly well with continuous integration tools like Hudson. There is nothing like it for C and C++ projects.
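For anyone curious what that convention-driven setup looks like, a minimal pom.xml is roughly this (the group and artifact names are hypothetical):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>demo</artifactId>
  <version>1.0-SNAPSHOT</version>
  <dependencies>
    <!-- Maven downloads this jar and its transitive dependencies -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.8.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```

Sources are found by convention under src/main/java and src/test/java, so the file never lists them.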
jeffk
A magic makefile is really hard to get working with 35 different unixes (where each installation can have a myriad of different installation subtleties etc), some of them pre-POSIX. I’ve managed to get my autoconf solutions to do it (in the cURL project).
I’d say that autoconf’s weakest points are its arcane syntax and the fact that it requires a POSIX shell etc.
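For the record, the arcane syntax in question looks like this – a minimal, hypothetical configure.ac where the particular checks are just examples:

```
dnl Everything here is m4; the [brackets] are quoting, not arrays.
AC_INIT([myproject], [1.0])
AC_PROG_CC
AC_CHECK_HEADERS([sys/socket.h])
AC_CHECK_FUNCS([poll])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

Running autoconf expands those macros into the many-thousand-line POSIX shell configure script that end users actually run.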
But of course, each project has its own set of needs and requirements and what suits one project perfectly might not at all suit the next…
The system I’ve had the best results with is autotools. As in Daniel’s case I am biased, since I know autotools better than the others.
Getting to your idea about writing a comparative text on the various build systems: it’s a great idea and a needed one.
I think a good starting point would be to start listing the requirements on such a build system.
* easy to use for end user
* cross compiling possible
* easy to split configuration files and programs into separate dirs in the resulting installable file
stuff like that, perhaps a bit more specific 😉 Have you started such a list?
/hesa
… it doesn’t take long before one comes to the insight that different programs (or rather languages) and target platforms may have different requirements. But we can twist that around again and consider the quick increase in speed/memory of new devices, so I think we have to assume that all programs will at some time execute on a “small device”.
No, I haven’t started on this yet. I’ve mostly been thinking about it so far.
Ideally, it would be nice to put together a long list of requirements and properties of various build/config systems and then basically figure out how all competitors do in each of the listed aspects. I mean, I think we must accept that there will be no obvious winner for everyone, but done right a comparison document would still help everyone interested in this.
What I found is that the most important thing is to organize your project in such a clean way that you could even compile and link it with a single gcc command line if you had to, for instance something like this:
gcc -Iinclude src/*.cpp src/*.c tools/httpd.cpp -o httpd
The magicmakefile by definition only works for platforms that I care about.
I have yet to see autotools work for compiling a project in the following situations:
* an application for the Cell processor with tasks running on the PPE as well as the SPEs (a different compiler is required for each)
* software compiled by various DSP compilers where you really do need to have different optimization options set on different source files in order to have a correct executable – and not just to avoid compiler bugs either.
* software that works on 64-bit Windows, compiling with the Microsoft SDK – which is required since GCC/G++ v4 is not yet compliant with the requirements of WIN32 and WIN64, most importantly S.E.H. (structured exception handling)
* software running on iPhone, where the executable not only needs to be cross-compiled but also needs to be cryptographically signed.
* software running on Mac OS X, Debian Linux, Red Hat Linux or Windows, where you need to create a full-featured pkg, deb, rpm or msi installer file.
* software running on Mac OS X where you not only need to do cross-compiling, but the cross-compiling for multiple platforms is done by the same gcc invocation, i.e. ‘-arch i386 -arch x86_64 -arch ppc -arch ppc64’ – but in this case you must use different tools to manage static libraries.
In all these situations, the best practice is to make your own hand-built makefile, Xcode project, WiX project or Visual Studio 2008 solution file – so it is best to organize your project so that it is very simple to do any of these things. What I did with the magic makefile is make the typical Linux/Mac OS X command line tools and libraries automatic; any other special-case platforms are easy to manage thanks to the simple project layout.
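As an illustration of that last universal-binary point, the commands look roughly like this – the file names are hypothetical, it requires Apple’s toolchain, and note that this is Apple’s libtool, not GNU libtool:

```
# One invocation compiles all four architectures into a fat object file:
gcc -arch i386 -arch x86_64 -arch ppc -arch ppc64 -c src/foo.c -o foo.o
# Apple's libtool builds the static library from fat objects:
libtool -static -o libfoo.a foo.o
# lipo inspects (or splits and merges) the per-architecture slices:
lipo -info libfoo.a
```

That libtool/lipo step is exactly the kind of platform quirk a generic build system tends not to know about.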
Regards,
Jeff Koftinoff