How and why we ported our build to GNU Autotools

Wed 10 April 2013
By Tom Parkin

If you're a source customer for ProL2TP you'll notice that the 1.5 release incorporates a new build system. Where we used to use hand-crafted makefiles, we now use GNU Autotools to configure and build ProL2TP. This has been a very positive move for us, although not one without its challenges! In this blog post I'm going to look at how we managed the move to Autotools, and the benefits it brought us as a development team.

Why the change?

The previous ProL2TP build suffered from a few rough edges, like most build systems do. Nothing too serious, but enough that we felt the build could use a bit of love. And if we were going to give it that love, it seemed worthwhile considering porting to a build tool of some description -- the more functionality we could leverage in the build tool, the less we'd need to code up ourselves in our makefile.

If you've done any research into build tools for Linux, you'll know there are lots to choose from. And in some respects, Autotools isn't the obvious choice. There are plenty of people ready to deride it as being arcane and difficult to understand, crufty, slow, old-fashioned, you name it. But despite that, Autotools has survived for a long, long time, and is in active use by lots and lots of projects. It must be doing something right! As we weighed up the options, Autotools came out ahead for us for the following reasons:

  • Autotools is familiar to end users. If you're used to building software on Linux systems, you know how to drive Autotools-based builds.

  • Autotools is portable. We know from experience that Autotools is very widely supported in the Linux world. It will work on everything from big servers to cross-compilation environments for embedded systems.

  • Autotools is very configurable. The split Autotools offers between configuration and build makes it easy to tweak a build for a new platform.

  • Autotools has a small dependency footprint. By and large, if you're building software on a Linux system, you have everything you need to support Autotools. There's no need to install a whole shedload of extra packages, or even worry about the specific versions of different tools. It (mostly) just works.
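To make the familiarity point concrete: every Autotools-based package is driven the same way, so users need no project-specific instructions. A typical sequence (the prefix, host triplet and staging directory here are illustrative, not ProL2TP specifics) looks like:

```shell
# Configure for the build machine, or cross-compile by naming a host triplet
./configure --prefix=/usr --host=arm-linux-gnueabihf
# Build, then install into a staging directory ready for packaging
make
make install DESTDIR=/tmp/staging
```

The same three steps work whether the target is a big server or an embedded board, which is a large part of Autotools' appeal.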

The porting process

Having decided to move to Autotools, we had to get on and actually do the work. Since we didn't have an Autotools guru on the team, there was a certain amount of work involved in getting up to speed with the system in the first place.

Here's where Autotools' fearsome reputation for complexity really started to bite! And it's true -- it is quite a complex system, and it does take a little while to learn. But it isn't impossible. Here's how we managed it.

Start from scratch

We started our new build with a clean slate. If at all possible I'd strongly recommend doing the same. Copying and pasting from other builds may seem like a quick way to get started, but it's important to know what each bit of the build does and why it's there. You can skimp on the initial learning, but you will eventually have to invest that time when you want to customise things later on.

Do some background reading

Autotools is a system of interacting components. If you're a "top-down" type of thinker like I am, you'll find it much easier to get to grips with Autotools if you can get an overview of all the moving parts before diving into the details.

I found that a good place to start is John Calcote's Autotools (you can read some draft content from the book here). I recommend ignoring the famous "goat book", at least to start off with. The goat book is showing its age, whereas Calcote's book is much more up-to-date.

In addition to the book there are some excellent resources around the web which I found very useful when getting started. I recommend taking a look at Flameeyes' great Autotools Mythbuster, as well as JezUK's fantastic step-by-step walkthrough on autoconfiscating the Arabica XML toolkit.

Solve one problem at a time

Once you know roughly how the Autotools work, don't immediately try to pull your entire build into automake/autoconf files all at once. Instead, break it down into pieces so you don't get overwhelmed.

When I was porting the ProL2TP build, for example, I started off by focusing on one component of the ProL2TP suite (prol2tpd), ignoring the others. I concentrated on the options the current Makefile supported, and figured out how to provide those options via autoconf. Once I'd done that, I began work on a minimal Makefile.am to build one of the simple sub-modules of prol2tpd.

By starting small you can incrementally create the new build by overcoming one manageable challenge at a time. This greatly reduces the Autotools learning curve, especially if you learn best by doing.
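As an illustration of just how small the starting point can be, a first-cut Autotools build for a single program needs only two short files. The file contents below are a sketch with illustrative names, not ProL2TP's real build files:

```
# configure.ac -- minimal sketch
AC_INIT([prol2tpd], [1.5])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am -- minimal sketch
sbin_PROGRAMS = prol2tpd
prol2tpd_SOURCES = main.c
```

Running "autoreconf --install" followed by "./configure && make" gives you a working build which you can then grow one feature at a time.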

Keep sanity testing your work

Build systems are just the same as any other code: they're much easier to work on if you have good unit tests in place. When porting a build you can save yourself a lot of time and trouble by working some checks and balances into your development process.

Some tactics I found useful included:

  • comparing gcc command lines from the old and new build to check compiler flags and invocation
  • leveraging "make dist" and "make distcheck" to ensure I'd fully captured the project tree in the Autotools build
  • scripting like-for-like installation and packaging comparisons on the outputs from the new and old systems
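
The third tactic can be scripted in a few lines of shell. This sketch builds two throwaway install trees so that it runs standalone; in real use you'd point it at the DESTDIR staging areas produced by the old and new builds (the paths and filenames below are invented for illustration):

```shell
#!/bin/sh
# Compare the file lists of two install trees and report any differences.
set -e

# Create two small example trees so the sketch is self-contained; in real
# use these would be staging directories from "make install DESTDIR=...".
old=$(mktemp -d); new=$(mktemp -d)
mkdir -p "$old/usr/sbin" "$new/usr/sbin" "$new/usr/share/man/man8"
touch "$old/usr/sbin/prol2tpd" "$new/usr/sbin/prol2tpd"
touch "$new/usr/share/man/man8/prol2tpd.8"

# List regular files relative to each tree root, sorted for a stable diff
( cd "$old" && find . -type f | sort ) > old-files.txt
( cd "$new" && find . -type f | sort ) > new-files.txt

# diff exits non-zero when the lists differ, so report rather than abort
if diff -u old-files.txt new-files.txt; then
    echo "install trees match"
else
    echo "install trees differ"
fi
```

A check like this catches a missing install rule the moment it is introduced, rather than at packaging time.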

By including these kinds of checks as you go, you should be able to transition to the new build with very little fuss, as most of the issues and regressions will be caught during development.

A brave GNU world... the good and the bad

Now that our new build is up and running in production, it is possible to look back on the port and draw out some pros and cons from the move. Was it worth the effort?

Without a doubt, the answer to that question is "yes!". We found that the new build reached feature-parity with the previous system early on in development, leaving us free to concentrate on more advanced features that help support our development process. It's now very easy for us to create distribution packages, with version numbers and changelog entries autogenerated from a single source. If we need developer debugging enabled in a package, it's not a problem. Source code releases are a snap with "make dist". Parallel builds work seamlessly. As one member of the team put it, the differences are like night and day!
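As a sketch of how a single-source version number can work (this is one common approach, not necessarily the exact mechanism we use): autoconf can shell out at regeneration time to read the version from one file, so the tarball name, "./configure --version" and packaging metadata all stay in sync automatically. The package name and VERSION file below are illustrative:

```
# configure.ac fragment -- derive the package version from a VERSION
# file in the source tree when the build system is regenerated.
AC_INIT([prol2tpd], [m4_esyscmd_s([cat VERSION])])
```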

This said, the new system isn't perfect. We still have a few rough patches in the build:

  • Nested builds. Early on in development I decided to treat the ProL2TP suite as a set of separate Autotools builds. This made logical sense because some of ProL2TP's components are shared with other projects. By splitting the build up, the shared components could be easily built as standalone modules. The flip side of this arrangement is two-fold. Firstly, each nested build has a configure script. On our development boxes the extra time taken to run these isn't noticeable, but it really adds up on smaller platforms. The second drawback is that we need some top-level configuration to tell each component about the tree layout in the configuration stage. This feels a bit messy, although it is effective in practice.

  • Autotools metadata is a pain. Most Autotools projects come with a pre-canned configure script out of the box, and this is what users are expecting to see. However, no developer works on the configure script directly: it is actually generated output from the autoconf tool. Managing this (and other bits of generated metadata) is a pain. You either keep the metadata out of SCM, in which case all the developers have to remember to recreate it in a fresh checkout, or check it in and ensure people remember to refresh it when they're working on the build system. Either approach is mildly annoying.

  • Autotools is all or nothing. Autotools gurus might dispute this, but we found it difficult to use "just a bit" of Autotools. Attempting to use just one component of the suite in isolation is rather like trying to pull one piece of string out of a tangled bag. You tend to get the whole ball of string whether you want it or not.

  • Autotools is prescriptive. Again, experts might argue otherwise, but our experience in porting ProL2TP suggests that Autotools as a whole tends to be prescriptive about how you should go about doing things. We had a couple of cases where we wanted to do something seemingly reasonable, but ended up fighting the tools to get it done. In some cases the friction we encountered was a subtle hint that there was a better way to do what we wanted, but in others, it seemed like we were having to jump through hoops for no real reason.
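For the curious, the nested-build arrangement described in the first point above is driven by autoconf's AC_CONFIG_SUBDIRS: the top-level configure script recursively runs the configure script in each listed directory, passing its own command-line arguments down. The component names here are illustrative:

```
# Top-level configure.ac fragment -- each listed directory carries a
# complete standalone Autotools build and is configured recursively.
AC_CONFIG_SUBDIRS([common prol2tpd])
```

This is what makes the standalone-module property easy to get, at the cost of the repeated configure runs described above.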

To conclude...

Our Autotools journey hasn't been without its challenges, but the end result is fantastic. Our build system is far more useful to us than it was before, is more familiar to our users, and is more portable across a range of systems.

Oddly, though, while Autotools has worked well for us, I struggle to give it a wholehearted recommendation. It can be frustrating and bizarre. It is certainly showing its age. It is widely misused, so there are many anti-patterns in existing projects to lead you astray. On the whole, it's a bit of a pig! But it is a pig which has survived a long time in an area of computer science which is evolving. Build systems in general seem to tend toward the hoary and unlovable over time, and there's no indication that newer and trendier systems will be any different.

So, what's the conclusion? "Autotools: give it a go, you'll tear your hair out, but you'll get the job done!"? It's not the kind of slogan any ad-man would thank you for, but on the whole I think it's just about right.