Autotools

Muad writes "John Calcote is a senior software engineer in Novell's Linux business who, after slogging up the steep learning curve the Autotools triad poses to those packaging software according to the portable GNU conventions for the first time, very kindly decided to make the experience easier for newcomers by sharing his years of experience and carefully crafted bag of tricks. His book is a welcome update to a field that has not seen a new entry in a full ten years, not since GNU Autoconf, Automake, and Libtool by Gary V. Vaughan, Ben Elliston, Tom Tromey, and Ian Lance Taylor hit the shelves. Unfortunately, the publishing industry is driven by the need to turn a profit to fund its endeavors, and specialist items like this book are not obvious candidates for volume selling, which makes No Starch Press' willingness to venture down this path all the more creditable." Keep reading for the rest of Federico's review.
Autotools: A Practitioner's Guide to GNU Autoconf, Automake, and Libtool
author: John Calcote
pages: 360
publisher: No Starch Press
rating: 8/10
reviewer: Federico Lucifredi
ISBN: 1593272065
summary: Teaches how to master the Autotools build system to maximize your software's portability
The book opens with John's experiences in adopting the Autotools, and quickly offers what is in my view a very important word of caution, one often lacking in the few tutorials I have seen on the Net: the Autotools are not simply a set of tools but, foremost, the encoded embodiment of a set of practices and expectations about the way software should be packaged the GNU way. While it is acceptable for beginners not to know what these expectations are, the right frame of mind in which to approach the Autotools is to focus on learning the way the Autotools operate, what they are trying to accomplish, and why. Attempting to use the Autotools without understanding the bigger picture will lead to a great deal of pain, as this is one of the toolsets most difficult to adapt for use apart from the policies it represents, so strongly are these conventions embedded in its fabric. With this understanding, it becomes possible to generate extensive configurations with a few lines of Autoconf or Automake; without it, the work very quickly becomes a battle to force a round peg into a square tool.
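To make the "few lines" claim concrete, here is a minimal complete configuration; this sketch is mine rather than an excerpt from the book, and the project name is invented:

    # configure.ac -- everything Autoconf needs for a minimal C project
    AC_INIT([hello], [1.0])
    AM_INIT_AUTOMAKE([foreign])   # "foreign": don't demand the GNU NEWS/ChangeLog files
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am -- the complete build description
    bin_PROGRAMS = hello
    hello_SOURCES = main.c

Running "autoreconf --install" then generates the familiar portable configure script and Makefile.in, and "./configure && make && make install" does the rest.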

John's style is more expansive and takes a longer path to the "technical meat" of the problem than the ten-year-old alternative, but in this reader's opinion it flows significantly better, as there is an underlying story, a thread that connects the bits of what is otherwise a pretty arid subject. For those masters of shell-fu, this book is a page-turner, while for mere mortals it is a good, approachable path into a difficult skill.

The book is structured around the packaging of two different projects: the first a simplified "Hello, World" project providing a digestible introduction to the processes and technology of the Autotools, the second the full-blown packaging of a complex, real-world project (the FLAIM high-performance database). This is a very good approach, breaking the theory into many practical examples and providing many ready-made bits from which the rest of us can start our own build configuration files. The result is a first half providing a gentler, streamlined introduction to the subject matter before the full jump into the gory details of the most complex possibilities the toolset offers. While John deliberately keeps away from the finest details that "may be subject to change" between minor releases of the tooling, which is doubtless good for both our scripts' and the book's shelf life, he does not shy away from very dense (and otherwise utterly undocumented) material, such as the use of M4 macros in Autoconf, something a colleague of mine once described to me as "one more reason I'd rather chew broken glass than deal with Autotools".

Assuming you have the requisite knowledge of Make, shell scripting (particularly Bash), and GCC essential to a developer, packager, maintainer, or buildmaster of a Linux, BSD, or *NIX project, or that you are on your way to acquiring those skills, this is a book that belongs on your shelf, right next to the RPM documentation. This is material for experts or experts in the making, but in my opinion you will find no better introduction to this complex subject. I had it on my wish list well before it was ever printed, and its presence on my desk caused several other developers in my office to order their own copies pretty much on the spot upon discovering its existence. Either as a learning tool for a skill you are trying to attain, or as a reference to turn to when faced with the complexities of this unique set of tools, this book is well worth its price tag.

I certainly hope this is not the last publication we see on the Autotools in this decade, but either way, it is a good start indeed - and my hope is that the publisher will refresh the title when an update is warranted, without waiting ten years!

Federico Lucifredi is the maintainer of man(1) and a Product Manager for the SUSE Linux Enterprise and openSUSE distributions.

You can purchase Autotools: A Practitioner's Guide to GNU Autoconf, Automake, and Libtool from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


Comments Filter:
  • by Anonymous Coward on Monday September 27, 2010 @03:17PM (#33715262)

    They have been.

    CMake, among others, has effectively replaced Autotools. It's FAR easier to deal with: cross-platform, fast, it generates Makefiles, Visual Studio solutions, and Xcode projects, and it supports testing and other things.

    There are some other ones around too, like SCons, but the point is: anyone starting a new project with Autotools now is a dolt or a masochist or both.

    Autotools is dead. Let's let it be buried in peace, please.

  • by Urban Garlic ( 447282 ) on Monday September 27, 2010 @03:54PM (#33715692)

    I often need to install software in an environment that's different from where it's going to be run, e.g. I install it on a file server, where the target directory is /export/apps/stow/, and then I use "stow" to put it in /export/apps, which clients mount as /usr/site, so they see "/usr/site/bin/", and are set up to look in /usr/site/lib for libraries, and so forth.

    I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine: they interoperate well with stow, accept the "--prefix" argument to configure (the workflow is sketched at the end of this comment), and behave correctly on the clients. CMake-based packages tend to hard-code path names into start-up scripts, which then break on the clients, which view the app in a different hierarchy -- they don't have /export, in particular.

    Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right, I honestly don't know. But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model.

    But it's not true. Lots of us still have centralized file servers that use NFS exports to make centrally-managed Linux applications available to many clients. The new tools make some things easier, but this, they make harder.

    Also, uphill through the snow both ways, and we liked it, get off my lawn, kids today don't know nothin', no respect I tell you.
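
    A sketch of the staged install described above (the package name foo-1.0 is invented; the paths follow the comment):

        ./configure --prefix=/usr/site                  # runtime prefix, as the clients see it
        make
        make prefix=/export/apps/stow/foo-1.0 install   # physical, server-side location
        cd /export/apps/stow && stow foo-1.0            # symlink into /export/apps (= /usr/site)

    Overriding prefix at install time only works cleanly when the package does not bake the prefix into files at build time; "make DESTDIR=... install" is the stricter staging alternative.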

  • by af1n ( 1031572 ) on Monday September 27, 2010 @03:54PM (#33715696) Homepage
    Poul-Henning Kamp of Varnish on the Autotools: "Did you call them autocrap tools? Yes, in fact I did, because they are the worst possible non-solution to a self-inflicted problem." Read more at: http://www.varnish-cache.org/docs/2.1/phk/autocrap.html [varnish-cache.org]
  • by Anonymous Coward on Monday September 27, 2010 @04:35PM (#33716186)

    Care to point out what new problems autotools creates? From my experience, autotools projects tend to work flawlessly while cmake ones tend to throw a lot of obscure errors that no one is able to figure out.

  • by Entrope ( 68843 ) on Monday September 27, 2010 @04:54PM (#33716388) Homepage

    The autotools suite requires that software developers keep revising things that worked before, because autotools has some new paradigm for some aspect of its operation every year or so.

    For example, one of my open source projects lets the user specify which extra modules should be compiled into the binary. (It doesn't use loadable modules because that was even more painful when we started out.) Over a span of about two years -- and I think three "minor" (1.x) releases of automake -- the approved way of conditionally linking object files changed twice. The changes were not documented, and nobody bothered to describe any way that would work across the several versions of automake that were in common use at the time. In contrast, doing the same thing with autoconf alone or with some non-autotools script is dead simple.

    autoconf has also gotten progressively fussier. For example, simple m4 macro invocations like FOO(bar) used to work fine. Now they often generate a warning that the argument is not quoted properly (illustrated after this comment). To pick one example of prior art, C solved that particular precedence problem about 35 years ago. There is no good reason for autoconf to make the software maintainer throw in all the []s that it wants.

    Those are just a few examples from my experience; others will have more.
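
    The quoting complaint above, illustrated as a hypothetical configure.ac fragment (both lines request the same check):

        AC_CHECK_HEADERS(stdlib.h)      # bare argument: the style the parent says now draws warnings
        AC_CHECK_HEADERS([stdlib.h])    # m4-quoted argument: the currently approved style

    The square brackets are m4's quote characters, which is why modern Autoconf wants every argument wrapped in them.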

  • by Kaz Kylheku ( 1484 ) on Monday September 27, 2010 @06:06PM (#33717076) Homepage

    [No book] can compensate for the low quality of this garbage, not to mention its poor performance (some builds spend more time running configure than actually compiling, installing, and tarring up the resulting run-time and devel packages!)

    These tools have to be redesigned from the ground up by someone who understands that free software has made large inroads into the embedded world where it needs to be, doh, cross-compiled.

    For my own projects, I develop a carefully-crafted configure shell script by hand and recommend everyone do the same.

    The script has carefully developed and tested support for (1) cross-compiling, (2) configuring/building in a separate directory, and (3) installation into a temporary package directory.

    Furthermore, this script should ONLY test the features that are actually needed by the program. (The program should assume a reasonably modern system; don't bother testing for obscure bugs in System V release 3, okay?) The script should make sure that it tests only header files and compiling with the toolchain that it is given. The configure test programs should never accidentally include something from the build machine's /usr/include, or link something from /usr/lib!

    Configure scripts should never run any program that they compile because it may be the wrong architecture, and they should not have an "if cross compiling" check which disables their features when cross-compiling and substitutes dumb defaults! Quite simply, implement the tests so that cross-compiling is not needed.

    For instance, instead of making a C program which outputs values with printf, arrange for the values to appear as initializers or array sizes in static data. Then use the cross-compiling toolchain's "nm" utility to extract the values from the compiled object file. You don't have to run a program to know how wide a (void *) is. Just compile "static char size_of_void_star[sizeof (void *)]" and interrogate the .o to discover the size of the data region named size_of_void_star (see the sketch at the end of this comment)!

    Look at the stupid configure script for GNU bash. When crossing, it assumes totally pessimistic values for all checks that can't be done when cross-compiling. Unless you explicitly override the ac_* variables yourself, you get a shell which has no job control! The bash build assumes that your kernel has a tty system dating back to the middle 1980's.

    Nobody should dig through a configure script to find out what broken tests they have to override by setting the variables manually.

    Shell code should never be generated by m4 macros, let alone ones which frequently change.

    Look at the stupidity of this. Suppose you want to produce a minimal patch for some open source project that uses autoconf. Suppose that as part of your patch you have to enhance the configure script with some new options. To do that, you have to modify configure.in. But projects ship with the generated configure script. So you want to regenerate that, right? Oops, your version of autoconf is different, and so you get a HUGE diff. Worse yet, the generated configure script breaks!

    So people end up with ten different versions of autoconf installed, so they can always run the right one which matches what the configure script was produced with that is in the tarball.

    Build configuration from scratch is not difficult. It is easy. Autotools do not simplify anything. They monstrously complicate something which is already simple and doesn't require simplification or automation. Using the Autotools is a false economy. You may think you are getting something for free and saving time, but in the end you will spend more time over the life of your project wrestling with this garbage (and also waste the time of countless other people you don't even know) than if you just carefully wrote a small configure script by hand.
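
    A sketch of the nm technique described above; this is my illustration rather than the commenter's exact code, and CC/NM default to the host tools, so point them at the cross toolchain:

        # conftest.c encodes the answer in an array size, so it never needs to run
        # (non-static here so the symbol is always kept in the object file)
        printf 'char size_of_void_star[sizeof (void *)];\n' > conftest.c
        ${CC:-cc} -c conftest.c -o conftest.o       # use the cross toolchain's compiler
        ${NM:-nm} -S conftest.o | grep size_of_void_star
        # prints e.g. "0000000000000000 0000000000000008 B size_of_void_star":
        # the size field says 8 bytes, so the target's void * is 64 bits wide
        rm -f conftest.c conftest.o

    The same pattern extracts numeric values generally: encode them in static data, compile with the target toolchain, and read them back out of the object file.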

  • by Anonymous Coward on Monday September 27, 2010 @06:34PM (#33717328)

    "only working shell+make+gcc should be needed to actually build software."

    You've identified the key dividing line between Autotools fans and detractors. If you absolutely must limit your dependencies to "shell+make+gcc", Autotools may be the best tool for the job. I just don't understand why anybody chooses to limit themselves that way. It seems like all pain and no gain.

  • by JanneM ( 7445 ) on Monday September 27, 2010 @06:44PM (#33717382) Homepage

    I've just recently been in the situation of selecting a build system for a project with an existing codebase. I looked at the obvious alternatives, including cmake.

    In the end, I chose autotools.

    When you're doing a non-trivial project, cmake is no less complicated than autoconf and automake: if your build is complex, you have to deal with that complexity somewhere, after all. And there are a lot more and better resources around for using autotools than for cmake, for figuring out odd corner cases. If you have a somewhat odd build requirement, chances are somebody else has already solved it using autotools.

    From my experience so far, most of what people dislike about the autotools comes from Automake. But Automake is of course completely optional, and Autoconf, which provides most of the benefits, was made to stand alone. If you have a system with existing makefiles, it makes a lot of sense to simply use Autoconf to configure the app and the makefiles and leave Automake out of it (sketched after this comment).

    This is a lengthy but really illuminating document on using the autotools that specifically covers using autoconf alone and adapting an existing project: http://www.freesoftwaremagazine.com/books/agaal/brief_introduction_to_gnu_autotools/ [freesoftwaremagazine.com]
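
    A sketch of the Autoconf-without-Automake arrangement described above (file contents are invented for illustration):

        # configure.ac
        AC_INIT([myapp], [0.1])
        AC_PROG_CC
        AC_CONFIG_FILES([Makefile])
        AC_OUTPUT

        # Makefile.in -- the existing makefile, with configure's substitutions added
        CC = @CC@
        CFLAGS = @CFLAGS@
        prefix = @prefix@

        myapp: main.o
        	$(CC) $(CFLAGS) -o $@ main.o

    Running ./configure rewrites Makefile.in into Makefile, filling in the @...@ values it detected; everything else in the makefile stays exactly as it was.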

  • by Anonymous Coward on Monday September 27, 2010 @06:48PM (#33717410)

    "I don't really have that problem with python tools as python's virtual environment tools seem to handle things nicely. Tools such as pip natively handle virtual environments, automatically installing into it when one is active."

    Which is another aspect of the very same problem. So their solution, having ignored the need to segregate feature development from bug fixing and to treat API stability as an almost sacred cow, is to "reinvent" the statically linked environment?

    Try to install two disparate tools based on the same shared environment and welcome to the fun of app A asking for 'foo 1.2.3' and app B needing 'foo 1.2.4', so you end up producing a virtual environment for each and every app you install. Good luck then with tracking security advisories for "foo" (and all the other dependencies of each installed app). Good luck finding an upgrade path that fixes known security issues without breaking functionality for any of your installed apps.

    But, but, but... you should use the very bleeding edge, the developer of App A will say, without paying attention to the fact that you manage 2000 machines with 150 different main apps from different developers and dates, disregarding even that each of those developers will define "bleeding edge" as "whatever happens to run on my development box, regardless of whether it's really needed, stable, or will be supported by my dependencies' developers".

    Ahhh, youngsters... now, you get off my lawn!

  • by macshit ( 157376 ) <snogglethorpe@NOsPAM.gmail.com> on Tuesday September 28, 2010 @03:56AM (#33720318) Homepage

    I agree -- autoconf is independent, and does a great job handling system configuration stuff without involving automake -- but I think you're being a bit unfair to automake.

    For projects that "fit" automake, it's actually a wonderful tool, as it allows a highly concise description of the package contents and dependencies (see the sketch after this comment), with almost zero fat and overhead, and does pretty much all the typical boilerplate stuff (convenience targets, separate build-directory support, installation, automatic dependency generation, consistency checking, etc) automatically without the user ever having to see the ugly internals. However automake also imposes a degree of structure on a project (not surprisingly, roughly following that of many GNU packages), and is not so flexible if you want something different. For packages that don't follow this structure, or which make particularly complex demands on the build system, automake may not fit very well, and may end up just getting in the way. [It's not entirely inflexible though -- it does try to provide for customization to some degree, and remember, it's essentially a wrapper around makefiles, and will pass through traditional makefile rules largely unscathed.]

    I'd highly recommend trying automake first though, just to see if it works for you, because when it does work, it's really nice.

    From what I've been able to figure out, most people who say they hate the autotools aren't realllly griping about the actual functionality, but rather expressing their distaste at the really grody implementation. It's very true that the implementation of autotools (a bizarre mixture of m4, Bourne shell, Perl, make, etc., all of which are even more ugly than necessary in order to remain portable) is not for the squeamish. However a user of autotools doesn't have to care about the implementation details for the most part -- and from a user's point of view it actually works quite well, and has some very nice features.
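
    A taste of the conciseness being praised, as a hypothetical Makefile.am for a small project:

        bin_PROGRAMS = frob
        frob_SOURCES = main.c util.c util.h
        dist_man_MANS = frob.1

    From those three lines automake derives the compile and link rules, "make install", "make dist" and "make distcheck", automatic dependency tracking, and separate-build-directory support.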
