Autotools

Muad writes "John Calcote is a senior software engineer in Novell's Linux business who, after slogging up the steep learning curve the Autotools triad poses to those packaging software according to the portable GNU conventions for the first time, very kindly decided to make the experience easier for newcomers by sharing his years of experience and his carefully crafted bag of tricks. His book is a welcome update to a field that has seen no new entries in a full ten years, since GNU Autoconf, Automake, and Libtool by Gary V. Vaughan, Ben Elliston, Tom Tromey, and Ian Lance Taylor hit the shelves. Unfortunately, the publishing industry is driven by the need to turn a profit to fund its endeavors, and specialist items like this book are not obvious candidates for volume selling - which is a credit to No Starch Press' willingness to venture down this path." Keep reading for the rest of Federico's review.
Autotools: A practitioner's guide to GNU Autoconf, Automake, and Libtool
author: John Calcote
pages: 360
publisher: No Starch Press
rating: 8/10
reviewer: Federico Lucifredi
ISBN: 1593272065
summary: Teaches how to master the Autotools build system to maximize your software
The book opens with John's experiences in adopting the Autotools, and quickly offers what is in my view a very important word of caution that is often lacking in the few tutorials I have seen on the Net: the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way. While it is acceptable for beginners not to know what these expectations are, the right frame of mind to approach the Autotools is to focus on learning the way the Autotools operate, what they are trying to accomplish, and why. Attempting to use the Autotools without understanding the bigger picture will lead to a great deal of pain, as they form one of the toolsets most difficult to adapt for use separate from the policies they represent, so strongly are these conventions embedded in their fabric. With this understanding, it becomes possible to generate extensive configurations with a few lines of Autoconf or Automake - without it, the exercise very quickly becomes a battle to force a round peg into a square tool.

John's style is more expansive and takes a longer path to the "technical meat" of the problem than the 10-year-old alternative, but in this reader's opinion it flows significantly better, as there is an underlying story, a thread that connects the bits of what is otherwise a pretty arid subject. For those masters of shell-fu, this book is a page-turner, while for mere mortals it is a good, approachable path into a difficult skill.

The book is structured around the packaging of two different projects: the first is a simplified "Hello, World" project that provides a digestible introduction to the processes and technology of the Autotools, while the second represents the full-blown packaging of a complex, real-world project (the FLAIM high-performance database). This is a very good approach, breaking the theory into many practical examples and providing many ready-made bits from which the rest of us can start our own build configuration files. The result is a first half providing a gentler, streamlined introduction to the subject matter, before the full jump into the gory details of the most complex possibilities the toolset offers. While it must be noted that John tries to keep away from those finest details which "may be subject to change" between minor releases of the tooling, which is doubtless good for both our scripts' and the book's shelf life, it must be observed that he does not shy away from very dense (and otherwise utterly undocumented) material, such as the use of M4 macros in Autoconf, something a colleague of mine once described to me as "the one more reason I'd rather chew broken glass than deal with Autotools".

Assuming you have the requisite knowledge of Make, shell scripting (particularly Bash), and GCC that is essential to a developer, packager, maintainer, or buildmaster of a Linux, BSD, or *NIX project, or that you are on your way to achieving those skills, this is a book that belongs on your shelf, right next to the RPM documentation. This is material for experts or experts in the making, but in my opinion you will find no better introduction to this complex subject. I had it on my wish list well before it was ever printed, and its presence on my desk caused several other developers in my office to order their copies pretty much on the spot upon learning of its existence. Either as a learning tool for a skill you are trying to attain, or as a reference to turn to when faced with the complexities of this unique set of tools, this book is well worth its price tag.

I certainly hope this is not the last publication we see on the Autotools in this decade, but either way, it is a good start indeed - and my hope is that the publisher will refresh the title when an update is warranted, without waiting ten years!

Federico Lucifredi is the maintainer of man (1) and a Product Manager for the SUSE Linux Enterprise and openSUSE distributions.

You can purchase Autotools: A practitioner's guide to GNU Autoconf, Automake, and Libtool from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

  • by koinu ( 472851 ) on Monday September 27, 2010 @02:13PM (#33715174)

    ... they should be replaced by something else.

    • by Dr. Sp0ng ( 24354 ) <mspong@ g m a il.com> on Monday September 27, 2010 @02:14PM (#33715186) Homepage
      Here you go [cmake.org].
      • by Urban Garlic ( 447282 ) on Monday September 27, 2010 @02:54PM (#33715692)

        I often need to install software in an environment that's different from where it's going to be run, e.g. I install it on a file server, where the target directory is /export/apps/stow/, and then I use "stow" to put it in /export/apps, which clients mount as /usr/site, so they see "/usr/site/bin/", and are set up to look in /usr/site/lib for libraries, and so forth.

        I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine: they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients. Cmake-based packages tend to hard-code path names into start-up scripts, which then break on the clients, which view the app in a different hierarchy -- they don't have /export, in particular.

        Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right, I honestly don't know. But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model.

        But it's not true. Lots of us still have centralized file servers that use NFS exports to make centrally-managed Linux applications available to many clients. The new tools make some things easier, but this, they make harder.

        Also, uphill through the snow both ways, and we liked it, get off my lawn, kids today don't know nothin', no respect I tell you.

        • by GooberToo ( 74388 ) on Monday September 27, 2010 @03:06PM (#33715828)

          I've run into the same problem with cmake. I don't really have that problem with python tools as python's virtual environment tools [python.org] seem to handle things nicely. Tools such as pip [python.org] natively handle virtual environments, automatically installing into it when one is active.

          Also, there are lots of nice wrappers [python.org] to work with python's tools, for developers, such as gogo [bitbucket.org].

          • by Anonymous Coward on Monday September 27, 2010 @05:48PM (#33717410)

            "I don't really have that problem with python tools as python's virtual environment tools seem to handle things nicely. Tools such as pip natively handle virtual environments, automatically installing into it when one is active."

            Which is another aspect of the very same problem. So their solution to ignoring that they should segregate feature development from bug fixing, and that they should treat API stability as an almost sacred cow, is "reinventing" the statically linked environment?

            Try to install two disparate tools based on the same shared environment and welcome to the fun of app A asking for 'foo 1.2.3' and app B needing 'foo 1.2.4', so you end up producing a virtual environment for each and every app you install. Good luck then with tracking security advisories for "foo" (and all the other dependencies of each installed app). Good luck finding an upgrade path that fixes known security issues without breaking functionality for any of your installed apps.

            But, but, but... you should use the very bleeding edge, the developer of App A will say, without paying attention to the fact that you manage 2000 machines with 150 different main apps from different developers and dates, disregarding even that each of those developers will define "bleeding edge" as "whatever happens to run in my development box, regardless of whether it's really needed, or stable, or will be supported by my dependencies' developers".

            Ahhh, youngsters... now, you get off my lawn!

        • by Abcd1234 ( 188840 ) on Monday September 27, 2010 @03:11PM (#33715892) Homepage

          I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine, they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients.

          Not true. Not true at all.

          I used stow for many many years. And it's true, a lot of packages just worked. You'd do a "configure --prefix /run/time/target", then "make install prefix=/install/time/target", and it would work. But this is *hardly* universal. Many many apps fail due to install-time relinking, which means you have to hack libtool and the build a bit in order to get them to install. It's doable, certainly, but it's hardly automatic.

          Now, I haven't had to deal with cmake-based systems recently, so it may be that they're even more broken than this. But it's never been completely trivial to build for one runtime path, while installing to another.

          'course, as a user on an isolated workstation, I finally just moved away from stow, and manage my custom-built packages using checkinstall. But that's obviously no solution when you're managing applications on an NFS share.

        • Have you tried to define CMAKE_INSTALL_PREFIX [cmake.org], CMAKE_INSTALL_RPATH [cmake.org] / CMAKE_SKIP_RPATH [cmake.org], etc when configuring the software you are trying to build? I'd say they pretty much fix whatever problems you are running into. BTW, CMake also supports "make install DESTDIR=/whatever"
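As a sketch of how the variables mentioned above would be passed at configure time (the paths and project layout here are purely illustrative, not taken from the comment):

```shell
# Hypothetical out-of-source CMake configure with an explicit install
# prefix and rpath handling disabled; adjust paths for your layout.
cmake -DCMAKE_INSTALL_PREFIX=/usr/site \
      -DCMAKE_SKIP_RPATH=ON \
      /path/to/source
make
make install DESTDIR=/export/apps/stow/foo-1.0   # staged install
```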
        • by coats ( 1068 ) on Monday September 27, 2010 @04:22PM (#33716688) Homepage
          And there's worse -- supporting multiple simultaneous build targets. Most of my stuff I build {optimized, debug, optimized-profiling} x {gfortran/gcc, g95/gcc, sunf95/suncc, ifort/icc} for a set of 12 simultaneous build-targets. Conventional build systems do not support multiple simultaneous build-targets well.
        • by Kaz Kylheku ( 1484 ) on Monday September 27, 2010 @05:17PM (#33717172) Homepage

          Hi Urban Garlic,

          The most important thing to know is that the --prefix argument (for a correctly designed configure script which follows the conventions) indicates the run-time installation directory of the program. I.e. the path given to --prefix may actually be compiled into the program and then used by that program at run-time.

          The average free software user compiles the program on the same machine where he will run it. And so the prefix is also the place where the program is copied during "make install".

          The default value for --prefix is often /usr/local, for this specific use case.

          If you're building a distro or toolchain, you may have to override --prefix to /usr. (Assuming that the programs you are making will go into /usr/bin, /usr/lib, etc on the target system).

          The point is that the prefix has to be correct for the ultimate destination where the software lives.

          Of course, when you're preparing a package for another system (perhaps a cross-compiled package, even), and the prefix is /usr, of course you can't do "make install" because that will try to copy into /usr on the local build system! You must install into some temporary directory (or a "sysroot" directory where you are building up the filesystem image for a target). The way this is done varies from package to package. Some packages accept an extra configure parameter, like --install_dir. Many packages accept a make variable called DESTDIR which is specified on the "make install" command line; the value of DESTDIR is prepended to the prefix. In some cases you can override the prefix variable itself at install time "make install prefix=...".
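As a toy illustration of the DESTDIR convention described above (the Makefile here is hand-written for brevity; in a real package it would be generated by configure):

```shell
# prefix is the run-time location compiled into the package;
# DESTDIR is a staging root prepended only at install time.
mkdir -p /tmp/destdir-demo && cd /tmp/destdir-demo
printf 'prefix = /usr\ninstall:\n\tmkdir -p $(DESTDIR)$(prefix)/bin\n\techo demo > $(DESTDIR)$(prefix)/bin/hello\n' > Makefile
make install DESTDIR=/tmp/stage
ls /tmp/stage/usr/bin/hello
```

The file ends up under /tmp/stage/usr/bin, while the program itself still believes it lives under /usr.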

          • by chrb ( 1083577 ) on Monday September 27, 2010 @05:45PM (#33717400)

            You must install into some temporary directory

            I'm sure he knows this since he is already using Stow [gnu.org]. Stow works pretty well for having multiple versions of different software packages built from source and installed simultaneously, and having a proper package management system for it all. Though these days I use Checkinstall [asic-linux.com.mx] - having the final package as a .deb or .rpm makes it a bit easier to distribute the built packages.

        • by Anonymous Coward on Monday September 27, 2010 @05:29PM (#33717298)

          "Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right"

          The review of the book makes a point about this: "the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way".

          It's not that cmake can or can't be properly used to provide a platform-independent pre-build environment; it's that it lacks the greybeard experience. As the old motto says, "those who don't understand UNIX are condemned to reinvent it, poorly". Unix-zen, once "the cool wave", is about 40 years old now. New generations feel they know better than the old farts (and many times, they are right) and that they can do it better and simpler... just because they are not (yet) aware of the whole landscape and its corner-cases. So, yes, they might re-implement the old farts' tools better and simpler (after all, they have the advantage of building on top of the shoulders of giants, so to say) but, at the same time, they are condemned to fail on the same stepping stones their elders managed to work around, just because they chose to ignore them, and then add their own package of failures.

          "But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model."

          The young are always proud and think they know better than anybody else. They just produce tools for what they know, and they choose to ignore everything else. No wonder they fail on things that would seem trivial to older people who already had to fight with them years ago.

          This is the generation of 'reinventing the wheel': they are doing better in some aspects and fail heavily on things that should be known by now.

          • by Anonymous Coward on Monday September 27, 2010 @05:59PM (#33717516)

            because they are not (yet) aware of the whole landscape and its corner-cases

            Youngsters can undervalue knowledge of "the whole landscape and its corner-cases", but old farts can also overvalue it. The beauty of new tools like CMake is that they can leave the past behind, and stop worrying about corner cases on obsolete platforms. At one time I was an absolute expert on MSDOS and Windows 3.1. I leave that off my resume now because it no longer has any practical value, and it just makes me look old :-(

      • by TheRaven64 ( 641858 ) on Monday September 27, 2010 @05:10PM (#33717110) Journal
        I've looked at cmake, and it seems like a really nice solution. When I work out what the corresponding problem is, I have no doubt I will use it.
        • by oiron ( 697563 ) on Tuesday September 28, 2010 @12:59AM (#33719856) Homepage

          Actually building on multiple platforms without maintaining separate build files for each is the problem...

          CMake was created to build Kitware's other products, most notably VTK and ITK. To date, I've built both, and other things built on top of them on three platforms, with several variations: GCC on Linux, both 32bit and 64bit, MinGW and Visual C on Windows. I don't need to install anything else apart from CMake and the compiler (and associated Make package) on each of those platforms, run it once, and then run make. On the other hand, just you try running autotools on Windows...

          The other option, which I see a lot of projects using, is to have multiple build systems - an autotools one for Unices, and a .sln/.vcproj set for Windows. I don't think I need to point out the fun in maintaining that...
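By way of contrast, the single cross-platform build file described above can be as small as this minimal CMakeLists.txt sketch (project and file names are made up for illustration):

```cmake
cmake_minimum_required(VERSION 2.8)
project(hello C)
add_executable(hello main.c)
install(TARGETS hello DESTINATION bin)
```

The same file drives Makefile generation on Unix and Visual Studio solution generation on Windows.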

          • by b4dc0d3r ( 1268512 ) on Tuesday September 28, 2010 @03:02PM (#33727514)

            In most cases, the MSVS project files are a courtesy for the people who are probably going to be using MSVS. I remember piles of 1990s open-source projects which ran on Windows, and everyone was asking for the project files. What is this other compiler you speak of? Cygwin isn't a compiler, it's a fake Unix environment; you can't make a Windows program without MSVS, they say. Plus, whoever is doing the Windows port probably uses MSVS anyway.

            For a true open-source solution, they would provide the command-line build. That way people can use the free version of msvs if they want. But for debugging and patching, and especially if I'm porting, I want to use MSVS because it's what I'm familiar with and it works.

            Before .NET, I used to call MSVS the only decent software MS made, with SQL a close second. Not that they were perfect, but they were developed for coders by coders. Then it took 5 years for MSVS .Net to mature, and it's only getting better. But it's the standard, and the guy porting probably uses it, so why not include it.

            • by oiron ( 697563 ) on Wednesday September 29, 2010 @01:50AM (#33731562) Homepage

              The other compiler is MinGW32 [mingw.org] - there are others, like Borland C, and ICC...

              I think CMake supports them all.

              There are still piles of projects running on Windows. Right now, I'm on Windows (work machine), with KDE 4.4, Inkscape, Gimp, VTK and ITK among other things installed. Remember, lots of devs work on Windows, and quite a large body of users too. Not to mention other environments, like embedded systems which might or might not be able to work with full autotools...

    • by StripedCow ( 776465 ) on Monday September 27, 2010 @02:16PM (#33715238)

      Indeed, but a book can make it easier to develop such a thing.

    • by Anonymous Coward on Monday September 27, 2010 @02:17PM (#33715262)

      They have been.

      CMake, among others, has effectively replaced autotools. It's FAR easier to deal with, cross-platform, fast, will build makefiles, Visual Studio solutions, and Xcode projects, and supports testing and other things.

      There are some other ones around too like Scons, but the point is, anyone starting a new project now with autotools is a dolt or a masochist or both.

      Autotools is dead. Let's let it be buried in peace, please.

      • by samjam ( 256347 ) on Monday September 27, 2010 @03:24PM (#33716052) Homepage Journal

        If that were true, you wouldn't have needed to say it.

      • by JanneM ( 7445 ) on Monday September 27, 2010 @05:44PM (#33717382) Homepage

        I've just recently been in the situation of selecting a build system for a project with an existing codebase. I looked at the obvious alternatives, including cmake.

        In the end, I chose autotools.

        When you're doing a non-trivial project, cmake is not any less complicated than autoconf and automake - if your build is complex, you have to deal with that complexity somewhere, after all. And there are a lot more, and better, resources around for using autotools than for cmake, for figuring out odd corner cases. If you have a somewhat odd build requirement, chances are somebody else has already solved it using autotools.

        From my experience so far, most of what people dislike about using autotools comes from Automake. But Automake is, of course, completely optional, and Autoconf - which provides most of the benefits - was made to be standalone. If you have a system with existing makefiles, it makes a lot of sense to simply use Autoconf to configure the app and the makefiles and leave Automake alone.
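A minimal configure.ac for that Autoconf-only approach might look something like this (the project name and the particular checks are illustrative, not from the comment):

```m4
AC_INIT([myapp], [1.0])
AC_PROG_CC
AC_CHECK_HEADERS([unistd.h])
dnl Substitute @CC@, @CFLAGS@, etc. into a hand-written Makefile.in
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

No Automake involved: the Makefile.in is written by hand, and configure only fills in the discovered values.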

        This is a lengthy but really illuminating document on using the autotools, that specifically goes through using autoconf alone and on how to adapt an existing project: http://www.freesoftwaremagazine.com/books/agaal/brief_introduction_to_gnu_autotools/ [freesoftwaremagazine.com]

        • by macshit ( 157376 ) <snogglethorpeNO@SPAMgmail.com> on Tuesday September 28, 2010 @02:56AM (#33720318) Homepage

          I agree -- autoconf is independent, and does a great job handling system configuration stuff without involving automake -- but I think you're being a bit unfair to automake.

          For projects that "fit" automake, it's actually a wonderful tool, as it allows a highly concise description of the package contents and dependencies, with almost zero fat and overhead, and does pretty much all the typical boilerplate stuff (convenience targets, separate build-directory support, installation, automatic dependency generation, consistency checking, etc) automatically without the user ever having to see the ugly internals. However automake also imposes a degree of structure on a project (not surprisingly, roughly following that of many GNU packages), and is not so flexible if you want something different. For packages that don't follow this structure, or which make particularly complex demands on the build system, automake may not fit very well, and may end up just getting in the way. [It's not entirely inflexible though -- it does try to provide for customization to some degree, and remember, it's essentially a wrapper around makefiles, and will pass through traditional makefile rules largely unscathed.]
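As an illustration of that conciseness, for a project that fits the mold, a complete Makefile.am can be as short as this (names are hypothetical):

```make
bin_PROGRAMS = myapp
myapp_SOURCES = main.c util.c util.h
```

Everything else (dependency tracking, install targets, dist targets, VPATH builds) is generated.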

          I'd highly recommend trying automake first though, just to see if it works for you, because when it does work, it's really nice.

          From what I've been able to figure out, most people who say they hate the autotools aren't really griping about the actual functionality, but rather expressing their distaste at the really grody implementation. It's very true that the implementation of autotools (a bizarre mixture of m4, bourne-shell, perl, make, etc., all of which are even more ugly than necessary in order to remain portable) is not for the squeamish. However, a user of autotools doesn't have to care about the implementation details for the most part -- and from a user's point of view it actually works quite well, and has some very nice features.

    • by bogaboga ( 793279 ) on Monday September 27, 2010 @02:44PM (#33715560)

      decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks.

      Even better would be reading that this gentleman had gotten behind efforts to make working with the tools easier. Simply teaching me tricks, though welcome, is not good enough. Working with the tool(s) is still difficult.

    • by Anonymous Coward on Monday September 27, 2010 @03:47PM (#33716324)

      Oh but I might still buy a copy. Just to wipe my @ss with. I can't begin to think of the hours I've wasted debugging build failures of this heap of cr@p.

    • by avgapon ( 1851536 ) on Monday September 27, 2010 @04:11PM (#33716570)
      Why invent Makefile writing scripts or even programs when make and Makefiles can easily do all that is required for cross-platform (and cross-target) compilation? http://sourceforge.net/projects/mk-configure/ [sourceforge.net]
    • by CarpetShark ( 865376 ) on Monday September 27, 2010 @06:12PM (#33717614)

      My thoughts exactly. Autotools is/are abominations.

  • by Vintermann ( 400722 ) on Monday September 27, 2010 @02:16PM (#33715230) Homepage

    I suppose it's nice that someone writes a book like this, since a lot of existing projects use autotools (or more commonly, try to by means of copy/paste and cargo-cult based build scripting).

    But autotools should really be phased out. It solves a lot of problems that aren't problems anymore, and makes a helluva lot of new ones in the process. There are a lot of up-and-coming build systems to challenge it, and then there's CMake, which is an OK compromise between those and practicality.

    • by Anonymous Coward on Monday September 27, 2010 @03:35PM (#33716186)

      Care to point out what new problems autotools creates? From my experience, autotools projects tend to work flawlessly while cmake ones tend to throw a lot of obscure errors that no one is able to figure out.

      • by Entrope ( 68843 ) on Monday September 27, 2010 @03:54PM (#33716388) Homepage

        The autotools suite requires that software developers keep revising things that worked before, because autotools has some new paradigm for some aspect of its operation every year or so.

        For example, one of my open source projects lets the user specify which extra modules should be compiled into the binary. (It doesn't use loadable modules because that was even more painful when we started out.) Over a span of about two years -- and I think three "minor" (1.x) releases of automake -- the approved way of conditionally linking object files changed twice. The changes were not documented, and nobody bothered to describe any way that would work across the several versions of automake that were in common use at the time. In contrast, doing the same thing with autoconf alone or with some non-autotools script is dead simple.

        autoconf has also gotten progressively fussier. For example, simple m4 macro invocations like FOO(bar) used to work fine. Now they often generate a warning that the argument is not quoted properly. To pick one example of prior art, C solved that particular precedence problem about 35 years ago. There is no good reason for autoconf to make the software maintainer throw in all the []s that it wants.
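The quoting change described above looks roughly like this in a configure.ac (the macro argument is illustrative):

```m4
AC_CHECK_FUNCS(strlcpy)    dnl older style: bare argument used to be fine
AC_CHECK_FUNCS([strlcpy])  dnl what modern autoconf expects: m4-quoted
```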

        Those are just a few examples from my experience; others will have more.

      • by DiegoBravo ( 324012 ) on Monday September 27, 2010 @04:10PM (#33716562) Journal

        > Care to point out what new problems autotools creates?

        Builds that take half an hour just to "configure", checking for the existence of things like strcpy(), but which anyway fail at compile or link time because of a missing symbol in an upgraded dynamic library?

        • Builds that take half an hour just to "configure", checking for the existence of things like strcpy(), but which anyway fail at compile or link time because of a missing symbol in an upgraded dynamic library?

          Is there ever a situation in which configure couldn't cache the "yes" answers and reuse them between programs? That's what I find most frustrating about autoconf.

          I can understand not caching "no" answers long-term. For example, you try to configure package Foo and find that it depends on package Bar. Once you've installed Bar, you don't want configure to remember that Bar wasn't installed last time.

          However, once "checking for bar.h... yes" comes up, is there ever a situation (short of uninstalling Bar) when you don't want that answer to be cached? I could happily imagine a case where configure always uses a specific host-wide cache file by default. Add some logic to "pkg_delete" or "aptitude remove" to delete that cache file, so that removing an installed package would invalidate the cache, but otherwise leave it intact. Is there a corner case that could cause problems with this arrangement? On the FreeBSD systems I administer, I'd be willing to bet that half the time spent in a typical software upgrade is spent waiting for configure to decide if stdlib.h is still installed.
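Autoconf does offer pieces of this: "./configure -C" keeps per-package answers in config.cache, and a site-wide config.site file can preseed answers across packages. A sketch of such a file (the variable names follow autoconf's ac_cv_* cache namespace; the exact names vary by check and version, so treat these as illustrative):

```shell
# $prefix/share/config.site -- sourced by autoconf-generated
# configure scripts; preseeds cache answers host-wide.
ac_cv_header_stdlib_h=yes
ac_cv_func_malloc_0_nonnull=yes
```

Wiring cache invalidation into the package manager, as proposed above, would still be up to the administrator.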

      • by onionman ( 975962 ) on Monday September 27, 2010 @05:22PM (#33717222)

        Care to point out what new problems autotools creates?

        The fact that I can't use them on Windows. Seriously, I can't see how we can call a build system "cross platform" when it doesn't cooperate with the most widely deployed build environment.

        It seems to me that it might be possible for a Python-based build system to have enough platform independence that it could build projects on Linux, Mac, and Windows. SCons looks like it's coming along. I'd really like to see more developers get behind it, but on the open source projects that I've worked on many of the developers are antagonistic toward any OS other than Linux.

        • by NJRoadfan ( 1254248 ) on Tuesday September 28, 2010 @06:34AM (#33721010)
          Cygwin and Mingw32 support automake, but I haven't had much luck with them. VS.NET isn't very pretty either if you need external libraries. Sometimes setting up the project file can be a headache, but at least I can get projects to compile.
        • by Elbows ( 208758 ) on Tuesday September 28, 2010 @10:51AM (#33723302)

          I use SCons at work for a project that builds on Windows, Mac, and Linux, and we even supported IRIX for a while. Our build is pretty complicated, including some code-generation steps where we build a program, run it, and generate new source files based on the output, and SCons works pretty smoothly. It's definitely lacking in a few areas, but lately it's been quite stable despite adding new features fairly quickly.

          We do have a lot of "if windows... elif macos" kind of code. But I don't know if any system can entirely get rid of those. There are some platform-specific concepts that are really hard to abstract in a platform-independent way (e.g. side-by-side assembly nonsense on windows, or the weird handling of shared library paths on Mac).

          The biggest downside I see to SCons is that your build system can easily get very complex, and then you have to debug and maintain all that code.

      • by MtHuurne ( 602934 ) on Tuesday September 28, 2010 @11:37AM (#33724038) Homepage

        One problem is that a configure script prints lots of things to stdout. The few relevant lines are drowned in pages of irrelevant ones. For example, I once built SDL_image and later a program that links to it, and that program's attempts to load PNG files failed. It turns out that the configure script of SDL_image failed to link with libpng and said so, but I didn't catch that message because it was buried among hundreds of useless lines.

        I didn't expect SDL_image to build without PNG support, since I explicitly told configure to include PNG loading (--enable-png). Still it decided not to include the feature, since it thought libpng was unavailable. I think it's reasonable that if I request something explicitly and it cannot be delivered, the result should be an error, not silent omission of the requested feature. I don't know if this behavior is built into autoconf or is just a practice associated with autoconf, but I have seen it happen in many different packages.
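
        The fail-hard behavior requested above is easy to express in autoconf. A hedged sketch (this is not SDL_image's actual configure.ac; the "have_png" variable and the libpng check here are illustrative):

```m4
dnl Sketch: turn an explicitly requested but unsatisfiable feature into a
dnl hard error instead of silently dropping it. "have_png" is set by an
dnl ordinary library check; the names are illustrative, not SDL_image's.
AC_ARG_ENABLE([png],
  [AS_HELP_STRING([--enable-png], [support loading PNG images])],
  [], [enable_png=auto])

AS_IF([test "x$enable_png" != xno],
  [AC_CHECK_LIB([png], [png_create_read_struct],
    [have_png=yes], [have_png=no])])

AS_IF([test "x$enable_png" = xyes && test "x$have_png" = xno],
  [AC_MSG_ERROR([--enable-png was given, but libpng was not found])])
```

        With "auto" as the default, the feature is still silently skipped when merely absent; only an explicit --enable-png turns a missing libpng into a configure-time error.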

        Another problem is that a lot of unnecessary checks are done. It seems half the packages out there check for a Fortran compiler while they don't contain a single line of Fortran code. Checks are also made for features that are present on every system produced in the last 10 years; in most cases compilation will fail anyway if the feature is absent, since the result of the configure check is ignored by the actual code. The unnecessary checks both add to the irrelevant output and slow down the build.

        Because configure is a shell script, it runs its checks sequentially. So even if you have a quad core system, only one core is being put to use. For compilation Make is used, which is given a dependency graph that allows it to run certain tasks in parallel. It would make sense to do the configuration step in Make as well, but that is not how autoconf works.

        Because autotools generate files, if an error occurs for example while running the configure script, you have to manually trace it back to the configure.ac or to one of its include files. This is a generic problem when generating code, but in autotools it is made worse by the use of a macro expander instead of a compiler: many errors that could be caught by a compiler will be forwarded to the output by a macro expander. And in the case of automake, there are two layers of expansion: Makefile.am -> Makefile.in -> Makefile, increasing the effort of tracing problems back to their origin.

        Note that these points are design flaws in the autotools, so it's not likely they can be addressed while staying compatible with thousands of existing packages using the current autotools. So I would like to see a replacement for autotools, but so far every other system I tried (Ant, SCons, qmake) works great for simple builds but becomes a struggle for more complex builds.

    • by djm ( 126641 ) on Monday September 27, 2010 @07:48PM (#33718370) Homepage

      It's true, pretty much. We developed configure scripts and ways to generate them in the days of 28.8kbps modems and they had to work on Unix System III and Xenix and HP-UX. We couldn't assume anything like Perl or Python was available. Linux distros were only just appearing, and there were no package management systems. Windows was still a 16-bit DOS shell. It was a different world. I'm amazed this stuff has endured as long as it has with so few changes. By the time Automake was written, several years after Autoconf, we at least felt we could assume the presence of Perl.

      Want to know why it's called "Autoconf", which I think is a bit ugly of a name? I wanted to call it "Autoconfig", but when you add a version number and ".tgz" to that, you exceed the 14-character file name limit of some of the Unix variants it had to be downloaded and installed on!

      Dave MacKenzie
      Autoconf's main developer

  • by hkz ( 1266066 ) on Monday September 27, 2010 @02:18PM (#33715274)

    John Calcote is a senior software engineer in Novell's Linux business, who after slogging up the steep learning curve the Autotools triad poses to those packaging software according to the portable GNU conventions for the first time, very kindly decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks.

    The book opens with John's experiences in adopting the Autotools, and quickly offers what is in my view a very important word of caution that is often lacking in the few tutorials I have seen on the Net: the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way.

    Good heavens, I'm all for sentences with body, but this is terrible. I actually stopped reading the article after the second one. You know what this site could use? Editors.

  • by larry bagina ( 561269 ) on Monday September 27, 2010 @02:28PM (#33715380) Journal
    A guy from novell/suse is reviewing a book by another guy from novell/suse. When his novell/coworkers see the book at his novell/suse desk, they immediately buy a copy.

    Did I miss any novell/suses?

  • Crappy Approach (Score:2, Insightful)

    by firewrought ( 36952 ) on Monday September 27, 2010 @02:33PM (#33715436)
    I understand that adoption/marketing/historical factors may have justified this particular approach to cross-platform builds of C/unix apps, but is this such a big problem that it requires 5-6 languages to solve (counting the syntax of C, sh, configure.ac, Makefile.am, makefile and possibly other intermediate formats)? Sheesh...
    • by loonycyborg ( 1262242 ) on Monday September 27, 2010 @02:47PM (#33715592)
      Probably they just wanted to make the build process not require anything other than sh and make to be installed on end user's system, so they used m4 macro processor to generate shell scripts and Makefiles which can just be included in the tarball.
    • by ebuck ( 585470 ) on Monday September 27, 2010 @03:00PM (#33715762)

      Those who don't understand Automake are doomed to repeat the mistakes of build systems that are not designed like Automake.

      The one language that actually drives all of Automake is ML. Funny it is the one language you didn't list, but you listed a bunch of the high level macro files that get expanded with ML. If you don't know sh, then you shouldn't be programming on the command line (stick to an IDE that does the compilation for you). C isn't required for any component of Automake. If you write a makefile with automake, you made a big mistake, as Automake writes the makefiles for you.

      There are other systems out there which are easier to use, but only a handful of them do things in a manner that is highly reliable and portable across many platforms. Those that do strive for such goals end up operating like Automake, but often without allowing such easy access to the internal guts of what is really going on.

      Yes, I like cmake too, but bashing Automake just because you don't understand it is just the computer equivalent of name-calling.

      • by Waffle Iron ( 339739 ) on Monday September 27, 2010 @03:13PM (#33715928)

        Hmmm... are you sure that you didn't mean the macro language M4? I thought that ML was the pure functional ancestor to languages like Haskell.

        Anyway, I played around with M4 a little bit because I thought it looked handy for a few things. It has a deceptively simple specification that only takes a few pages, but it's one of the most extreme examples of "emergent behavior" I've encountered. Even simple tasks rapidly become mind boggling due to the deceptively tricky nature of recursive text substitutions and quoting. It's a real brain teaser of a language. It does seem like it would be a nifty tool if I spent enough time to really figure it out.

  • by Bitch-Face Jones ( 588723 ) on Monday September 27, 2010 @02:46PM (#33715580)
    with a nail than use autotools
  • by af1n ( 1031572 ) on Monday September 27, 2010 @02:54PM (#33715696) Homepage
    Poul-Henning Kamp of the Varnish project, on autotools: "Did you call them autocrap tools? Yes, in fact I did, because they are the worst possible non-solution to a self-inflicted problem." Read more at: http://www.varnish-cache.org/docs/2.1/phk/autocrap.html [varnish-cache.org]
  • by the eric conspiracy ( 20178 ) on Monday September 27, 2010 @02:58PM (#33715738)

    Shouldn't that be shallow learning curve? Steep would imply quick effortless progress towards expertise (at least if you put the independent variable on the abscissa and dependent on the ordinate as is customary). A shallow learning curve now would imply slow progress...

  • by locallyunscene ( 1000523 ) on Monday September 27, 2010 @03:02PM (#33715788)
    Let's Rooolllll out!
  • by perpenso ( 1613749 ) on Monday September 27, 2010 @03:12PM (#33715908)

    Unfortunately, the publishing industry is driven by the need to turn a profit to fund its endeavors, and specialist items like this book are not obvious candidates for volume selling - which is a credit to No Starch Press' willingness to venture down this path.

    The situation is not as dire as this post seems to suggest. Print on demand [wikipedia.org] is an option for a book such as this. Getting a publisher like No Starch is great, since they provide traditional editing, review, and marketing services; however, the publisher is not necessarily making any great investment, since they too can take advantage of a print-on-demand approach. There is no longer a need to print a large number of books up front.

  • CMake (Score:5, Informative)

    by paugq ( 443696 ) <pgquiles@el[ ]er.org ['pau' in gap]> on Monday September 27, 2010 @03:31PM (#33716140) Homepage
    Obligatory link to a good autotools alternative: CMake [cmake.org]. And my CMake tutorial, Learning CMake [elpauer.org].
    • by mathfeel ( 937008 ) on Monday September 27, 2010 @05:50PM (#33717434)

      Thanks for the link. Time to read up on CMake. When debugging a broken build script in Gentoo, my personal experience is that I usually figure out what's wrong more quickly with Autotools than with CMake. The main reason is probably that I am more familiar with the former from reading this online Autotools tutorial/reference: Autotools Mythbuster [flameeyes.eu]

    • by achurch ( 201270 ) on Tuesday September 28, 2010 @12:45AM (#33719798) Homepage

      is that it's got ugly syntax, effectively no cross-compiling support, and less-than-helpful documentation. And its generated Makefiles sometimes miss changes in header files, forcing you to "make clean".

      But yeah, it's still a good alternative to autotools.

  • by wowbagger ( 69688 ) on Monday September 27, 2010 @04:11PM (#33716574) Homepage Journal

    I'll throw this bit of fuel on the flame-fest, in the form of a question:

    Does anybody else find that Autotools based projects, while being very cross-platform, are almost impossible to actually cross-compile?

    I do embedded systems work, and the embedded universe is moving to Linux as the kernel, and quite frequently a sub-set of the Gnu environment for the runtime. So you get things like BitBake, OpenEmbedded, and Angstrom, which attempt to enable you to build a complete system from sources. However, what all these environments do is run QEMU to emulate the target CPU in order to do the builds.

    Now, that's stupid IMHO: I have this VERY FAST multi-core workstation, and I am
    1) Throwing away all but one core, and
    2) Hobbling that core emulating a completely different architecture (e.g. emulating an ARM to build for the OMAP).
    Much of the - I shall use the term "work" although a more bovine-scatological term springs unbidden to mind - involves writing "recipes" for BitBake to work around issues in Autotools.

    Much of Autotools seems to me to assume that the machine building the code will be the machine running the code - or at least, a machine of the same type as the machine running the code. So all the Autotools "magic" to deduce structure layouts, word sizes, byte orders, and such all assume that you can
    a) compile the code using the host's C compiler,
    and
    b) Run the resulting probe programs on the host.
    Both of which are totally FALSE for cross-compiling.

    I used to say that you could design a new CPU that nobody had ever seen, and once you ported Binutils, GCC, and Linux to it, you could build an entire functioning distro for it just by iterating over the various source packages and cross-compiling them. I know better now: most of the source packages out there DO NOT cross compile worth a damn! They might BUILD NATIVELY on a wide range of architectures, but not cross-compile.

    • by Kaz Kylheku ( 1484 ) on Monday September 27, 2010 @05:10PM (#33717106) Homepage

      Indeed, requiring that part of the build takes place on the target machine, or having to emulate it, is an incredibly lame-assed copout.

      At Zeugma Systems, I produced an embedded GNU/Linux distro known as Zeugma Linux. The rule of the project was that everything cross-compiles.

      No MIPS instruction was emulated during the build.

      I took the lessons that I learned, and incorporate them into the small configure scripts that I write by hand, which takes far less effort over the life of the project than dealing with Autotools.

    • by bongey ( 974911 ) on Monday September 27, 2010 @07:19PM (#33718146)
      What are you talking about ? http://tbingmann.com/2010/apidocs/autoconf-2.65.zip/autoconf_14.html [tbingmann.com]
      As someone who set up the build for a production system to compile on Linux with Windows as the target platform, I am a little confused.
      `--build=build-type'
      `--host=host-type'
      `--target=target-type'
      • by wowbagger ( 69688 ) on Monday September 27, 2010 @08:57PM (#33718816) Homepage Journal

        Now, configure something like CORBA targeting a PPC, but configuring on an X86, for compilation on an X86 using a cross compiler.

        Oops - all your structure padding code for the CORBA marshalling is broken, because the autoconf scripts all assume they can find out the padding of the structures by emitting a program, building it, running it (whoops! wrong arch!) and getting the output.

        It's all well and good if the program is trivial enough that it does no serious probing of the system, or if the configure host is the same CPU type as the target. But break that, and unless the project's configure script is set up to correctly detect a cross compile, to accept a set of command-line parameters for the variables that would normally be inferred by probing, AND has the wit to refuse to run without those parameters because it detected you are cross compiling, you will NOT get a running program by default, NOR will you be able to get one without major surgery on the configuration data by hand.

    • by achurch ( 201270 ) on Tuesday September 28, 2010 @12:56AM (#33719842) Homepage
      Interesting you should mention this; I've had the same problems you describe trying to get CMake to cross-compile, but with autotools, "--target=other-cpu" has generally worked fine in my experience (making it just about the only redeeming feature in that spaghetti mess of shell and m4 code). Admittedly I haven't tried building an entire Linux distribution, so maybe I just happened to choose packages that don't rely on running test programs, but IIRC autotools will explicitly disable the standard runtime tests when cross-compiling.
      • by MtHuurne ( 602934 ) on Tuesday September 28, 2010 @10:21AM (#33722818) Homepage

        Note that the option to specify the target architecture is "--host". This makes sense for GCC and binutils and almost nowhere else, but every package has to deal with this naming. To make things worse, configure scripts created by recent versions of autoconf print a warning when you use "--host" in this way, but offer no alternative.

        Whether or not a package using autotools will cross compile depends on how the configure.ac was written. Often it is copy-pasted together using fragments from other packages without really understanding what they do and in that case it is likely to break. If the configure.ac was written with cross compilation in mind or just does not attempt any tricky stuff, cross compilation will work fine.

        Cross-compilation support is not magically added by autotools, the maintainer still has to consider the fact that the produced binaries might not be executable on the machine doing the build. The same is true of cross-platform support, by the way: autoconf does not provide it out of the box, it only provides a means of implementing it.

    • by Vintermann ( 400722 ) on Tuesday September 28, 2010 @01:19AM (#33719934) Homepage

      I know better now: most of the source packages out there DO NOT cross compile worth a damn! They might BUILD NATIVELY on a wide range of architectures, but not cross-compile.

      I think this is the problem with autotools. It gives the impression of supporting lots of things, but the majority of scripts out there break even if you just try to build in a separate tree from the source code. All those checks for the behavior of strcpy() and so on impress newbies into thinking that their program would compile on Xenix and Sys5 and what have you, while in practice it's completely pointless.

    • by MtHuurne ( 602934 ) on Tuesday September 28, 2010 @12:24PM (#33724888) Homepage

      I indeed regularly run into configure scripts or Makefiles that think they can execute the binaries they have built on the build machine.

      Just as bad are configure scripts that realize they cannot run a certain check in a cross compile and will use a hardcoded pessimistic result instead. In many cases the package then breaks during compilation because this alternative is completely untested since no actual system still in use requires it. In other cases it leads to useful features being turned off or unnecessary workarounds being enabled.
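
      The pessimistic-fallback pattern described here corresponds to autoconf's AC_RUN_IFELSE macro, whose optional fourth argument is used when cross-compiling; packages often put an untested guess there. A sketch (the check and the cache variable name are illustrative):

```m4
dnl Sketch of the pessimistic-fallback pattern: AC_RUN_IFELSE's fourth
dnl argument is taken when configure cannot run the test program, i.e.
dnl when cross-compiling. The cache variable here is illustrative.
AC_RUN_IFELSE(
  [AC_LANG_PROGRAM([[#include <stdio.h>]],
                   [[return printf("x") != 1;]])],
  [my_cv_printf_works=yes],
  [my_cv_printf_works=no],
  [my_cv_printf_works=no])  dnl untested guess used for every cross build
```

      Every cross build takes the fourth branch, so that branch is exactly the code path least likely to have ever been exercised.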

      Another problem I often run into when compiling for embedded systems is not caused by autotools: a lot of packages pretend to be far more modular than they actually are. When aiming for a minimal footprint I like to disable all features that this particular system does not require. Deviating from the default configuration leads to compile errors on many packages. Sometimes because header files are included unconditionally even if the code using them is between #ifdefs. Sometimes because what is thought to be an optional dependency is actually not (or no longer) optional. And sometimes because package features have dependencies between themselves which are not (or incompletely) enforced by the build system.

  • by eexaa ( 1252378 ) on Monday September 27, 2010 @04:36PM (#33716806) Homepage

    please note that all current 'replacements' are totally wrong and actually work only as puny build systems, not supporting any of the great portability benefits that autotools give. SCons, CMake, whatever else depend on a working installation of themselves on the _build_ machine. This is wrong: only a working shell+make+gcc should be needed to actually build software.

    So...

    Is there ANY good "replacement", preferably lightweight, with this great virtue? As far as I know, there's none.

  • by Kaz Kylheku ( 1484 ) on Monday September 27, 2010 @05:06PM (#33717076) Homepage

    can compensate for the low quality of this garbage, not to mention its poor performance (some builds spend more time running configure than actually compiling, installing, and tarring up the resulting run-time and devel packages!)

    These tools have to be redesigned from the ground up by someone who understands that free software has made large inroads into the embedded world where it needs to be, doh, cross-compiled.

    For my own projects, I develop a carefully-crafted configure shell script by hand and recommend everyone do the same.

    The script has carefully developed and tested support for (1) cross-compiling, (2) configuring/building in a separate directory, and (3) installation into a temporary package directory.

    Furthermore, this script should ONLY test the features that are actually needed by the program. (The program should assume a reasonably modern system; don't bother testing for obscure bugs in System V release 3, okay?) The script should make sure that it tests only header files and compiling with the toolchain that it is given. The configure test programs should never accidentally include something in the build machine's /usr/include, or link something from /usr/lib!

    Configure scripts should never run any program that they compile because it may be the wrong architecture, and they should not have an "if cross compiling" check which disables their features when cross-compiling and substitutes dumb defaults! Quite simply, implement the tests so that cross-compiling is not needed.

    For instance, instead of making a C program which outputs values with printf, make it so that the values are used as initializers for static data. Then use the cross-compiling toolchain's "nm" utility to extract the values from the compiled object file. You don't have to run a program to know how wide a (void *) is. Just do "static char size_of_void_star[sizeof (void *)]". Compile it, and then interrogate the .o to discover the size of the data region named size_of_void_star!

    Look at the stupid configure script for GNU bash. When crossing, it assumes totally pessimistic values for all checks that can't be done when cross-compiling. Unless you explicitly override the ac_* variables yourself, you get a shell which has no job control! The bash build assumes that your kernel has a tty system dating back to the middle 1980's.

    Nobody should dig through a configure script to find out what broken tests they have to override by setting the variables manually.

    Shell code should never be generated by m4 macros, let alone ones which frequently change.

    Look at the stupidity of this. Suppose you want to produce a minimal patch for some open source project that uses autoconf. Suppose that as part of your patch you have to enhance the configure script with some new options. To do that, you have to modify configure.in. But projects ship with the generated configure script. So you want to regenerate that, right? Oops, your version of autoconf is different, and so you get a HUGE diff. Worse yet, the generated configure script breaks!

    So people end up with ten different versions of autoconf installed, so they can always run the right one which matches what the configure script was produced with that is in the tarball.

    Build configuration from scratch is not difficult. It is easy. Autotools do not simplify anything. They monstrously complicate something which is already simple and doesn't require simplification or automation. Using the Autotools is a false economy. You may think you are getting something for free and saving time, but in the end you will spend more time over the life of your project wrestling with this garbage (and also waste the time of countless other people you don't even know) than if you just carefully wrote a small configure script by hand.

    • by wowbagger ( 69688 ) on Monday September 27, 2010 @09:09PM (#33718852) Homepage Journal

      I whole-heartedly agree with you - and I'll make another point. Much of what auto* was designed to do was to allow porting the Gnu toolchain over to a target which did not have it - e.g. porting binutils and GCC to some Unix box that did not have them yet, thus they had to assume as little about the target system as possible to allow bootstrapping. However, the idea was that once you HAD binutils, GCC, and libc, you could use them, and have a predictable standard environment.

      Nowadays, that is almost a gimme - any new system will almost certainly have GCC as a part of the bring-up. So why not move on to a pkg-config style system, where there is an executable that can be run for the target platform, that can answer questions like "How big is a pointer? An int? A float? Do you have these library routines? What do I do to dynamically link a program?" pkg-config does a wonderful job for the programs it knows about, why not extend it to answer the myriad standard questions that auto* seems to ask every time it's run?

      Sure, you'd have to generate that program for a new target - but guess what? most of the "questions" are ones GCC already knows the answer to, so why not just make GCC be the "pkg-config" for basic CPU arch type questions? Then, even if I am cross-compiling for a Floobydust300 rev B processor, all I have to do is 'floobydust-unknown-linux-gcc --whatis "sizeof(int)"' and I have my answer. (alternatively, dump an XML file, or a key=value file, or header, or any number of other approaches).

      • by oiron ( 697563 ) on Tuesday September 28, 2010 @01:11AM (#33719892) Homepage

        Interesting solution, but it doesn't answer what happens when I don't use GCC - suppose I'm on Windows, using MSVC, or for that matter, on Linux using ICC or clang? Those are still important use cases for cross-platform applications and libraries.

        So, let's have a program that knows about all these compilers and platforms, and can generate the appropriate build scripts.

        Hello, CMake... ;-)

    • by multipartmixed ( 163409 ) on Tuesday September 28, 2010 @08:32AM (#33721332) Homepage

      Thanks for the tips.

      I too develop configuration scripts by hand, but frankly have not cross-compiled anything in over a decade. I've been wondering how the heck to support cross-compiling cleanly, and from your laundry list, it looks like I'm pretty close already -- our best practices jibe pretty well.

      Now if only you could tell me how to get upstream packagers to stop asking me to convert a *very* complicated build to autoconf, I'd be even more grateful. :D
