So, the bitch is back?
No mention of Trolltech's build tools.
you may want to RTFA again, qmake is there
however, it's not a "make alternative" since it just generates a Makefile
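For anyone curious, a minimal qmake project file (file names hypothetical) is just a few variable assignments, and qmake generates the Makefile from it:

```
# hello.pro -- qmake reads this and generates a Makefile
TEMPLATE = app      # build an application
TARGET   = hello    # name of the resulting binary
SOURCES  = main.cpp
```

Run `qmake hello.pro` and then plain `make` as usual.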
This was a rather informative article. I didn't know about all these 'Make Clones', and the only alternative I've heard of was jam, thanks to the Crystal Space Project. (http://www.crystalspace3d.org)
Personally, I thought it treated most of the tools fairly and gave good information. Nice article!
Nobody really likes them, but there's no real alternative for widespread use at this time.
I hate autotools. It's incredible such a horrible piece of software has achieved such widespread use.
Look at pmk.sf.net if you want an alternative to autoconf, libtool and pkgconfig. The project is also working on a tool that automatically produces makefiles by scanning the source. There are examples somewhere on the website.
boost.build v2 is by far the best (v2 is not the same as original boost.build). It is awesome for large/complex c++ projects and is even easier/simpler for 'hello world' type of projects compared to regular Make.
The next runner up is Scons.
If you build complex c++ projects or even simple 'hello world' programs, then you cannot go wrong with either of these.
I especially like boost.build v2 because it wraps all the vendor-specific compiler/linker flags into generic ones. In other words, you can use the same build file for multiple compilers, multiple versions, etc.
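To give a flavor of that, a hello-world Jamfile in Boost.Build v2 is a single rule (file names hypothetical), and the same file drives different compilers via the toolset option:

```
# Jamroot -- Boost.Build v2
exe hello : hello.cpp ;
```

Build with e.g. `bjam toolset=gcc` or `bjam toolset=msvc`; the vendor-specific flags come from the toolset, not the build file.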
"Jam is a software build tool that makes building simple things simple and building complicated things manageable."
Unless you are doing crazy things with your build system (i.e. having it run regressions, build itself, etc.), jam is probably the best tool I have encountered.
1. parallel build and dependency tracking really work. Make does parallel builds properly in a directory, but jam can really do a parallel build over the entire source tree. To be brutally honest, most object files can be compiled in parallel quite easily; linking is the task that __must__ be done serially.
2. Top level Jamrules holds most of the complexity for the build system. Modification of the top level Jamrules file is trivial, and new machine types and compilers can be added in minutes.
3. make has the tab problem, jam only has the whitespace problem. IMHO, whitespace is good for jam usage.
4. supports openvms! That platform is a bit obtuse, but I am amazed that jam supports it.
5. modification of jam to support user defined rules is doable though not always documented.
1. building a target and then using that target to build other targets is hard. Trivial cases obviously work, but non-trivial cases where a target builds another file that is needed by another part of the build system can be tricky.
2. SharedLibrary rule is not built into 'normal' jam, this has been a sore point for many years, apparently the author does not want to fix it. (my biggest complaint BTW).
3. assumes hpux and solaris have only one architecture. Again, if the author would maintain it, this could be resolved trivially with a few lines in jam.h
4. parallel builds do not always work with the msdev compiler - though to be fair, this is more a problem with the MSDEV compiler than it is with jam.
5. ownership: Chris Siewald owns it, but overall there is no clear leadership. Getting changes approved seems to take months and everyone I talk to maintains their own "handrolled" version because the pace of their development is faster than the changes can be pushed back into jam.
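For reference, a minimal classic-jam Jamfile (file names hypothetical); the Main rule compiles and links an executable, with Jamrules/Jambase supplying the per-platform compiler and linker details:

```
# Jamfile -- classic jam
Main hello : hello.c util.c ;
```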
However, let's be honest, a lot of software these days is being ported to or developed in Java/.NET. There are obviously niche players that will still use C/C++ (operating systems, CAD, databases, virtual machines, heavy-duty math libraries, etc.).
But more and more I see ant/java/eclipse as the new build/language/ide of choice for server class systems.
boost.build v2 uses jam so it benefits from all the advantages jam has to offer.
I believe boost.build v2 is being enhanced to support the Python language, but I prefer jam, which is the default.
I use jam a lot in my own open source projects (crystalspace, netpanzer, lincity to name some released ones). While the tool itself has a really nice language and all the pro features you listed in your comment, the built-in rules are not good. I replaced nearly all of them while working with the tool. In case you're interested I published them here:
Prolog would be a good alternative
How would Prolog be a good alternative to make? Explain!
I'm glad people are trying to address this. This is one of my beefs with *nix and open-source built applications (I HATE using make on windows). As a developer, it's discouraging to use, and I shouldn't have to worry about this stuff.
Luckily, IDEs like KDevelop usually do the work for you, with automake, but sometimes something screws up still.
There's nothing wrong with make. Make is great for what it does--simple dependency building. Make isn't just used for building programs--it can be used for anything dependency-related. Make is supposed to provide simple dependency-support; if you need more, you should implement it on top of Make, not complain about how make is "bad" and go off and make a new build system.
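To the parent's point, here is a sketch of make doing a non-compilation, purely dependency-related job (file names hypothetical); the rule only reruns when the input is newer than the output:

```make
# report.txt is rebuilt only when data.csv changes
# (the command line must start with a tab)
report.txt: data.csv
	sort data.csv > report.txt
```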
It is sad that human beings try to make the perfect tool for doing an imperfect job. The design of "building programs" is horribly flawed, and until the design is fixed, no tool will ever be a good solution.
No tool will ever be the 'perfect' solution. However, at the end of the day, you need to compile source code and you make (forgive the pun) do with software that is on hand. That being said, there are certainly better and worse versions of make - gnumake and opusmake are some of the better versions that I have seen.
Instead of criticizing the design of building programs, why not offer some meaningful alternatives?
Maybe I can dream up software and magically get it installed on an operating system - yeah matrix style, let's work towards that and not bother with this 'compilation' you mentioned.
I think the reason make is so popular is because it's so easy to get started with. When you first start putting a program together, and there's just a few source code files, and you think, "gee, I need an automated way to run the gcc/g++/java/whatever command so I don't have to type it out every time" -- using make is an easy decision. There's no xml to learn, no pseudo-language. You just list the targets, what they depend on, and the command to build them. Done and done.
I guess make is insidious that way.
I'm not experienced enough to know why folks complain about make though. I've never had the pleasure of dealing with makefiles larger than one page or so.
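For the record, the kind of one-page Makefile described above is just targets, dependencies, and commands (file names hypothetical):

```make
hello: hello.o util.o
	gcc -o hello hello.o util.o

hello.o: hello.c
	gcc -c hello.c

util.o: util.c
	gcc -c util.c
```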
This is why I can't bring myself to learn the autotools. Most of the time I have some small project and the autogenerated autotools crap is bigger than my code will ever be. Do I need to run a 5 minute ./configure that detects all sorts of stuff for a completely self contained program? Seems like overkill for most projects.
Hate autotools, try bmake.
I've been using it for C based projects and with some tweaking of the .mk files C++ works well also. Anything that autotools can do can be implemented in the more general .mk files for the platform.
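As a sketch, a typical bmake Makefile for a C program leans on the system .mk files (file names hypothetical); bsd.prog.mk supplies the compile, link, install and clean targets:

```make
PROG=	hello
SRCS=	hello.c util.c

.include <bsd.prog.mk>
```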
...don't use C/C++. It's part of the C/C++ culture, and only the C language (well, ok, assembler as well) is so stupid that a compiler cannot find out dependencies itself.
VC++ does fine without make files.
Make works. That's all that's needed, and why Make is so widely used.
It's that simple. But it's something that's way beyond the grasp of the idiots who run around bashing Make.
PS. Notice most of the Make Bashers are Windows Developers.
Doesn't say much for their programming skills, does it?
Yes, it says a lot about their programming skills all right, it says that they focus more on the actual programming than fiddling around with make.
Rake is a very interesting replacement for make, written the Ruby way:
I just wanted to post about rake. I never used it, but since I started using Ruby I've become aware of it. Have you used it? What can I expect when I start using it?
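To give a taste: a Rakefile is plain Ruby, with `file` tasks doing timestamp-based dependency checking much like make (file names hypothetical):

```ruby
# Rakefile -- tasks are ordinary Ruby blocks
file 'hello.o' => ['hello.c'] do
  sh 'gcc -c hello.c'
end

file 'hello' => ['hello.o'] do
  sh 'gcc -o hello hello.o'
end

task :default => ['hello']
```

Running `rake` builds the default task; since it's all Ruby, loops, methods and libraries are available where make would need macro tricks.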
Although I haven't tried all the alternatives the article mentions, I have to say that ANT seriously rocks. First, the ant files aren't as complicated as make files (but this might just be me ...) and you can do a lot more stuff out of the box.
Secondly, ANT is easily extensible. A while ago I had to write my own Task, extending ANT's core functionality. This was so easy that I had the first version running within a day. Additionally, an ANT Task can modify the structure of the build file during runtime, which gives you an enormous amount of flexibility and, of course, the opportunity to seriously mess with ANT, implementing all kinds of dirty tricks.
On the minus side, it's a Java tool, so ANT is perhaps not the first choice for C programmers. But on the other hand ANT "just works", so there's no need to understand Java. Perhaps you can even compile it with GCJ, but I don't know.
Did I mention that ANT rocks?
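For comparison, a minimal ANT buildfile (paths hypothetical); the built-in `javac` task handles the Java dependency checking itself:

```xml
<!-- build.xml -->
<project name="hello" default="compile">
  <target name="compile">
    <mkdir dir="build"/>
    <javac srcdir="src" destdir="build"/>
  </target>
</project>
```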
Ant so does not rock!
XML == hype
Make is so much simpler and also way more "extensible".
Even though Ant files are XML based (hence the quotes), Make is way more customizable.
e.g.: extending core functionality of ant takes you a day, extending make takes me 5 minutes and some shell commands.
I am not sure what you meant when you say that Ant does more stuff out of the box, but I think that you would have a hard time backing that one up when you consider that make files give you access to the shell.
Also, Ant's XML build files are the worst to read, while good old make files are much clearer.
Ant DOES rock
XML == hype
You're absolutely right, and I really hate that nowadays just "putting XML in" is seen as magically increasing the powers of every application one can think of, BUT I think an XML based format is perfectly suited for ANT.
Make is way more customizable. [...] some shell commands.
I understand what you mean, but ANT's tasks are more abstracted. Let me explain: If you start cp'ing some stuff around there's not much difference; ANT has a task for this, but it does not do much more than cp, it's just slower.
But consider a more complex example, where you have to write, let's say, 100 lines of shell code. Chances are that it only runs on the box where you programmed it, while when wrapping it in an ANT task you at least have the chance to make it platform independent (if you need it). And you can do more advanced stuff with ANT tasks, like querying a database, for example. (grmbl, I'm having a really hard time making up good examples. I feel you'll tell me that you don't need all this stuff in your buildfiles.) Ok, I'll try to explain the task I've written. I think there is no easy way I could've achieved this cleanly with make:
We had a lot of projects in WSAD (an Eclipse based IDE) which needed to be built. Unfortunately WSAD doesn't use make, ant or anything like it to build the workspace, so we provided ant build files for every project. Of course we also had the master buildfile, which controls the building of all projects. Since the projects had certain dependencies, we needed to track them in the ANT master buildfile. This was a manual and error prone process. With ANT we did the following: In the master buildfile, call the custom task. This task started and parsed WSAD's configuration for the workspace and the different projects. From this configuration we got the location of each project and its dependencies on other projects. Based on this information, a new section could be created in the currently running ANT master buildfile, which was then programmatically invoked to build all projects (note: this "new section" is never written to disk, you just work with the representation of the ANT file, sort of like modifying a DOM tree).
Ant does more stuff out of the box but I think that you would have a hard time backing that one up when you consider that make files give you access to the shell
Ok, more platform independent stuff. Before you correct me: I DO know that certain tasks, like bzip'ing stuff, still need the bzip2 executable. Additionally, there exist some very specialized tasks, e.g. for working with CVS systems. Did I mention that using ANT tasks, all this functionality can be used with the same, consistent ANT/XML syntax, so you don't have to build 100 different command lines for 100 different shell tools?
So, of course, ANT isn't for everyone, but it definitely ROCKS!
I'm always interested in build tools, and I thought this article was a good overview. Having already used many of the tools mentioned I didn't learn a whole lot. However, Scons caught my attention and after finishing the article I read the Scons user guide and converted several of my smaller projects to use Scons. The SConstruct files are one or two lines compared to 10 lines for the Makefiles. I think using Python for building software makes a lot of sense. It is more powerful and readable than M4 or XML, and more free than Java. It has been almost a year since I've used Boost.Jam v2, but unless it has made some serious progress, Scons is easier and integrates better with other tools.
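For illustration, a small SConstruct of the kind described (file names hypothetical); SCons scans the #include lines itself, so this is the entire build description:

```python
# SConstruct
Program('hello', ['hello.c', 'util.c'])
```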
> No tool will ever be the 'perfect' solution. However,
> at the end of the day, you need to compile source code
> and you make (forgive the pun) do with software that is
> on hand. That being said, there are certainly better
> and worse versions of make - gnumake and opusmake are
> some of the better versions that I have seen.
There are certainly tasks which are now done by 'make' and other build tools which are in principle not necessary to specify in a makefile:
- figuring out dependencies automatically - the dependencies are unambiguously determined by the source code. If they aren't, something is seriously wrong with your programming language.
- figuring out automatically which tools to use. Source files need to be consumed by the compiler. Documentation files need to be consumed by some HTML generator, for example. Syntax specifications must be consumed by a parser generator. (*)
- figuring out automatically what needs to be linked. Most programs have a single entry point (like the main function in C) so all you have to do is specify where it is. Dependencies are again unambiguous.
If you do it right, you'll have to specify little more than the position of the source code tree, the entry point to the program (if more than one are possible), and compilation options (debug/optimize/profile, target architecture and the like).
(*) In some cases a file could be used in different ways, like producing documentation or a parser generator from a syntax specification. In that case, the reference to that specification makes it clear what is meant: A reference to the syntax spec from within other documentation means 'generate documentation'. A reference from data structures means 'generate AST data structure'. A reference from executable code means 'generate a parser'.
Sure, you can do that--if you want to bloat the compiler with unrelated features. While you're at it, you might as well add in version control. In fact, you might as well just turn it into an OS!
Is building the software it compiles really an unrelated feature though? These days, advanced compilers (those for Lisp, Scheme, Dylan, ML, etc, and those for C like ICC or Pro64) need to see the whole program anyway, to perform global optimizations. The old "compile each file into a .o and then link them together" approach just doesn't work, not from an optimization standpoint nor from a language semantics standpoint (eg: templates in C++). Once the compiler is looking at the whole program, it already contains code to do the dependency analysis across files, so why not reuse it to make it easier to build the software?
Beyond that, powerful compilers can also do a lot of analysis of your software and present the user with information about those analyses. Intel C++ and CMUCL/SBCL give reports about what constructs the compiler was able to optimize, while Functional Developer (a Dylan compiler) integrates with the IDE to color-code those function calls whose target it couldn't statically determine. Other, more experimental programs can use slicing algorithms to show the programmer the data flow within a program, so he can understand other people's code better.
You'd call this bloat, but where else should you put it? If you put everything into an external tool like C does, then you have to duplicate parsing and program analysis code in a number of different places. And of course, those external programs will always have half-assed parsing and analysis tools, since otherwise they'd be as much work to write as the compiler! If the compiler includes all the functionality to analyze the program's behavior, as a part of its job function, why not take advantage of that preexisting functionality to make the programmer's life easier?
> Also ants xml files build files are the worst to read,
> while good old makes files are much more clear.
Negative. This might be true for some simple program, but for bigger programs the complexity of the make files grows exponentially with the size of the program, unless you use some macro tricks which keep the complexity under control but take you days to understand.
(such tricks are still better than a complete mess of unmaintainable make files, but they are NOT easy to read)
1) Yes, the autotools are nearly as enjoyable as a nailgun to the foot. They are fragile, practically impossible to debug, really don't solve the problem of platform independence, add complexity rather than remove it, and are generally just a huge PITA.
2) Ant is not a solution. It depends on Java and XML, which will buy you Windows but remove many other platforms. If it were written in C, then it might be okay for a platform-independent build tool (Java is for applications, not fundamental tools, IMO).
3) VC++ dispenses with makefiles for its own pile of shit kludge, so it is not an alternative. Besides, this is a discussion of platform independent tools, which Microsoft doesn't do business in.
4) I really wish GNU make didn't extend Make's syntax in so many ways. It just makes for pain and suffering when trying to compile software on a system that isn't the developer's GNU/Linux version X.Y with tools A, B, and C installed but not D, as it breaks B. Sigh.
5) You know, Make has been around for twenty years, and no one has really replaced it. That speaks volumes about how Make solved a problem and did it well enough that most people continue to use it, in spite of attempted improvements.
"5) You know, Make has been around for twenty years, and no one has really replaced it. That speaks volumes about how Make solved a problem and did it well enough that most people continue to use it, in spite of attempted improvements."
It did it well enough, but it's by no means brilliant. You may as well say that Microsoft Windows has been around for fifteen years, and no one has really replaced it, which speaks volumes about how Windows solved a problem.
Having had to set up gmake, autoconf, etc., for my own project, I'm also one of those who would shed no tears if gmake went away tomorrow.
For a small C/C++ project (say, less than 10kloc) the time needed to make a good autotools build system easily equals the time needed to make the real program. Sure, just getting automake to compile the thing on your own system can be done in a few minutes, but then getting all the checks done correctly is another issue. You may need any combination of Qt, Gtk, Glib, zlib, libxml, Perl, Python, ImageMagick, OpenGL, SDL etc etc, and for a good build system all these should be checked for, and the user should have a chance to specify paths for includes, libs and whatnot.
And then to get stuff installed is another matter. There are some nice standards for things, but for instance Python apps aren't easy to install at all (where do the .py files really go, site-packages isn't really for apps).
I've started to just ignore autotools and make something simple that works for me, such as using plain Makefiles or Qt's qmake. For Python apps I use "extract this tar.gz and your app is installed in a subdir". If someone else needs a more complex build system to install my apps, they can contribute and maintain it.