Re: parallelized configure


From: Alexander Holler
Subject: Re: parallelized configure
Date: Wed, 15 Jan 2014 20:54:59 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20100101 Thunderbird/24.2.0

On 15.01.2014 16:06, Bob Friesenhahn wrote:
On Wed, 15 Jan 2014, Alexander Holler wrote:

Sure, people do all kinds of things in build systems. But a lot of the
tests that fly by when configure is called look like they were taken
from a standard repository, and many of those tests are pretty simple
and have no dependencies.

It seems that all tests have dependencies.  What tests do you think
don't have any dependencies?  If I specify CC on the configure command
line, that may influence the outcome and demonstrates that there are
dependencies.
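
(For illustration only; this command is not from the thread. Something as
simple as

    ./configure CC=clang CFLAGS=-O2

changes what every later compiler-dependent check sees, so even the
apparently trivial checks depend on the compiler selection that precedes
them.)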

I don't think it makes much sense to talk about the basic stuff. Of course, tests that need the configured compiler options (like testing for includes) have to run after the tests that discover those options, so you will need some basic blocks which are serialized. But once configure has found the compiler options (which can themselves be tested in parallel), many, many tests could be parallelized. So, yes, you are right, there are dependencies I should have mentioned. (Maybe the tests should just be organized by make, to have an easy way to declare dependencies; see the sketch below. ;) )
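
A minimal sketch of that parenthetical idea, with entirely hypothetical
target and script names (this is not anything autoconf generates): each
check becomes a make target, the dependencies encode the serialized
blocks, and "make -j" overlaps the independent checks.

    # Hypothetical sketch: configure checks expressed as make targets.
    # Dependencies serialize what must be serialized; everything else
    # runs in parallel under "make -j". Recipe lines must start with a tab.

    checks: conf.headers conf.sizeof conf.funcs

    # Compiler discovery must finish first; the other checks read its result.
    conf.cc:
            ./find-compiler.sh > conf.cc

    # These depend only on conf.cc, not on each other, so -j overlaps them.
    conf.headers: conf.cc
            CC=$$(cat conf.cc) ./check-headers.sh > $@

    conf.sizeof: conf.cc
            CC=$$(cat conf.cc) ./check-sizeof.sh > $@

    conf.funcs: conf.cc
            CC=$$(cat conf.cc) ./check-funcs.sh > $@

    .PHONY: checks

With, say, "make -j4 checks", the three leaf checks would run
concurrently as soon as conf.cc exists.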

Anyway, I didn't want to talk about how to do it; I was interested in the state of such a feature, about which I saw some comments around two years ago.

Hmm, I've been using the parallel feature of GNU make very successfully
since ext4 with support for nanosecond timestamps appeared (so for many
years now). Without a filesystem with high-resolution timestamps I had a
lot of problems, but with high-resolution timestamps it works like a
charm most of the time.
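
(For reference, and assuming GNU make plus coreutils' nproc, the usual
invocation is

    make -j"$(nproc)"

which starts one job per available CPU. It is make's newer-than
comparison of file timestamps that benefits from nanosecond resolution:
with one-second granularity, a target and a prerequisite written within
the same second can look equally old, confusing fast parallel builds.)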

This is perhaps off-topic for the Autoconf list.  Packages with a lot of
recursion and fewer than 64 target object files per directory, or with
some targets which take substantially more time than other targets to
compile, do not perform as well as they should for parallel builds on
modern CPUs (with 4-64 cores available).  If the object files compile
very quickly, then there is also less gain from parallel builds since
the build framework may take more time than the actual compiles.
Lastly, if there are many linking steps, the build time is dramatically
increased since linking is sequential in nature, and if there is
needless re-linking (a common problem with recursive builds) then the
problem is multiplied.
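
A minimal illustration of the alternative, with made-up file names: a
single non-recursive Makefile that sees every object lets "make -j"
schedule compiles across directories, instead of draining each small
subdirectory's job pool before descending into the next, and it leaves
exactly one link step.

    # Hypothetical non-recursive layout: all objects visible to one make,
    # so "make -j" can keep every core busy across directories.
    SRCS := $(wildcard src/*.c) $(wildcard lib/*.c)
    OBJS := $(SRCS:.c=.o)

    # The single link step is inherently serial.
    prog: $(OBJS)
            $(CC) -o $@ $(OBJS)

    %.o: %.c
            $(CC) $(CFLAGS) -c -o $@ $<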

I would assume it is off-topic, and talking about that doesn't make much sense. Most people don't care about the build time when they organize the files of their projects, and for good reason. And it's a given that make can't invoke 64 compile jobs if there are only 4 source files. ;)

But basically, parallel builds nowadays work just fine with make.

To come back on-topic: I have the feeling that many packages (if they use the autotools) already spend more time in configure than they need to actually build. That's why I ended up here, asking about the state of a parallelized configure.

So for me everything is answered; thanks for the answers.

Regards,

Alexander Holler


