profiling, what takes so long?
From: Torsten Mohr
Subject: profiling, what takes so long?
Date: Wed, 1 Feb 2006 21:55:32 +0100
User-agent: KMail/1.8
Hi,
for a project in the office I wrote a Makefile. We use
ClearCase for version control there. When we copy the
project to the local hard drive, the Makefile works
quite fast and without problems.
ClearCase can also present the project files on a virtual
network drive with the same directory structure. When we
try to compile the sources on that network drive,
everything takes incredibly long. Previously the project
used a quite simple script that just compiled all the
sources unconditionally, and people now complain that it
worked better than the Makefile.
Is there a way for me to find out what takes so long?
Could you give me any hints on what would make sense to
measure, and where (by changing which sources) to
measure it?
I could think of:
- the "stat" calls
- the commands started from "make"
- the search for dependencies
- internal calculations
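A rough way to separate these costs, sketched below under the assumption that GNU make is available (the throwaway project is only there so the commands are runnable as-is): "make -n" prints the commands without executing them, so its runtime is approximately make's own overhead (parsing makefiles, stat()ing prerequisites, building the dependency graph), which you can compare between the local copy and the network drive.

```shell
# Build a tiny throwaway project so the commands below run as-is.
tmp=$(mktemp -d)
printf 'all: out.txt\nout.txt: in.txt\n\tcp in.txt out.txt\n' > "$tmp/Makefile"
echo hello > "$tmp/in.txt"

# 1. "make -n" does everything except run the recipes, so its runtime
#    is roughly make's own overhead: reading makefiles, stat()ing
#    every prerequisite, computing the dependency graph.
( cd "$tmp" && time make -n )

# 2. "make -d" logs every file make considers; counting those lines
#    gives a feel for how much dependency searching is going on.
( cd "$tmp" && make -d | grep -c 'Considering' )

# 3. On Linux, "strace -c -f make" would additionally give a
#    per-syscall time summary (watch the stat/open rows); it is left
#    out here since strace may not be installed on every build host.
```

If step 1 is already slow on the network drive, the time is going into make's stat() traffic rather than into the compiler commands themselves.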
Thanks for any hints,
Torsten.
- Re: profiling, what takes so long?, Brendan Heading, 2006/02/01
- Re: profiling, what takes so long?, Paul D. Smith, 2006/02/01
- Re: profiling, what takes so long?, Dan Kegel, 2006/02/01
- Re: profiling, what takes so long?, Johan Bezem, 2006/02/01
- Re: profiling, what takes so long?, Torsten Mohr, 2006/02/04