Re: Limiting parallel when used with recursion
From: Ole Tange
Subject: Re: Limiting parallel when used with recursion
Date: Mon, 3 Aug 2015 07:03:03 +0200
On Sun, Aug 2, 2015 at 2:36 AM, Schweiss, Chip <chip@innovates.com> wrote:
> The problem with that is that parallel will start execution on the parent
> folder before the child process is finished.
Ahh. Yes.
One solution is to find the max depth of the tree, run everything at
that depth using GNU Parallel, then do the same for depth-1, and so on
up to the root. This way very little time is wasted, because you
parallelize over the different subdirs at the same level; the only idle
time is at the end of each depth, while the last jobs of that level
finish. This should work:
# Find the maxdepth (the number of /'s in the deepest path, plus one)
MAX=$(find . | perl -ne '$a=s:/:/:g;$max=$a>$max?$a:$max;END{ print $max+1 }')
# For each depth D in MAX..1 (deepest first, one depth at a time):
#   find files/dirs at depth D and run do_stuff on them in parallel
seq $MAX -1 1 | parallel -j1 -I D 'find . -mindepth D -maxdepth D |
  parallel do_stuff {}'
/Ole