From: Pádraig Brady
Subject: Re: [PATCH] wc: line-buffer the printed counts
Date: Tue, 22 Dec 2009 10:26:19 +0000
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091204 Thunderbird/3.0
On 22/12/09 09:38, Jim Meyering wrote:
> Pádraig Brady wrote:
>> wc is essentially a digesting function, like sha1sum etc.,
>> in that it produces a one-line summary per file.
>> The attached patch ensures that those lines are output
>> atomically for concurrent wc processes.
>> Note that in general one can use `stdbuf -oL cmd` to line-buffer
>> a process which outputs to stdout, but I think this
>> should be done internally in this case.
>> ...
>> + wc now prints counts atomically so that concurrent
>> + processes will not intersperse their output.
>> + [the bug dates back to the initial implementation]
>
> Nice. Thanks!
> You might want to cast this as an improvement, rather than a bug fix.
Well, the related md5sum change was cast as a fix,
it too just fixes an issue with running in parallel,
and we're going for a bug-fix release now.
So I'm inclined to leave it as a fix.
> This made me think of doing the same for du, but I'm hesitant.
> Many tools parse a single (voluminous) stream of du output,
> and forcing that to be line-buffered sounds like it'd impose
> too much of a penalty.
I noticed that du has always called fflush() after each entry,
which also handles the issue. So I won't change it to call
setvbuf, as du also outputs NUL-terminated entries, which line
buffering would not flush.
cheers,
Pádraig.