coreutils

From: Jim Meyering
Subject: Re: My experience with using cp to copy a lot of files (432 millions, 39 TB)
Date: Mon, 11 Aug 2014 16:53:41 -0700

On Mon, Aug 11, 2014 at 6:55 AM, Rasmus Borup Hansen <address@hidden> wrote:
> Hi! I recently had to copy a lot of files, and even though I've 20 years of
> experience with various Unix variants, I was still surprised by the
> behaviour of cp, and I think my observations should be shared with the
> community.
...
> I had started cp with the "-v" option and piped its output (both stdout and
> stderr) to a tee command to capture the output in a (big!) logfile. This
> meant that the output from cp was buffered somewhere, because my logfile
> ended in the middle of a line. Wanting the buffers to be flushed so that I
> had a complete logfile, I gave cp more than a day to finish disassembling
> its hash table before giving up and killing the process.
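
(Side note on the buffering: when cp's stdout feeds a pipe instead of a
terminal, stdio switches from line buffering to full buffering, which is
why the logfile can end mid-line.  A minimal sketch of such a pipeline,
using coreutils' stdbuf to keep cp's output line-buffered -- the paths,
the log name, and any cp options beyond -v are placeholders:

  $ stdbuf -oL cp -rv /src /dst 2>&1 | tee cp.log

With -oL, each verbose line reaches tee as soon as cp prints it, so even
an interrupted run leaves a logfile that ends on a line boundary.)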

Thanks for the analysis!

For reference, in case there is a next time: rather than killing the
process, you could have attached to it with "gdb -p PID" and then run
gdb's "return" command a few times, until the process returned from
main.  It would then have terminated normally, skipping only the
heap-freeing instructions.
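
A minimal sketch of such a session (12345 stands for cp's PID, and the
backtrace shown is hypothetical):

  $ gdb -p 12345
  (gdb) bt
      ...shows cp deep in the hash-table teardown...
  (gdb) return
      (answer "y"; repeat "return" until the process has returned from main)
  (gdb) continue
      (cp then runs the normal exit path, skipping the remaining frees)

Each "return" forces the selected stack frame to return immediately,
without executing the rest of its code.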


