From: Paul Eggert
Subject: Re: better default support for parallel compression
Date: Thu, 4 Nov 2021 13:15:10 -0700
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101 Thunderbird/91.2.0
On 11/3/21 22:13, Mike Frysinger wrote:
> With the rise of commodity multicore computing, tar feels a bit
> antiquated in that it still defaults to single-threaded
> (de)compression. It feels like the defaults could and should be more
> intelligent. Has any thought been given to supporting parallel
> (de)compression by default?
Not really. Some thought would be required, I assume. For example, parallelizing decompression might hurt performance, since 'tar' is typically I/O-bound in that case. It'd be nice if someone could think this through and do some performance measurements.
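
For context, a parallel compressor can already be opted into explicitly via --use-compress-program (-I); a minimal sketch of the current workaround, assuming zstd and pigz are installed:

    # compress, letting zstd use all available cores (-T0)
    tar -I 'zstd -T0' -cf archive.tar.zst somedir/

    # same idea with pigz, a parallel gzip implementation
    tar -I pigz -cf archive.tar.gz somedir/

    # extract; GNU tar invokes the given program with -d to decompress
    tar -I zstd -xf archive.tar.zst

Whether tar could reasonably pick such a program by default is the open question here; the measurements mentioned above would show whether it actually pays off for decompression.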