Re: Does -l option of gzip work correctly for large files (about 10GB)?
From: Eric Blake
Subject: Re: Does -l option of gzip work correctly for large files (about 10GB)?
Date: Tue, 12 Jul 2011 15:06:50 -0600
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.18) Gecko/20110621 Fedora/3.1.11-1.fc14 Lightning/1.0b3pre Mnenhy/0.8.3 Thunderbird/3.1.11
On 07/12/2011 02:47 PM, Peng Yu wrote:
> Hi,
>
> $ gzip -l 28122s_2_sorted.txt.gz
> compressed uncompressed ratio uncompressed_name
> 1247648790 1285935835 3.0% 28122s_2_sorted.txt
>
> The uncompressed file size according to "gzip -l" is about 1.2GB.
This is a FAQ. The gz file format is inherently limited: its trailer
stores the uncompressed size in a 32-bit field (ISIZE, per RFC 1952),
so it cannot record sizes of 4 GiB or more. Anything larger is
silently wrapped around modulo 2^32.
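A minimal sketch (not from the original message) of where that number
comes from: ISIZE is the last four bytes of a .gz file, little-endian,
and holds the uncompressed size modulo 2^32.

```python
import gzip
import os
import struct
import tempfile

# Write a small gzip file, then read its ISIZE trailer directly.
data = b"hello world\n" * 100
fd, path = tempfile.mkstemp(suffix=".gz")
os.close(fd)
with gzip.open(path, "wb") as f:
    f.write(data)

with open(path, "rb") as f:
    f.seek(-4, os.SEEK_END)             # ISIZE is the final 4 bytes
    isize = struct.unpack("<I", f.read(4))[0]

# For a small file this matches len(data); for a >4 GiB file the
# trailer would hold len(data) % 2**32 instead of the true size.
print(isize, len(data) % 2**32)
os.unlink(path)
```

This is exactly the field that `gzip -l` reports, which is why the
listed size is correct only modulo 4 GiB.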
> However, I extracted the file. The file size is 9.2GB. I'm wondering
> if there is a bug in gzip when it deals with large files.
Only if you can call a design flaw a bug. There's no way to fix this
short of introducing an extension to the gz file format, and even then,
it would only work for implementations of gzip that understand that
extension.
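One practical workaround, assuming you already know the size to within
4 GiB from some other source (here, the "9.2GB" mentioned in the
message; the rough-estimate arithmetic is illustrative, not a gzip
feature): since the wrap is exactly modulo 2^32, add back the right
number of multiples.

```python
# Recover the true uncompressed size from gzip -l's wrapped value,
# given a rough out-of-band estimate of the real size.
reported = 1_285_935_835          # wrapped ISIZE shown by gzip -l
approx = int(9.2 * 2**30)         # rough estimate: "about 9.2 GB"

wraps = round((approx - reported) / 2**32)
true_size = reported + wraps * 2**32
print(true_size)                  # the recovered full size in bytes
```

The only fully reliable alternative is to decompress and count, e.g.
`gzip -dc file.gz | wc -c`.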
--
Eric Blake address@hidden +1-801-349-2682
Libvirt virtualization library http://libvirt.org