
Re: [Lzip-bug] Packaging lzip


From: Antonio Diaz Diaz
Subject: Re: [Lzip-bug] Packaging lzip
Date: Sat, 22 Nov 2008 04:05:16 +0100
User-agent: Mozilla/5.0 (X11; U; Linux i586; en-US; rv:1.7.11) Gecko/20050905

Hello Bram,

Bram Neijt wrote:
After reading about lzip in this month's Linux Format and seeing that it
was not yet packaged, I decided to try packaging it for the Ubuntu
Universe repository.

Thank you very much.


With lzma already packaged and other compression algorithms already in
the packaging queue (lrzip, for example), the actual package may not make
it into the Ubuntu Universe repository.

If the lzma already packaged is one that produces lzma-alone files, it would be better to replace it with lzip. See this quote from http://lists.gnu.org/archive/html/lzip-bug/2008-11/msg00003.html :

"Lzip provides a much simpler and more reliable implementation. Lzip also provides a simple but safe file format, with magic bytes and integrity checking. The stable branch of lzma-utils, the one "widely used", uses the lzma-alone file format, which lacks both. I hope everybody who cares about data safety will switch from the lzma-alone format to the lzip format as soon as they know about it."
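To illustrate the difference in practice, here is a minimal sketch that checks a file's header. The lzip part follows the documented format (the four magic bytes "LZIP" plus a version byte); the lzma-alone part is necessarily a heuristic, precisely because that format has no magic bytes to verify:

```python
def identify(path):
    """Guess whether a file is lzip or lzma-alone from its header."""
    with open(path, "rb") as f:
        header = f.read(13)
    # Lzip members start with the magic string "LZIP" followed by a
    # one-byte version number; this is a reliable check.
    if header[:4] == b"LZIP":
        return "lzip (version %d)" % header[4]
    # Lzma-alone has no magic. Its header is a properties byte (packed
    # lc/lp/pb, which must be < 9*5*5 = 225), a 4-byte dictionary size
    # and an 8-byte uncompressed size -- so this test can only say
    # "possibly", never "certainly".
    if len(header) == 13 and header[0] < 225:
        return "possibly lzma-alone (no magic bytes to verify)"
    return "unknown"
```

Note that the lzma-alone branch will also accept many files that are not lzma-alone at all; that ambiguity is exactly the weakness of a format without magic bytes.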


In light of this LZMA already being in the repositories, it might be
good to create a table which compares lzip to the other compression
algorithms. lrzip, for instance, shows off better compression than LZMA,
but its speed is not mentioned.

Lrzip does have significant differences from gzip/bzip2/lzip:
- It is optimised for large files, especially those larger than 100 MB.
- It only works on single files.
- It is not really usable for compression with less than 256 MB of RAM.
- It does not work on stdin/stdout.
- It lacks integrity checking (and it would be quite hard to implement).


A table containing the compression ratio, compression speed and
decompression speed might be good to post on the lzip website.

There are specialized benchmark sites like http://www.cs.fit.edu/~mmahoney/compression/text.html or http://www.maximumcompression.com/index.html with much more and better data than what I can test myself.


Best regards,
Antonio.



