From: Andrew Ferguson
Subject: Re: [rdiff-backup-users] Failing backup: Long int too large to convert to int
Date: Thu, 2 Apr 2009 08:02:27 -0400
On Apr 2, 2009, at 4:24 AM, Thomas Jarosch wrote:
  File "/usr/lib/python2.2/gzip.py", line 253, in close
    write32(self.fileobj, self.size)
  File "/usr/lib/python2.2/gzip.py", line 19, in write32
    output.write(struct.pack("<l", value))
OverflowError: long int too large to convert to int
---------------------------------------------------------------------
The box is running CentOS 3 using Python 2.2.3 and librsync 0.9.7+patch.
Any idea what could be wrong? Or some way to log the filename of the troublesome entry?
Hi Thomas,

I'm quite positive this is a bug in the gzip support shipped with Python 2.2.x, which was fixed in Python 2.3.x. That version of the gzip interface could not handle files > 2GB.
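For anyone curious about the mechanics, here is a minimal sketch (not rdiff-backup or gzip code, runnable under a modern Python) of the failure mode: the old write32 helper packs the uncompressed size as a signed 32-bit integer, which tops out just under 2GB. The exact exception name differs between Python versions (OverflowError in 2.2, struct.error later).

# Sketch of the limit that Python 2.2's gzip.write32 ran into.
import struct

size_ok = 2**31 - 1        # largest value a signed 32-bit "<l" can hold
size_too_big = 3 * 2**30   # e.g. a hypothetical 3GB file

struct.pack("<l", size_ok)           # fine
try:
    struct.pack("<l", size_too_big)  # what gzip.write32 attempted
except (struct.error, OverflowError) as exc:
    print("pack failed: %s" % exc)

# Later Pythons write the size as an unsigned value modulo 2**32,
# which is what the gzip format specifies anyway:
struct.pack("<L", size_too_big & 0xFFFFFFFF)  # succeeds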
So, the best solution is to upgrade Python, at least to 2.3.x. If that is not feasible, you can use --no-compression to disable compression on your increment files, which avoids this bug. Or, you can use an --exclude rule so that files > 2GB are not backed up.
If you want to confirm that this is the case, run rdiff-backup with the -v5 option, which prints the name of each file as it is processed. The last file reported before this error should be > 2GB.
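If it helps, here is a rough helper (the source path is a placeholder; adjust it for your setup) that walks the backup source and lists the files > 2GB, i.e. the candidates that -v5 would point at and that an --exclude rule would need to cover:

# List files at or above the signed 32-bit boundary (2GB).
import os

LIMIT = 2**31

def files_over_limit(root):
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # vanished or unreadable file; skip it
            if size >= LIMIT:
                yield path, size

if __name__ == "__main__":
    for path, size in files_over_limit("/path/to/backup/source"):
        print("%s (%d bytes)" % (path, size))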
Andrew