[Duplicity-talk] "Result too large" error on running backup
From: Adam Mercer
Subject: [Duplicity-talk] "Result too large" error on running backup
Date: Tue, 22 May 2007 00:40:59 -0400
Hi
I'm trying to back up my home directory, and I receive the following
error when running a backup:
address@hidden ram]$ duplicity --no-encryption /Users/ram/ file:///Volumes/Backup/Home
No signatures found, switching to full backup.
Traceback (most recent call last):
  File "/opt/local/bin/duplicity", line 373, in ?
    if __name__ == "__main__": main()
  File "/opt/local/bin/duplicity", line 366, in main
    if not sig_chain: full_backup(col_stats)
  File "/opt/local/bin/duplicity", line 142, in full_backup
    bytes_written = write_multivol("full", tarblock_iter, globals.backend)
  File "/opt/local/bin/duplicity", line 79, in write_multivol
    else: at_end = gpg.GzipWriteFile(tarblock_iter, tdp.name)
  File "/opt/local/lib/python2.4/site-packages/duplicity/gpg.py", line 254, in GzipWriteFile
    try: new_block = block_iter.next(bytes_to_go)
  File "/opt/local/lib/python2.4/site-packages/duplicity/diffdir.py", line 407, in next
    result = self.process(self.input_iter.next(), size)
  File "/opt/local/lib/python2.4/site-packages/duplicity/diffdir.py", line 487, in process
    data, last_block = self.get_data_block(fp, size - 512)
  File "/opt/local/lib/python2.4/site-packages/duplicity/diffdir.py", line 507, in get_data_block
    buf = fp.read(max_size)
  File "/opt/local/lib/python2.4/site-packages/duplicity/diffdir.py", line 338, in read
    buf = self.infile.read(length)
IOError: [Errno 34] Result too large
Any idea what's causing this, and how I can fix it?
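For reference, errno 34 is ERANGE ("Result too large") on both Linux and Mac OS X, so the failure is coming up from the underlying read() call rather than from duplicity itself. A quick, duplicity-independent way to check what errno 34 means on a given system, using only the standard library:

```python
import errno
import os

# Map the numeric errno from the traceback to its symbolic name.
name = errno.errorcode[34]
print(name)  # ERANGE

# Platform-specific human-readable message for the same errno;
# on Mac OS X this is the "Result too large" string from the traceback.
print(os.strerror(errno.ERANGE))
```

This only identifies the error; it does not explain why the file read returned ERANGE in this particular backup.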
Cheers
Adam