
Re: [Duplicity-talk] Large backups to S3


From: j_duplicity
Subject: Re: [Duplicity-talk] Large backups to S3
Date: Sat, 03 Apr 2010 09:15:41 -0700
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.5) Gecko/20100119 Thunderbird/3.0

Peter,

Have I completely missed the bus?  Are buckets no longer limited to 5GB?

-J

--
On 04/03/10 09:13, Peter Schuller wrote:
> How does duplicity handle S3 bucket overflows?  I have several hundred GB
> of data in a large directory tree that I would like to start backing up to
> S3.  I'm really loath to attempt to partition this data into buckets by
> hand.  I assume that duplicity can do some sort of magic for me?
What "overflow" are you concerned with? Unless you're backing up
petabytes of data (or whatever the limit would be) you can effectively
consider your S3 bucket to be infinite in size.

If you're backing up hundreds of gigs, I suspect the concerns you'll have
include:

* Cost. You'll have to make periodic full backups, since you don't want an
ever-increasing chain of incremental backups (a sketch of such a schedule
is below).
* Time to completion. If there is no parallelization, backing up hundreds
of GB of data will likely take significant time.
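
For illustration, a minimal sketch of the kind of schedule described above.
The source path, bucket name, prefix, and intervals are placeholders; the
options shown (--full-if-older-than and the remove-all-but-n-full command)
are from the duplicity man page, and if your version lacks
--full-if-older-than, an explicit "duplicity full" run on a cron schedule
does the same job:

  # Incremental against the current chain, but start a fresh full backup
  # once the most recent full is more than a month old.
  duplicity --full-if-older-than 1M /data s3+http://example-bucket/backups

  # Cap storage cost by pruning everything but the two newest full chains
  # (--force makes it actually delete rather than just list).
  duplicity remove-all-but-n-full 2 --force s3+http://example-bucket/backups

Credentials come from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
environment variables, and s3+http://bucket/prefix is the S3 backend URL
form the duplicity documentation describes.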
