Re: [Help-gnunet] Error uploading file


From: Christian Grothoff
Subject: Re: [Help-gnunet] Error uploading file
Date: Sat, 17 Dec 2005 11:38:11 -0800
User-agent: KMail/1.8.3

On Saturday 17 December 2005 09:38 am, David Kuehling wrote:
> Hi,
>
> >>>>> "Christian" == Christian Grothoff <address@hidden> writes:
> >
> > See, not the issue.  Still I would think performance will likely be
> > not-so-great with such a huge DB (indexing is good for performance!).
>
> Since gnunet data blocks are accessed in a random order, I thought that
> there would not be much difference between accessing random blocks from
> indexed files and accessing random blocks in a database...

Well, in my experience it makes a difference.  I think it is simply this:
database accesses are generally more costly than file accesses.  If you can
keep the database smaller (by keeping some data externally in files), more of
the database's (important) indexing information can be cached in memory, and
thus the overall performance goes up.

> > I'm not sure I understand this.  The *memory* used by gnunet?  You'll
> > just use much more disk space (making a copy of the files in the DB)
> > instead of linking to existing content.  This will cost you DB access
> > performance...
>
> But I don't want to keep the files I publish on my "server" computer.  I
> just upload them into the database and remove them afterwards. 

Then insertion is the right choice.
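
(For reference, a full insertion is requested with the "do not index" option
of gnunet-insert -- check gnunet-insert --help for the exact flag on your
version.  Roughly:

  # full insertion: the file's contents are copied into the database,
  # so the original file can be removed afterwards
  gnunet-insert -n mydata.tar.gz

  # indexing (the default): only references are stored in the database,
  # the file itself has to stay available on disk
  gnunet-insert mydata.tar.gz

The file name is of course just an example.)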

> BTW the 
> gnunet-insert manpage is somewhat unclear about what indexing means; it
> reads:
>     Since 0.6.2 GNUnet will make a copy of the file in the directory
>     specified in gnunet.conf.
>
> Which sounds like the files will _always_ be copied, which seems like a
> bad idea, since it would then be quite difficult to keep track of the
> amount of storage used by gnunet's indexed content...

Right, that's not quite correct.  We use symlinks whenever possible.  I've
clarified the man page; thanks for pointing this out.
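
(If you want to see what indexing did on disk, you can list the index
directory configured in gnunet.conf and look for symlinks pointing back at
the original files, e.g.:

  # path is just an example; use whatever index directory your
  # gnunet.conf specifies
  find /var/lib/GNUnet/data/shared -type l -ls

A copy is only made when a symlink cannot be created.)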

> > can you try applying the following patch:
>
> [..]
>
> I applied your patch, and this is the result:
>
> Dec 17 17:54:53 WARNING: Datastore full (2149286829/2147483648) and
>   content priority too low to kick out other content.  Refusing put.
>
> You are right, GNUnet assumes that the database is full.  It would be neat
> if gnunet-insert displayed that message...  But why is my database
> limited to only 2 GB?  What can I do to boost it to 10 GB?

Interesting.  I had thought the bug was that the size estimate (the first
number) would be 10 GB -- not that the quota (the second number) would be 2 GB.
Can you just double-check your config file?  There are TWO quotas in there:
QUOTA and INDEX-QUOTA.  Which one did you set to 10 GB?  I checked the code
again, and I cannot see anything that would make this value 2 GB if the config
file is right.
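
(For comparison: 2147483648 bytes is exactly 2048 MB, which is what a QUOTA of
2048 would give if the value is taken in MB; 10 GB would correspond to 10240.
The relevant part of the config should look something like the sketch below --
the section name and units here are from memory, so double-check against your
file:

  [FS]
  # total datastore quota, here 10 GB given in MB
  QUOTA = 10240
  # separate quota for the on-disk copies of indexed files, in MB
  INDEX-QUOTA = 10240
)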

> BTW one of the reasons for doing full inserts for all files was that
> this way GNUnet would be able to make smart decisions about what content
> to drop when the 10 GB limit is reached.  It seems that the natural fading
> of priorities is quite slow for my node (and for the limited 2 GB
> database size)...

Well, the natural fading was intentionally set to be slow -- we don't want
people to insert a file today and find out tomorrow that it has been replaced
by random garbage from the network (which, despite having zero priority, may
still rank equal to fully "faded" content) just because the file's priority
faded too quickly.

Christian



