
[Help-gnunet] files > 2 gb


From: Christian Drechsler
Subject: [Help-gnunet] files > 2 gb
Date: Sun, 20 Oct 2002 16:58:13 +0200 (CEST)

Hi there!

Here is a quick and easy solution for those on RPM-based systems who want to
recompile their packages with large file support (LFS):

- find the lib directory of your rpm installation, normally /usr/lib/rpm/

- look at the rpmrc file you find there. Don't edit this file! It will
  be overwritten the next time you install a new version of rpm.

- find the line beginning with
  optflags: i386
  (hint: rpm normally uses that architecture to keep the packages
  portable. If you want to compile directly for athlon, for example, you
  will also have to set "buildarchtranslate: athlon: athlon" in /etc/rpmrc
  and use the corresponding optflags: line from /usr/lib/rpm/rpmrc.)

- create (or edit) the file /etc/rpmrc

- copy the "optflags: i386" line to that file and append "-D_LARGEFILE_SOURCE
  -D_FILE_OFFSET_BITS=64" to it

- get the source rpm of the package you want to recompile

- issue "rpm --rebuild <package>.src.rpm"

- if everything works as it should, you will find the finished rpm in
  /usr/src/redhat/RPMS/i386/ (or athlon/, or whatever other architecture
  you chose according to the hint above).

- now install it with "rpm -U --force <package>.rpm". The --force
  parameter is necessary if the same version number is already installed;
  rpm would otherwise refuse to overwrite the old files.

This works out of the box here. But if I try to start gnunetd now with my
old large database, it stops with:

Oct 20 15:35:09 getDatabase: failed to open database file
/home/zottel//.gnunet/data/content.gdb with error: File seek error

If I use the version without LFS, gnunetd first starts without problems
but crashes a little later. Why is that? The only explanation I can think
of: within the file there are indices pointing beyond the 2 GB line that
wrapped around to small values in the old 32-bit integers, and since the
file never actually grew past 2 GB, they now point beyond the end of the
file. As compilation and everything else worked without the slightest
problem, I'm quite confident the database can now grow beyond 2 GB. I
probably won't be able to repair the old one, but what the heck ... ;-)
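The wrap-around guess can be illustrated with a bit of arithmetic: an offset
just past 2 GB, stored in a signed 32-bit integer, comes back as a negative
value, which would plausibly produce a "File seek error". The numbers below
are made up for the illustration, not taken from the database:

```shell
#!/bin/sh
# Illustrate the 32-bit wrap-around theory: a file offset just past
# 2 GB does not fit in a signed 32-bit off_t.
offset=2147483650          # 2 GB + 2 bytes
limit=2147483647           # largest signed 32-bit value (2^31 - 1)

# Truncate to 32 bits and re-interpret as signed, the way a
# 32-bit off_t would store it:
wrapped=$(( offset & 4294967295 ))
if [ "$wrapped" -gt "$limit" ]; then
  wrapped=$(( wrapped - 4294967296 ))
fi
echo "$wrapped"            # prints -2147483646
```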

comments?

regards, zottel




