
Re: Importing large amounts of data


From: Francesco Potortì
Subject: Re: Importing large amounts of data
Date: Mon, 18 Jun 2012 11:33:07 +0200

>Right now I am trying to do this with a 150x150x1000 int array. This
>array has a small memory footprint in C++ and the file being pushed from
>the C++ program to the Octave script is around 65 MB.

Those are 22.5e6 elements.  If you are using a binary representation
with 4-byte integers, you should have a 90-MB file.  If you use 16-bit
integers, half that size.  A 65-MB file, if there are no errors,
indicates that you are using a text representation, which is good and
easy to debug in your case, but may become slow if you plan to use much
bigger arrays.
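
For reference, the back-of-the-envelope numbers in Octave (assuming raw
elements, with no headers or separators):

  n = 150*150*1000;     % 22.5e6 elements
  n * 4 / 1e6           % ~90 MB with 4-byte integers
  n * 2 / 1e6           % ~45 MB with 16-bit integers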

>                                                     When reading this
>into Octave, it already consumes 8 GB of RAM, which is quite a surprise,

Octave uses 8-byte floats by default, but it can read and write 1, 2, 4
and 8-byte integers.  Even when using the default, your array should
consume around 180 MB.  If you see 8 GB, something is going wrong.
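
If the C++ side writes raw 4-byte integers, a minimal reading sketch
(the file name, element type and byte order are my assumptions) would
be:

  fid = fopen("data.bin", "rb");                 % hypothetical file name
  A = fread(fid, 150*150*1000, "int32=>int32");  % "=>" keeps int32, avoids doubles
  fclose(fid);
  A = reshape(A, 150, 150, 1000);  % note: C++ is row-major, Octave is
                                   % column-major, so you may need a
                                   % permute() afterwards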

>but not the main problem (I have memory to spare right now). However the
>reshaping has been going on for two days now on a multi-CPU Xeon server.

This too is strange.  It should take a few seconds at most: reshape
only changes the array's dimensions, it does not move the data around.
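
As a rough check, a sketch with dummy data standing in for your import:

  v = int32(1:150*150*1000);                 % dummy vector, ~90 MB as int32
  tic; A = reshape(v, 150, 150, 1000); toc   % should report a small
                                             % fraction of a second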

>Whats going wrong? How should I approach this to get it done?

Tell us exactly what format you are using to write the file (an
example with a small array will suffice) and exactly what commands you
use to read it in.  For example, try with a 2x2x3 array first.
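
For instance, here is a minimal round trip entirely within Octave (the
file name is just an example), to compare against what your C++ program
produces:

  A = reshape(int32(1:12), 2, 2, 3);  % small test array
  save("-binary", "test.dat", "A");   % or "-text" for the ASCII format
  clear A
  load("test.dat")                    % restores A, still int32
  size(A)                             % ans = 2 2 3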

-- 
Francesco Potortì (ricercatore)        Voice:  +39.050.315.3058 (op.2111)
ISTI - Area della ricerca CNR          Mobile: +39.348.8283.107
via G. Moruzzi 1, I-56124 Pisa         Fax:    +39.050.315.2040  
(entrance 20, 1st floor, room C71)     Web:    http://fly.isti.cnr.it

