
Re: Limiting memory used by parallel?


From: Hubert Kowalski
Subject: Re: Limiting memory used by parallel?
Date: Fri, 26 Jan 2018 14:58:32 +0100 (CET)

Is this in any way similar to https://savannah.gnu.org/bugs/index.php?51261 or https://lists.gnu.org/archive/html/parallel/2017-06/msg00009.html ?

> On 26 January 2018 at 10:34, hubert depesz lubaczewski <depesz@depesz.com> wrote:
>
>
> On Thu, Jan 25, 2018 at 11:26:24AM -0500, Joe Sapp wrote:
> > Hi depesz,
> >
> > On Thu, Jan 25, 2018 at 10:33 AM, hubert depesz lubaczewski
> > <depesz@depesz.com> wrote:
> > [snip]
> > > But it looks like parallel itself is consuming a HUGE amount of memory
> > > - comparable to the size of /some/directory itself.
> > >
> > > The server I run it on has 64 GB of RAM, and the script gets killed
> > > after ~3 minutes with an "Out of memory!" error.
> >
> > Have you tried using --memfree? That will keep parallel from
> > spawning new processes when free memory runs low.
>
> OK, I read the docs, and I don't get it.
>
> Why would I want to keep parallel from spawning new processes?
>
> The problem, in my opinion, is that parallel is using memory for some
> kind of buffer when splitting the data, instead of operating on some
> small block size, like 8 kB or whatever, to copy data from input to one
> of the outputs.
>
> The final scripts do not use a lot of RAM. It's just parallel that is
> using over 10 GB of RAM to split the data and put it into multiple pipes.
> This doesn't look OK to me - isn't it a bug?
>
> Best regards,
>
> depesz
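For what it's worth, --memfree takes a size with the usual K/M/G suffixes;
as I read the docs, parallel will not start another job while less than
that much memory is free, and will kill and requeue the youngest job if
free memory drops below half the given size. Something along these lines
(the 1G threshold, the input file, and the worker command are only
placeholders):

    cat input.txt | parallel --pipe --memfree 1G ./process-chunk.sh

That throttles how many jobs run at once, but it does not change how
parallel itself buffers the data it is splitting, which seems to be the
actual complaint here.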

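And to make the proposed alternative concrete, here is a rough sketch (in
Python, purely illustrative - not how parallel's Perl internals actually
work) of the constant-memory copy loop described above: read a small
fixed-size block from the input and hand it to one of the output pipes, so
memory use is bounded by the block size instead of the input size. The
worker commands are made up, and a real splitter would also have to
respect line/record boundaries, which this toy version ignores:

    import subprocess
    import sys

    BLOCK = 8192  # small fixed buffer, per the "8 kB or whatever" idea above

    # hypothetical worker commands standing in for the final scripts
    workers = [subprocess.Popen(cmd, stdin=subprocess.PIPE)
               for cmd in (["./worker-a.sh"], ["./worker-b.sh"])]

    i = 0
    while True:
        chunk = sys.stdin.buffer.read(BLOCK)
        if not chunk:
            break
        # round-robin: each block goes to exactly one pipe, so nothing
        # accumulates in this process beyond BLOCK bytes at a time
        workers[i % len(workers)].stdin.write(chunk)
        i += 1

    for w in workers:
        w.stdin.close()
        w.wait()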