Re: [Bug-wget] Feature: Concurrency in recursive downloads


From: Anthony Bryan
Subject: Re: [Bug-wget] Feature: Concurrency in recursive downloads
Date: Mon, 3 Aug 2009 23:43:59 -0400

On Mon, Aug 3, 2009 at 7:05 PM, <address@hidden> wrote:
> I've been using wget to recursively download parts of a web site, and would
> find it very useful if wget allowed for concurrent downloads (up to some
> maximum), so that the "queued" URLs could be fetched by a pool of
> downloaders. I didn't see any discussion of this in the list archives or
> even on Google in general. I'm curious whether this is something that has
> been considered, since it seems very useful to me in speeding up downloads.
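
[For illustration only: a minimal Python sketch of the pooled-downloader
idea described above. The URL list and the worker count of 4 are
hypothetical, and this is not how wget works internally; wget fetches
URLs sequentially over a single connection.]

  import concurrent.futures
  import urllib.request

  # Hypothetical queue of URLs; in a real recursive crawl these would
  # come from links parsed out of already-downloaded pages.
  urls = [
      "http://example.com/a.html",
      "http://example.com/b.html",
  ]

  def fetch(url):
      # Download one URL and return its size; errors are reported,
      # not fatal to the rest of the pool.
      try:
          with urllib.request.urlopen(url, timeout=30) as resp:
              data = resp.read()
          return url, len(data)
      except OSError as exc:
          return url, exc

  # Pool of at most 4 concurrent downloaders draining the queue,
  # matching the "up to some maximum" idea in the request above.
  with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
      for url, result in pool.map(fetch, urls):
          print(url, result)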

Have you looked at mulk? It might already have the features you're looking for.

http://mulk.sourceforge.net/

"Multi-connection command line tool for downloading Internet sites
with image filtering and Metalink support. Similar to wget and cURL,
but it manages up to 50 simultaneous and parallel links. Main features
are: HTML code parsing, recursive fetching, Metalink retrieving,
segmented download and image filtering by width and height."

-- 
(( Anthony Bryan ... Metalink [ http://www.metalinker.org ]
  )) Easier, More Reliable, Self Healing Downloads



