bug-gnu-utils

wget 1.8: get rid of awkward single file quota exception


From: Dan Jacobson
Subject: wget 1.8: get rid of awkward single file quota exception
Date: Mon, 17 Dec 2001 07:10:06 +0800
User-agent: Gnus/5.090004 (Oort Gnus v0.04) Emacs/20.7 (i386-mandrake-linux-gnu)

in wget 1.8 info:

`-Q QUOTA'
`--quota=QUOTA'
     Specify download quota for automatic retrievals.  The value can be
     specified in bytes (default), kilobytes (with `k' suffix), or
     megabytes (with `m' suffix).

     Note that quota will never affect downloading a single file.  So

[that's odd, why not do the same for all?]

     if you specify `wget -Q10k ftp://wuarchive.wustl.edu/ls-lR.gz',
     all of the `ls-lR.gz' will be downloaded.  The same goes even when
     several URLs are specified on the command-line.  However, quota is
     respected when retrieving either recursively, or from an input
     file.  Thus you may safely type `wget -Q2m -i sites'--download
     will be aborted when the quota is exceeded.

[well, it still doesn't work, at least for a single file. I thought it
was describing a universal workaround ... oh, it says 'never' above.

I don't understand why you wouldn't want to respect a -Q in certain
cases; I mean, isn't the user telling you his wishes?]
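For reference, the suffix handling the manual describes (plain bytes, `k`, `m`) amounts to something like the following hypothetical helper; this is just a sketch of the documented rule, not wget's actual code:

```python
def parse_quota(spec):
    """Parse a wget-style quota spec: plain bytes (default),
    'k' for kilobytes, 'm' for megabytes, as the info page says.

    Hypothetical illustration only, not wget's implementation.
    """
    spec = spec.strip().lower()
    if spec.endswith("k"):
        return int(spec[:-1]) * 1024
    if spec.endswith("m"):
        return int(spec[:-1]) * 1024 * 1024
    return int(spec)

# So -Q10k means a 10,240-byte quota, and -Q2m means 2,097,152 bytes.
```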

jidanni$ echo http://op.gfz-potsdam.de/GMT-Help/Archive/threads.html|wget -Q10k -i - -O -|wc
--06:50:02--  http://op.gfz-potsdam.de/GMT-Help/Archive/threads.html
           => `-'
Resolving localhost... done.
Connecting to localhost[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 1,129,241 [text/html]

100%[=====================================================>] 1,129,241    
608.93K/s    ETA 00:00

06:50:26 (608.93 KB/s) - `-' saved [1129241/1129241]


FINISHED --06:50:26--
Downloaded: 1,129,241 bytes in 1 files
Download quota (10,240 bytes) EXCEEDED!
  24942   80365 1129241

-------------------------------------------
Now doing
$ echo SAME_URL \\n SAME_URL|wget ....
one sees that the quota is only checked before downloading the second
[same] file; apparently, no matter how big the first file is, it all
comes through. If all this is true, then you have a fundamental
algorithm flaw, and also a good opportunity to remove the special
exception listed in the info page.
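The behavior the transcript suggests can be sketched as follows: the quota is only tested *between* files, never during one, so the file currently being fetched always completes in full. This is a guess at the observed logic (with a stand-in `fetch` callable), not wget's source:

```python
def download_all(urls, quota, fetch):
    """Sketch of quota checking done only between files.

    `fetch(url)` is a hypothetical stand-in that downloads one URL and
    returns the number of bytes transferred. Because the quota test
    happens only before starting the next file, a single oversized file
    slips through whole -- matching the 1,129,241-byte download above
    under a 10,240-byte quota.
    """
    downloaded = 0
    fetched = []
    for url in urls:
        if downloaded >= quota:      # checked only here, before each file...
            print("Download quota (%d bytes) EXCEEDED!" % quota)
            break
        downloaded += fetch(url)     # ...never mid-transfer, so this file
        fetched.append(url)          # always completes regardless of size
    return fetched, downloaded
```

With two copies of the same 1,129,241-byte URL and a 10k quota, this logic fetches the first copy entirely and refuses only the second, which is exactly what the `echo SAME_URL \\n SAME_URL` experiment shows.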
-- 
http://www.geocities.com/jidanni/ Tel+886-4-25854780


