Re: [Bug-wget] How to download all the links on a webpage which are in some directory?


From: Peng Yu
Subject: Re: [Bug-wget] How to download all the links on a webpage which are in some directory?
Date: Mon, 1 Aug 2011 10:11:01 -0500

On Mon, Aug 1, 2011 at 8:36 AM, Giuseppe Scrivano <address@hidden> wrote:
> Peng Yu <address@hidden> writes:
>
>> Suppose I want to download www.xxx.org/somefile/aaa.sfx and the links
>> therein (but restricted to the directory www.xxx.org/somefile/aaa/)
>>
>> I tried the option '--mirror -I /somefile/aaa', but it only downloads
>> www.xxx.org/somefile/aaa.sfx. What is the correct option
>> to do this?
>
> It looks like the right command.  Can you check with "-d" what is going
> wrong?

I get this in the log. It seems that wget wants to download the file,
but it doesn't actually do so. Is that the case?

TO_COMPLETE: <something> to http://www.xxx.org/smefile/aaa/xxx.yyy ...
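
For reference, a minimal sketch of the kind of invocation discussed in this
thread (the host and paths are the placeholders from the messages above;
--include-directories is the long form of -I, and -d with -o captures the
debug output to a file):

    # mirror aaa.sfx and follow its links, but only descend into /somefile/aaa/
    wget --mirror \
         --include-directories=/somefile/aaa \
         -d -o wget-debug.log \
         http://www.xxx.org/somefile/aaa.sfx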

-- 
Regards,
Peng


