
Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt


From: Thérèse Godefroy
Subject: Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt
Date: Wed, 10 May 2023 20:32:33 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Thunderbird/102.10.0

On 10/05/2023 at 19:38, Alfred M. Szmidt wrote:
    Because it registers every single commit to www,

What is "it"?  How is this different from _any_ -commits list we have?

    including to working directories that webmasters have disallowed,
    for instance */po/, /server/staging/, */workshop/, /prep/gnumaint/,
    etc.

Ok, and?

    Please see https://www.gnu.org/robots.txt.

And?

You've not explained the actual problem.  What are you trying to
solve?

"it" is the www-commits list, which registers all changes to the www
directory, including to pages that are not published yet. I suspect most
of the other *-commits lists deal with source code repositories, which
are public anyway.

If crawlers can read the www-commits archive, they see the content of
every change, including changes to directories that robots.txt disallows.
That defeats the purpose of robots.txt: what was supposed to stay
unpublished is effectively published.
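The mechanism at issue can be illustrated with Python's standard `urllib.robotparser`: a compliant crawler checks robots.txt before fetching a URL, but a mailing-list archive that republishes the diffs sits entirely outside those rules. This is a minimal sketch; the `Disallow` entries are abbreviated from the directories named in this thread, not a copy of the actual https://www.gnu.org/robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules only, based on directories mentioned in the thread;
# the real https://www.gnu.org/robots.txt may differ.
ROBOTS_TXT = """\
User-agent: *
Disallow: /server/staging/
Disallow: /prep/gnumaint/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant crawler skips disallowed paths on www itself...
print(rp.can_fetch("*", "https://www.gnu.org/server/staging/draft.html"))  # False
print(rp.can_fetch("*", "https://www.gnu.org/philosophy/free-sw.html"))    # True
# ...but those rules say nothing about a commits-list archive that
# republishes the same content under a different URL, which is the
# concern raised here.
```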


