
Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt


From: Corwin Brust
Subject: Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt
Date: Wed, 10 May 2023 16:54:55 -0500

On Wed, May 10, 2023 at 2:04 PM Thérèse Godefroy <godef.th@free.fr> wrote:
>
> On 10/05/2023 at 20:50, Alfred M. Szmidt wrote:
> >
> > The purpose of robots.txt is to avoid overloading a web site, it is
> > not to disallow access to pages.
> >
> > This still does not explain what the problem is -- "don't let crawlers
> > crawl" doesn't explain it.  What are you trying to solve?  That
> > unpublished articles are not published before they are finished?
>
> Yes, basically. The purpose of the staging area is to work on articles
> that are not ready yet.
>

FWIW, I think I agree with Alfred here.  This appears to be a request
to disable a normal "transparency" feature -- one that lets all
contributors, and anyone else, see what is going on in a public
repository -- in the name of, effectively, security through
obscurity.[1]  Based on what I understand so far, I don't think we
should support this from a Savannah perspective.  Would it make more
sense to keep in-progress work in a private repository?
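
(For context, a minimal sketch of the kind of robots.txt entry being
requested might look like the one below; the path is illustrative, not
the actual one on the server.  Note that robots.txt is purely
advisory: well-behaved crawlers honor it voluntarily, and it does not
restrict access for anyone who fetches the pages directly.)

    # Hypothetical entry -- the real archive path may differ.
    User-agent: *
    Disallow: /archive/html/www-commits/
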

In any event, I believe that if we do come to agree a change should be
made, that change would be made in Savannah itself; I've therefore
removed the FSF sysadmin team from CC.

[1] https://en.wikipedia.org/wiki/Security_through_obscurity


