From: GNU bug Tracking System
Subject: bug#52338: closed (Crawler bots are downloading substitutes)
Date: Sun, 19 Dec 2021 16:54:02 +0000
Your message dated Sun, 19 Dec 2021 17:53:27 +0100, with message-id <87wnk0pmd4.fsf@gnu.org> and subject line "Re: bug#52338: Crawler bots are downloading substitutes", has caused the debbugs.gnu.org bug report #52338, regarding "Crawler bots are downloading substitutes", to be marked as done. (If you believe you have received this mail in error, please contact help-debbugs@gnu.org.)

52338: http://debbugs.gnu.org/cgi/bugreport.cgi?bug=52338

GNU Bug Tracking System
Contact help-debbugs@gnu.org with problems
--- Begin Message ---
Subject: Crawler bots are downloading substitutes
Date: Mon, 6 Dec 2021 16:20:55 -0500

I noticed that some bots are downloading substitutes from ci.guix.gnu.org. We should add a robots.txt file to reduce this waste. Specifically, I see bots from Bing and Semrush:

https://www.bing.com/bingbot.htm
https://www.semrush.com/bot.html
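[A minimal robots.txt along the lines proposed in the report might look as follows. This is a sketch, not the file actually deployed on ci.guix.gnu.org; the blanket disallow assumes that nothing on a substitute server is useful to search engines.]

```
# Hypothetical robots.txt for a substitute server such as ci.guix.gnu.org.
# Ask all well-behaved crawlers (bingbot, SemrushBot, etc.) to skip the site
# entirely, since substitute archives are binary data of no value to indexing.
User-agent: *
Disallow: /
```

[Per the Robots Exclusion Protocol, the file must be served at the site root, e.g. https://ci.guix.gnu.org/robots.txt, and compliance is voluntary on the crawler's part.]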
--- End Message ---
--- Begin Message ---
Subject: Re: bug#52338: Crawler bots are downloading substitutes
Date: Sun, 19 Dec 2021 17:53:27 +0100
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/27.2 (gnu/linux)

> Thanks to both of you,

And closing!

Mathieu
--- End Message ---