[Tracker-discuss] PSF tracker causing heavy load

anatoly techtonik techtonik at gmail.com
Thu May 13 15:24:05 CEST 2010


http://bugs.python.org/robots.txt
User-agent: *
Crawl-delay: 20

Cool. Didn't know that was possible.
-- 
anatoly t.



On Wed, May 12, 2010 at 9:45 PM, R. David Murray <rdmurray at bitdance.com> wrote:
> On Wed, 12 May 2010 21:23:02 +0300, anatoly techtonik <techtonik at gmail.com> wrote:
>>On Wed, May 12, 2010 at 8:32 PM, "Martin v. Löwis" <martin at v.loewis.de> wrote:
>>>> I have noticed that the roundup tracker is causing heavy load on the psf
>>>> server.
>>>
>>> You do have access also, don't you? If not, you should (IMO).
>>>
>>> In any case, I put a robots exclusion file on the trackers.
>>
>>Are you sure tracker issues should not be searchable by Google?
>>Searching for specific stacktraces usually gives the best results in
>>bug reports.
>
> The new robots.txt file just tells the robots to slow down, not to stop.
>
> --
> R. David Murray                                      www.bitdance.com
>
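(For reference: a Crawl-delay directive like the one quoted above only asks
well-behaved crawlers to pause between requests; it does not exclude any page
from indexing, so tracker issues stay searchable. A minimal sketch of how a
polite crawler might honour it, using Python 3's urllib.robotparser; the
robots.txt URL and delay come from the snippet above, the issue URL is only
illustrative:)

import time
import urllib.robotparser

# Fetch and parse the tracker's robots.txt (contents quoted above).
rp = urllib.robotparser.RobotFileParser("http://bugs.python.org/robots.txt")
rp.read()

delay = rp.crawl_delay("*")  # -> 20, per the Crawl-delay line above
page = "http://bugs.python.org/issue1"  # illustrative issue URL

if rp.can_fetch("*", page):
    # Nothing is disallowed, so the page remains indexable; a polite
    # crawler simply waits Crawl-delay seconds between requests.
    time.sleep(delay or 0)
    # ... fetch the page here ...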

