[issue21469] Hazards in robots.txt parser

Raymond Hettinger report at bugs.python.org
Sun May 11 20:21:45 CEST 2014


Raymond Hettinger added the comment:

Attaching a draft patch:

* Repair the broken link to norobots-rfc.txt.

* HTTP response codes >= 500 are treated as a failed read rather than as a "not found".  "Not found" means that we can assume the entire site is allowed; a 5xx server error tells us nothing.

* A successful read() updates the mtime (which is defined to be "the time the robots.txt file was last fetched").

* The can_fetch() method returns False unless we've had a read() with a 2xx or 4xx response.  This avoids false positives in the case where a user calls can_fetch() before calling read().
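Taken together, the behaviors above amount to a small state machine: 4xx means site-wide allow, 401/403 means site-wide deny, 5xx leaves the parser in its initial fail-closed state, and only a conclusive read lets rule matching proceed. A rough sketch of that logic (the class name, the toy rule parsing, and the prefix matching are illustrative simplifications, not the attached patch):

```python
import time
import urllib.error
import urllib.request


class RobotsPolicy:
    """Minimal sketch of the fetch-state handling described above."""

    def __init__(self, url):
        self.url = url
        self.allow_all = False     # 4xx: assume the entire site is allowed
        self.disallow_all = False  # 401/403: the entire site is denied
        self.last_checked = 0      # mtime: when robots.txt was last fetched
        self.disallow_paths = []

    def read(self):
        try:
            response = urllib.request.urlopen(self.url)
        except urllib.error.HTTPError as err:
            if err.code in (401, 403):
                self.disallow_all = True
            elif 400 <= err.code < 500:
                # "not found": assume the entire site is allowed
                self.allow_all = True
            # 5xx is a failed read: leave all state untouched, so
            # can_fetch() keeps returning False
        else:
            self.parse(response.read().decode("utf-8").splitlines())

    def parse(self, lines):
        # Toy rule handling: collect Disallow prefixes regardless of
        # User-agent (the real parser matches per-agent groups).
        self.disallow_paths = [line.split(":", 1)[1].strip()
                               for line in lines
                               if line.lower().startswith("disallow:")]
        # a successful read updates the mtime
        self.last_checked = time.time()

    def can_fetch(self, useragent, path):
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        if not self.last_checked:
            # no conclusive (2xx/4xx) read yet: fail closed rather
            # than report a false positive
            return False
        return not any(p and path.startswith(p)
                       for p in self.disallow_paths)
```

Calling can_fetch() before read() now returns False instead of a false positive, because neither flag is set and last_checked is still zero.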

----------
assignee:  -> rhettinger
keywords: +patch
Added file: http://bugs.python.org/file35215/fix_false_pos.diff

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue21469>
_______________________________________

