webchecker 1.22 & robots.txt
Robin Becker
robin at jessikat.fsnet.co.uk
Wed May 9 03:50:12 EDT 2001
I'm using webchecker 1.22 and get the following errors:
webchecker version 1.22
Round 1 (1 total, 1 to do, 0 done, 0 bad)
....
Round 6 (220 total, 74 to do, 146 done, 0 bad)
Error ('http error', 403, 'Forbidden')
HREF http://www.reportlab.com/ftp/incoming/
from http://www.reportlab.com/ftp/ (incoming/)
http://www.reportlab.com/ftp/?D=A (incoming/)
http://www.reportlab.com/ftp/?M=A (incoming/)
http://www.reportlab.com/ftp/?N=D (incoming/)
http://www.reportlab.com/ftp/?S=A (incoming/)
The remote robots.txt file is:
User-Agent: *
Disallow: /ftp/incoming/
Disallow: *://www.reportlab.com/ftp/incoming/
Disallow: mailto:*
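For what it's worth, here is a small sketch that feeds this robots.txt to Python's robotparser (which is what webchecker uses for exclusion checks; in Python 3 it lives in urllib.robotparser). The URL tested is just the one from the error above; the point is to see whether the standard prefix rule `Disallow: /ftp/incoming/` is honoured despite the two nonstandard wildcard lines:

```python
import urllib.robotparser

# The robots.txt content quoted above, verbatim.
robots_txt = """\
User-Agent: *
Disallow: /ftp/incoming/
Disallow: *://www.reportlab.com/ftp/incoming/
Disallow: mailto:*
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The plain prefix rule should block the incoming/ directory for any agent.
print(rp.can_fetch("*", "http://www.reportlab.com/ftp/incoming/"))
```

The wildcard lines (`*://...`, `mailto:*`) are not part of the original robots exclusion standard, which only does prefix matching on paths, so the parser treats them as literal (and never-matching) prefixes rather than patterns.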
What am I missing/fumbling?
--
Robin Becker