[Catalog-sig] why is the wiki being hit so hard?

"Martin v. Löwis" martin at v.loewis.de
Sat Aug 4 09:42:45 CEST 2007


> If they do not respect them, then you can use this program:
> http://danielwebb.us/software/bot-trap/ to catch them.
> If you are doing this, Martin, use the German version instead:
> http://www.spider-trap.de/
> because it has a few useful additions, though I forget exactly which ones.
> 
> Most scrapers these days respect robots.txt, which makes this
> program useless for catching them.  But some days you can get lucky.

That would also be an idea. I'll see how the throttling works out;
if it fails, either because the wiki still gets overloaded (which
shouldn't happen) or because legitimate users complain, I'll try the
trap instead.
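
Per-address throttling in that sense boils down to something like the
following (a minimal sketch; the window length and request limit are
arbitrary placeholders, not the wiki's actual settings):

    import time

    WINDOW = 60.0    # seconds - placeholder value
    MAX_HITS = 30    # requests allowed per address per window - placeholder

    _recent = {}     # client address -> timestamps of recent requests

    def allow(address):
        """Return True if the request should be served, False if throttled."""
        now = time.time()
        hits = [t for t in _recent.get(address, []) if now - t < WINDOW]
        throttled = len(hits) >= MAX_HITS
        if not throttled:
            hits.append(now)
        _recent[address] = hits
        return not throttled

Requests over the limit would just get an error page until the window
rolls over, which legitimate users should rarely see.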

Regards,
Martin

