Technical Information on BozMo Wikiscan

BozMo Wikiscan runs from the IP and should only crawl sites that appear to be open-edit (wiki) sites. It is manually triggered at present, but may be automated at some point.
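How the scanner decides a site is "open edit" isn't documented here, but a minimal sketch of one plausible heuristic (checking for a MediaWiki generator tag or an anonymous edit link; the function name and rules are assumptions, not the scanner's actual logic) might look like this:

```python
import re

def looks_like_open_wiki(html: str) -> bool:
    """Hypothetical heuristic for an open-edit wiki page.

    MediaWiki pages advertise themselves in a <meta name="generator"> tag,
    and wikis that allow anonymous editing expose edit links in the page.
    """
    has_generator = re.search(
        r'<meta\s+name="generator"\s+content="MediaWiki', html
    ) is not None
    has_edit_link = "action=edit" in html
    return has_generator or has_edit_link
```

A crawler using a check like this would fetch one page per candidate site and skip the site entirely when the check fails.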

It should obey all the usual robot exclusion protocols, but it is a bit tape-and-scissors at present, so please tell me if it goes anywhere it shouldn't (most likely non-wiki pages on a site that has a wiki).
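For reference, the standard robots exclusion check is straightforward; a minimal sketch using Python's standard library (the "BozMoWikiscan" user-agent string is an assumption for illustration) is:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt: str, url: str, agent: str = "BozMoWikiscan") -> bool:
    """Check a URL against a site's robots.txt rules.

    `robots_txt` is the text of the site's /robots.txt file;
    `agent` is the crawler's user-agent name (assumed here).
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)
```

A well-behaved crawler fetches /robots.txt once per site, caches the parsed rules, and skips any URL for which this check returns False.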

It shouldn't visit the same pages repeatedly, and on sites with known bandwidth problems (like Wikipedia), if it comes at all (which it doesn't with Wikipedia these days), it staggers its page requests.
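Those two behaviours (never refetching a page, and pacing requests on bandwidth-limited sites) can be sketched as follows; the class name and the idea of a per-site configurable delay are assumptions, not details of the actual scanner:

```python
import time

class PoliteFetcher:
    """Sketch: skip already-visited URLs and stagger requests."""

    def __init__(self, delay_seconds: float = 0.0):
        self.delay = delay_seconds      # inter-request gap (assumed per-site)
        self.seen = set()               # URLs fetched this run
        self.last_request = 0.0

    def should_fetch(self, url: str) -> bool:
        # Refuse to fetch the same page twice in one run.
        if url in self.seen:
            return False
        self.seen.add(url)
        return True

    def wait(self) -> None:
        # Sleep until the configured gap since the last request has passed.
        elapsed = time.monotonic() - self.last_request
        if elapsed < self.delay:
            time.sleep(self.delay - elapsed)
        self.last_request = time.monotonic()
```

The crawl loop would call `should_fetch` before each URL and `wait` before each actual HTTP request, with a larger `delay_seconds` on sites known to have bandwidth trouble.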

I will put up information here about any changes or issues that arise. Details of how to contact me are on the homepage.

There is general info on another page (which I'll improve when I have time).