Parser for XML Sitemaps to be used with Robots.txt and web crawlers
Detects bots/crawlers/spiders via the user agent.
It uses the user-agents.org XML file to detect bots.
Parser for XML Sitemaps to be used with Robots.txt and web crawlers. (Extended version by mastixmc)
A jQuery plugin that helps you hide your email address on your page and prevent crawlers from harvesting it.
Parse robot directives within HTML meta and/or HTTP headers.
Open-source crawler framework in Node.js.
Crawler made simple
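To illustrate the kind of task the robots-directive parser above handles, here is a minimal sketch of extracting directives from an HTML `<meta name="robots">` tag using Python's standard-library `html.parser`. The class name and structure are hypothetical, not taken from any of the listed projects; a real implementation would also read the `X-Robots-Tag` HTTP header.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        if (attr_map.get("name") or "").lower() == "robots":
            content = attr_map.get("content") or ""
            # Directives are comma-separated, e.g. "noindex, nofollow"
            self.directives += [
                d.strip().lower() for d in content.split(",") if d.strip()
            ]

html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)  # ['noindex', 'nofollow']
```

A crawler would typically check the collected directives (e.g. skip indexing when `noindex` is present) before storing or following links on the page.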