LinkCheckerBot is a web crawler that monitors our users' backlinks.
LinkCheckerBot strictly respects robots.txt rules, so you have complete control over its behavior on your website if you need it.
To change LinkCheckerBot's visiting frequency, specify the minimum acceptable delay, in seconds, between two consecutive requests in your robots.txt file:
User-agent: LinkCheckerBot
Crawl-Delay: [value]
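As an illustration only, the sketch below shows how a compliant crawler can read this directive with Python's standard urllib.robotparser and pause between requests. The site address and page URLs are placeholders, and this is not LinkCheckerBot's actual code.

import time
from urllib import robotparser

# Placeholder address; point this at your own site's robots.txt.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Crawl-Delay value declared for LinkCheckerBot, or None if absent.
delay = rp.crawl_delay("LinkCheckerBot")

for url in ("https://www.example.com/a", "https://www.example.com/b"):
    # ... fetch and process the page here ...
    if delay:
        time.sleep(delay)  # wait the requested number of seconds between requests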
If you want to prevent LinkCheckerBot from visiting certain pages of your site, put the following two lines into the robots.txt file on your server:
User-agent: LinkCheckerBot
Disallow: /url-of-page (replace with the URL path of the page)
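You can verify that the rule behaves as expected with a few lines of Python using urllib.robotparser; /url-of-page and /other-page below are placeholder paths, so substitute your own. This is only a sketch, not part of LinkCheckerBot itself.

from urllib import robotparser

# Parse the rules in memory; replace /url-of-page with the real path.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: LinkCheckerBot",
    "Disallow: /url-of-page",
])

print(rp.can_fetch("LinkCheckerBot", "/url-of-page"))  # False: this page is blocked
print(rp.can_fetch("LinkCheckerBot", "/other-page"))   # True: other pages stay reachable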
If you want to prevent LinkCheckerBot from visiting your site entirely, put the following two lines into the robots.txt file on your server:
User-agent: LinkCheckerBot
Disallow: /
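Because these rules are scoped to the LinkCheckerBot user-agent, other crawlers are unaffected. The short sketch below, again using Python's urllib.robotparser purely for illustration and a made-up OtherBot agent, shows the effect of a full Disallow.

from urllib import robotparser

# Parse the full-site block in memory.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: LinkCheckerBot",
    "Disallow: /",
])

print(rp.can_fetch("LinkCheckerBot", "/any/page"))  # False: the whole site is off limits
print(rp.can_fetch("OtherBot", "/any/page"))        # True: other crawlers are not affected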