Protect your website from bad bots.

Filter out bad crawlers to protect your information, save server resources, and pay less for hosting.

We have information about 44 bots in our database. Last database update: 04.12.2016.

Why should you filter bad crawlers?

Save resources

Bad bots are among the biggest consumers of server resources. Once you deny them access, your resources are used only by real visitors to your website.

Improve website speed

Once filtered, bad crawlers can no longer put a high load on your server or consume excessive resources that hinder the performance of your websites. The more resources you have available, the more quickly and smoothly your websites will run.

Save money

Most hosting providers offer plans priced by CPU usage, RAM, and bandwidth. The fewer of these resources you use, the cheaper your hosting plan can be.


Every crawler has a purpose, and in many cases that purpose is to extract information from your sites. Some of these robots take your website content, your client data, and other kinds of information. By filtering these bad bots, you protect your visitors, your clients, your work, and of course yourself.
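As a rough illustration of how this kind of filtering works (not the exact code this service generates), a server can reject requests whose User-Agent header matches a known bad-bot pattern. The sketch below is a minimal Python WSGI middleware; the blocklist entries are hypothetical examples, not entries from the actual database:

```python
# Minimal sketch of User-Agent filtering.
# BAD_BOT_SUBSTRINGS holds hypothetical example names, not the
# service's real bot database.
BAD_BOT_SUBSTRINGS = ["BadBot", "EvilScraper"]

def is_bad_bot(user_agent):
    """Return True if the User-Agent matches a listed bad-bot pattern."""
    ua = (user_agent or "").lower()
    return any(s.lower() in ua for s in BAD_BOT_SUBSTRINGS)

def block_bad_bots(app):
    """WSGI middleware that answers 403 Forbidden to listed bots."""
    def middleware(environ, start_response):
        if is_bad_bot(environ.get("HTTP_USER_AGENT", "")):
            start_response("403 Forbidden",
                           [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware
```

In practice the same idea is usually expressed as web-server rules (for example Apache `.htaccess` rewrite conditions or an nginx `map` on `$http_user_agent`), which block bad bots before they reach your application at all.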