Bad bots are among the biggest consumers of server resources. Once you deny them access, your resources are used only by the real visitors of your website.
Once filtered, bad crawlers can no longer place a high load on your server or consume excessive resources that hinder the performance of your websites. The more resources you have available, the more quickly and smoothly your websites will run.
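One common way to filter bad crawlers is to match the request's User-Agent header against a blocklist and reject matches before they reach your application. The sketch below is a minimal illustration of that idea; the bot signatures in the list are examples of crawlers that site owners often block, not a definitive or complete blocklist.

```python
# Minimal sketch: deny requests whose User-Agent matches a blocklist.
# The signatures below are illustrative examples only; build your own
# list from your server's access logs.

BAD_BOT_SIGNATURES = ["MJ12bot", "AhrefsBot", "SemrushBot"]

def is_bad_bot(user_agent: str) -> bool:
    """Return True when the User-Agent header matches a blocked bot."""
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in BAD_BOT_SIGNATURES)

# A request handler would answer matches with 403 Forbidden:
print(is_bad_bot("Mozilla/5.0 (compatible; AhrefsBot/7.0)"))  # True
print(is_bad_bot("Mozilla/5.0 (Windows NT 10.0; Win64)"))     # False
```

In practice the same check is usually done at the web server level (for example in an Apache .htaccess file or an nginx config) so blocked requests never consume application resources at all.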
Most hosting providers offer different hosting plans based on CPU usage, RAM, and bandwidth. The fewer of these resources you use, the cheaper your hosting plan will be.
All crawlers have a purpose, and in many cases that purpose is to extract information from your sites. Some of these robots take your website content, your client data, and other kinds of information. By filtering these bad bots, you protect your visitors, your clients, your work, and of course yourself.