BotRegistry

Block the bots that waste your site's resources

Save those resources for your customers, and for the web crawlers that help you

How can you stop spammers?

Your website is continually under attack from bots and spammers: unwanted web crawlers, and attempts to publicise spammy websites through your contact forms and logs. These use the computing resources you want to keep for your actual customers, and add noise to your logs and stats. At BotRegistry, we spend our days checking the public (and other) lists of problem web crawlers and drive-by spammers, then running many more active and passive checks, so you can concentrate on your business.


How can you save your resources?

By analysing which bots are visiting your website and reporting which ones aren't helping you, BotRegistry saves you from spending your resources (computing and human) on fending off attackers who want to use your work for their own benefit.


How it works

The bad bots that cause you so many problems don't download or run JavaScript, and they don't fetch images or third-party resources, so detecting them takes more active checks from your site itself. Drop our pre-written code packages (available for various languages and frameworks) into your site: they send the minimal required details of each visitor to our API, which returns what it knows about them, from a basic 'Not‑Bot', 'Good‑Bot' or 'Bad‑Bot' verdict, all the way to specific categories of what the crawler wants to do and some detail of how the decision was made.
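
To make the flow concrete, here is a rough Python sketch of the kind of check the pre-written packages perform. The endpoint URL, request fields and response keys below are illustrative assumptions rather than the real API; the package for your language or framework handles these details for you.

```python
# Illustrative sketch only: the endpoint, field names and response shape
# are assumptions, not the actual BotRegistry API.
import requests

API_URL = "https://api.botregistry.example/check"  # hypothetical endpoint
API_KEY = "your-api-key"                            # hypothetical credential

def classify_visitor(ip: str, user_agent: str) -> dict:
    """Send the minimal visitor details and return the service's verdict."""
    response = requests.post(
        API_URL,
        json={"ip": ip, "user_agent": user_agent},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=2,  # fail fast so a slow lookup never delays a real customer
    )
    response.raise_for_status()
    # Hypothetical shape: {"verdict": "Bad-Bot", "category": "...", "reasons": [...]}
    return response.json()

# Example: refuse to serve the request when the verdict is 'Bad-Bot'.
result = classify_visitor("203.0.113.7", "ExampleCrawler/1.0")
if result.get("verdict") == "Bad-Bot":
    print("Deny with HTTP 403 and skip any further processing.")
```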

Every request helps the service learn and strengthens its decisions, helping you (and the wider ecosystem) defend your site from attackers.


Pre-emptive blocking

For more passive, automatic blocking, you can also integrate our blocklists. Downloaded to your servers and integrated into either your webserver or your firewall, these highly optimised lists can stop all the IPs we have classified as 'Bad‑Bots' from ever reaching your website at all.

Customised options are also available, optimised for your particular webserver or firewall, and can, for example, block by the geographic location of the source, avoiding potentially aggressive crawls from legitimate but ultimately counterproductive sources, such as search engines that are not used by your target market.
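
As a sketch of what the blocklist integration can look like, the snippet below fetches a list (assumed here to be one IP or CIDR range per line, from a hypothetical download URL) and checks visitors against it at the application layer; in production you would normally load the list straight into your webserver or firewall instead.

```python
# Illustrative sketch only: the download URL and one-entry-per-line format
# are assumptions, and real deployments load the list into the webserver
# or firewall rather than application code.
import ipaddress
import urllib.request

BLOCKLIST_URL = "https://downloads.botregistry.example/bad-bots.txt"  # hypothetical

def load_blocklist(url: str = BLOCKLIST_URL) -> list:
    """Fetch the blocklist and parse each line as an IP address or range."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        lines = resp.read().decode().splitlines()
    return [ipaddress.ip_network(line.strip()) for line in lines if line.strip()]

def is_blocked(ip: str, blocklist: list) -> bool:
    """Return True if the visitor's IP falls inside any blocked range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in network for network in blocklist)

blocklist = load_blocklist()
if is_blocked("198.51.100.23", blocklist):
    print("Drop the connection before it reaches the site.")
```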


Stop the bad bots reaching your website