At one of my first full-time jobs I created a small snippet of code that “caught” bad bots attempting to access our online portal, which was powered by Microsoft IIS.

We disallowed a particular filename and directory in robots.txt and linked that script's URL from our homepage's markup. Well-behaved crawlers respected the disallow rule and never touched it, but the bad bots would attempt to crawl it anyway, and the moment one hit the URL it was automatically banned from the entire server for a certain amount of time. I did this by writing a custom IIS module that let me instruct the system to ban them all! It worked quite well.
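The core idea — a honeypot URL that only rule-ignoring bots will request, plus a temporary per-IP ban — can be sketched in a few lines. This is an illustrative Python sketch, not the original IIS module (which would have been written against the IIS request pipeline); the trap path, ban duration, and function names here are all made up for the example:

```python
import time


class BanList:
    """Track offending client IPs, each banned for a fixed window."""

    def __init__(self, ban_seconds=3600):
        self.ban_seconds = ban_seconds
        self._bans = {}  # ip -> expiry timestamp

    def ban(self, ip, now=None):
        now = time.time() if now is None else now
        self._bans[ip] = now + self.ban_seconds

    def is_banned(self, ip, now=None):
        now = time.time() if now is None else now
        expiry = self._bans.get(ip)
        if expiry is None:
            return False
        if now >= expiry:
            del self._bans[ip]  # ban expired; clean up the entry
            return False
        return True


def handle_request(path, ip, bans, trap_path="/private/trap.aspx"):
    """Return an HTTP status code for a request.

    Any client that requests the robots.txt-disallowed honeypot URL
    (trap_path, a hypothetical name) gets banned, and all requests
    from banned clients are rejected server-wide.
    """
    if bans.is_banned(ip):
        return 403  # banned: deny everything on the server
    if path == trap_path:
        bans.ban(ip)  # a bad bot ignored robots.txt; ban it
        return 403
    return 200
```

A well-behaved crawler never requests the trap path, so it is never affected; a bot that ignores the disallow rule bans itself on its first visit, and every subsequent request it makes is refused until the window expires.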