Bot Busters: Defending Your Site Against Bots

TL;DR: A climate change portal faced performance issues due to excessive bot crawling. After trying various solutions, including updating core and modules, scaling up the server, and exploring different security modules, the problem was solved using Drupal's built-in Ban module to block problematic IP addresses. The Autoban module was then implemented to automate the process, effectively improving the site's performance and resilience against bots. 

🌐Introduction

In today's digital landscape, websites face numerous challenges, and one of the most persistent is the threat of bots. These automated programs can wreak havoc on your site's performance, especially when they crawl your pages excessively. Recently, we encountered this issue with one of our country climate change portals built on Drupal 7. Here's how we tackled the problem and restored the site's performance.

🔍The Investigation Begins

When we noticed performance issues on our climate change portal, we started with the basics. We reviewed the Drupal logs and the status report page. We then updated Drupal core to the latest version and fixed the errors that surfaced from outdated modules. However, the problem persisted.

Cloudflare: Not an Option This Time

In previous cases, we've successfully used Cloudflare's Web Application Firewall (WAF) to protect against AI bots. Unfortunately, this site wasn't using Cloudflare, so we had to look for alternatives at the application and web server level.

📈Scaling Up: A Temporary Fix

After updating several modules and even writing a custom script to block bots, we saw no improvement. As a stopgap, we scaled up the web server for more CPU capacity. This seemed to work well initially, but the site still struggled over the weekend.

🕵️‍♂️The Culprit Revealed

A deep dive into the Nginx logs revealed the real issue: heavy crawling from specific IP addresses. This excessive bot activity was the root cause of our performance problems.
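
The offending addresses stand out quickly with a one-liner over the access log. Here is a minimal sketch, assuming the default Nginx combined log format, where the client IP is the first field:

```shell
# Count requests per client IP and show the heaviest crawlers.
# $1 = path to the Nginx access log, $2 = how many IPs to show.
top_crawlers() {
  awk '{print $1}' "$1" | sort | uniq -c | sort -rn | head -n "$2"
}
```

For example, `top_crawlers /var/log/nginx/access.log 10` lists the ten busiest client IPs with their request counts.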


🛡️Drupal 7 to the Rescue

Fortunately, Drupal 7 core ships with built-in IP address blocking (the feature that became the Ban module in Drupal 8). We used it to manually ban the problematic IP addresses, and traffic returned to normal almost immediately.
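
Banning a handful of IPs by hand can also be scripted. Here is a minimal sketch, assuming drush is available on the server; it writes to Drupal 7's core blocked_ips table, and emits the commands rather than running them so they can be reviewed first:

```shell
# Emit one drush command per IP to insert it into Drupal 7's
# blocked_ips table (the table core IP blocking reads from).
# Review the output before piping it to sh.
emit_ban_commands() {
  for ip in "$@"; do
    printf "drush sql-query \"INSERT INTO blocked_ips (ip) VALUES ('%s');\"\n" "$ip"
  done
}
```

Usage: `emit_ban_commands 203.0.113.5 198.51.100.7` prints the two insert commands for inspection.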

🤖Automating the Process

To make this solution more sustainable, we installed the Autoban module. This nifty tool creates rules to automatically ban IPs based on traffic logs. Within 12 hours of activation, it had already added three more IPs to our banned list. More importantly, the website remained responsive throughout.
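
Under the hood, this kind of detection boils down to aggregating log entries per client host. A rough, hypothetical sketch of an equivalent manual query (it assumes the dblog module is enabled so the watchdog table exists, and the threshold of 100 is an arbitrary illustration, not Autoban's default):

```shell
# Emit a drush command that lists hosts with more watchdog entries
# than the given threshold -- heavy offenders worth banning.
emit_autoban_query() {
  threshold="${1:-100}"
  printf 'drush sql-query "SELECT hostname, COUNT(*) AS hits FROM watchdog GROUP BY hostname HAVING hits > %s ORDER BY hits DESC;"\n' "$threshold"
}
```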

🛠️Other Modules We Explored

Before finding the right solution, we looked at several other modules:

  1. SecKit: While useful for other security hardening, it didn't offer specific protection against crawling or rate limiting.
  2. Security Review: This module didn't provide direct benefits for our bot problem, but it offered valuable advice on hardening the website's security, which we implemented.
  3. Botcha: This module focuses on protecting web forms from bots, which wasn't our primary issue.

💻Technical Details for the Curious

For those interested in the more technical aspects of our solution:

  1. Drupal 7's IP blocking works by adding addresses to the blocked_ips database table, which drupal_block_denied() checks early in the bootstrap for each incoming request.
  2. The Autoban module uses Drupal's watchdog logs to identify suspicious activity. It can be configured to ban IPs based on various criteria, such as the number of requests within a specific time frame or repeated access to certain paths.
  3. Suspicious IPs were cross-checked against the AbuseIPDB reputation database before banning.
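
The AbuseIPDB lookup is easy to run from the command line. The /check endpoint, the Key header, and the abuseConfidenceScore field are part of AbuseIPDB's documented APIv2; you supply your own key, and the sed-based extraction below is a rough sketch (use a proper JSON parser such as jq in practice):

```shell
# Query AbuseIPDB's APIv2 /check endpoint for a single IP.
# Requires ABUSEIPDB_API_KEY to be set in the environment.
check_ip() {
  curl -sG "https://api.abuseipdb.com/api/v2/check" \
    --data-urlencode "ipAddress=$1" \
    -H "Key: $ABUSEIPDB_API_KEY" \
    -H "Accept: application/json"
}

# Crude extraction of the 0-100 abuse confidence score from the response.
abuse_score() {
  sed -n 's/.*"abuseConfidenceScore":\([0-9]*\).*/\1/p'
}
```

Usage: `check_ip 203.0.113.5 | abuse_score` prints a score from 0 to 100; high scores indicate an IP widely reported for abuse.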

By combining these Drupal modules with server-level optimizations, you can significantly improve your site's resilience against bot attacks and maintain optimal performance.
