A bot is a computer program that operates over a network and automatically performs certain tasks, such as crawling or scraping web pages. Acting on behalf of one or more users, these automated bots account for nearly 50% of today's web traffic. A botnet, meanwhile, is a network of infected (or "zombified") computers controlled remotely by an outside actor with malicious intent.

Many bots are relatively simple in design, comprising just a few lines of code, while more sophisticated bots have emerged that can handle more complex, time-consuming jobs. Even so, many bots are easy to create, with or without code, using official platform tools. That's why bots are commonly used across social media to automate tedious tasks such as tweeting, liking, and sharing.
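
To show just how little code a simple bot can involve, here's a minimal sketch of a crawler-style bot in Python. The target URL and User-Agent string are placeholders chosen for the example, not references to any real service.

    import re
    import urllib.request

    # Placeholder values; a real bot would read these from configuration.
    URL = "https://example.com/"
    HEADERS = {"User-Agent": "example-demo-bot/1.0"}

    # Fetch the page and pull out its <title> tag: the core of a tiny scraper.
    request = urllib.request.Request(URL, headers=HEADERS)
    with urllib.request.urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")

    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    print(match.group(1).strip() if match else "No title found")

A scheduler or loop around this handful of lines is all it takes to turn a one-off script into a bot that runs unattended.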

In many cases, such as with chatbots, they're intentionally designed to impersonate human beings. Some bots and crawlers are verified (recognized and widely accepted), while others operate more covertly. Malicious bots can even masquerade as well-known bots to sidestep security measures or the controls set in robots.txt files.
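
Because a User-Agent header is self-reported, it's trivial to spoof; a common defense is to verify a claimed crawler with a reverse-then-forward DNS lookup. The Python sketch below assumes Googlebot's documented hostname suffixes; the sample IP address is a placeholder.

    import socket

    def is_verified_googlebot(client_ip: str) -> bool:
        """Reverse-resolve the IP, check the hostname suffix, then forward-resolve
        the hostname and confirm it maps back to the same IP."""
        try:
            hostname, _, _ = socket.gethostbyaddr(client_ip)
        except socket.herror:
            return False
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            return socket.gethostbyname(hostname) == client_ip
        except socket.gaierror:
            return False

    # Placeholder IP; in practice this comes from the request's source address.
    print(is_verified_googlebot("192.0.2.10"))

Requests whose claimed identity fails a check like this can then be treated the same as any other unverified client.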

Are all bots useful?

Bots power many of the web's essential features and are pivotal in helping users find whatever they're looking for by enabling search engines to inspect and rank websites. Bots can also provide a productivity boost by automating tedious, routine actions (such as repeated social engagement). Overall, bots are responsible for connecting people with relevant information almost instantly.

However, the pervasiveness of bots is a double-edged sword. While many (such as Googlebot and Bingbot) tackle important indexing tasks across SEO-optimized pages, others are purposefully created to gather sensitive or otherwise unprotected information from the web pages they inspect—or probe for cybersecurity weaknesses.

We've seen this kind of large-scale information gathering play out recently with AI crawlers such as ChatGPT-User and ClaudeBot. ChatGPT alone handles over 10 million queries per day, and while those queries aren't all hitting the same websites simultaneously, understanding that scale is important as the market for these tools becomes more saturated. And while knowledge-gathering bots have legitimate benefits, the privacy and security challenges they raise can't be dismissed.

Additionally, bot networks, fittingly called "botnets," are capable of orchestrating sophisticated, resource-sapping DDoS attacks against applications. Having bots patrol the internet means taking the positives with the negatives, and taking steps to block the bad bots that account for roughly 33% of all internet traffic.

Since bots make up such a large portion of web traffic, forming a targeted blocking strategy is key to minimizing resource drain. In unprotected environments, bots and bot attacks can consume large amounts of CPU, memory, and network bandwidth, shutting legitimate users out of the services they rely on. While downtime can be an annoyance (we can mostly stomach a few hours without Reddit or YouTube), it's often a critical issue for services and users in finance, healthcare, or government. In these situations, being separated from your data can have serious consequences.

How do bots work?

Bots typically have three components of varying complexity: 

  • The code – Also called the "business logic," this executable code enables a bot to do what it's designed to do. The code itself is machine readable, though an individual bot developer might opt for low-code or no-code options to build a bot more quickly. 

  • A database – This holds the data the bot draws on to function, as well as the data it scrapes from crawled pages. 

  • API integrations – For complex bots or those with extended functionality, APIs can help bot developers package new features more quickly and easily. This saves the time and investment needed to build that functionality from scratch. 

While we've mentioned that bots can have just a few lines of code, many bots have large executable files beyond what a hobbyist could write in a few minutes.
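
To make the three components above concrete, here's a rough Python sketch that combines simple business logic, a SQLite database for collected data, and a call to an external API. The page URL, the API endpoint, and the table layout are all hypothetical, chosen only for illustration.

    import json
    import sqlite3
    import urllib.request

    # Illustrative placeholders; neither URL points to a real service this sketch depends on.
    PAGE_URL = "https://example.com/"
    API_ENDPOINT = "https://api.example.com/v1/summarize"

    # Component 1: the code (business logic) fetches a page as text.
    def fetch(url: str) -> str:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.read().decode("utf-8", errors="replace")

    # Component 2: a database, here a local SQLite table for whatever the bot collects.
    db = sqlite3.connect("bot_data.db")
    db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, body TEXT)")

    # Component 3: an API integration hands the page off to an external service
    # for extra processing instead of building that feature into the bot itself.
    def summarize(url: str, body: str) -> dict:
        payload = json.dumps({"url": url, "body": body[:1000]}).encode("utf-8")
        request = urllib.request.Request(
            API_ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            return json.load(response)

    body = fetch(PAGE_URL)
    db.execute("INSERT OR REPLACE INTO pages VALUES (?, ?)", (PAGE_URL, body))
    db.commit()
    # A real bot would also call summarize(PAGE_URL, body) here; it's left uncalled
    # because the API endpoint above is hypothetical.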

How does HAProxy handle bots and botnets?

HAProxy Enterprise leverages an advanced, multi-layered security approach to protect apps and APIs from threats. Our HAProxy Enterprise WAF, HAProxy Enterprise Bot Management Module, and Global Rate Limiting features prevent bad bot traffic from reaching your servers. We use a combination of weighting and scoring to detect bad bots and promptly block them. And since good bots remain key to web discoverability, HAProxy products let you allowlist valid IP addresses.
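
As a rough illustration of the rate-limiting and allowlisting side of that approach (a generic sketch, not the Enterprise WAF or Bot Management Module configuration itself), the snippet below tracks per-client request rates in a standard HAProxy frontend and blocks clients that exceed a threshold unless their source address appears in an allowlist file. The backend name, file path, and threshold are assumptions made for the example.

    frontend www
        bind :80
        # Track each client IP's HTTP request rate over a 10-second window.
        stick-table type ip size 100k expire 10m store http_req_rate(10s)
        http-request track-sc0 src
        # Allowlist of verified good-bot source addresses (illustrative file path).
        acl good_bot src -f /etc/haproxy/good-bots.txt
        # Reject clients exceeding 100 requests per 10 seconds, unless allowlisted.
        http-request deny deny_status 429 if { sc_http_req_rate(0) gt 100 } !good_bot
        default_backend app_servers

    backend app_servers
        server app1 192.0.2.10:8080 check

In practice, thresholds and allowlists would be tuned to your own traffic patterns and to the verified bots you want to keep serving.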