What are Bots?
Internet bots, web spiders and crawlers can be your website’s best friend or worst enemy. Some, like Googlebot, help a website grow by improving its search rankings, while others work to take it down.
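One simple way sites tell the good crawlers apart is the User-Agent header each request carries. Below is a minimal sketch of that check; the crawler tokens are real, but everything else (function name, token list) is illustrative. A User-Agent can be spoofed, so production checks typically also verify the client IP via reverse DNS (e.g., confirming it resolves to a googlebot.com host).

```python
# Self-identified crawlers announce themselves in the User-Agent header.
# These tokens are real crawler identifiers; the check itself is a sketch.
KNOWN_GOOD_BOT_TOKENS = ("Googlebot", "bingbot", "DuckDuckBot")

def is_self_identified_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent claims to be a known good crawler."""
    return any(token in user_agent for token in KNOWN_GOOD_BOT_TOKENS)
```

Because the header is client-supplied, this check alone only catches bots that are honest about what they are, which is exactly why impersonator bots are a problem.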
But, no matter the intent behind their usage, bots are simply agents of web-based software applications. They are typically employed to perform simple, repetitive tasks at the will of their human masters. The nature of those tasks depends on the nature of their operators who, in the case of malicious bots, are also known as “bot shepherds.”
Malicious bots originate from compromised computers, infected with trojans and commandeered to perform harmful and illegal actions. Once infected, such computers are usually integrated into a larger pool of compromised machines, collectively known as a “botnet.”
Each such botnet can consist of hundreds of thousands of computers, all answering to the directives of their hacker commander. With these computing resources, the hacker can cause damage on a large scale.
Botnets are often used for brute-force attacks, theft of personal information and, of course, all kinds of spam. Most commonly, however, botnets are used for distributed denial of service (DDoS) attacks, which leverage their size to bring down commercial and government websites around the globe.
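A common first line of defense against this kind of volumetric traffic is per-client rate limiting. The sketch below implements a basic sliding-window limiter; the class name, window size and threshold are illustrative values, not recommendations, and a real deployment would enforce this at the network edge rather than in application code.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # length of the sliding window (illustrative)
MAX_REQUESTS = 100    # requests allowed per client per window (illustrative)

class RateLimiter:
    """Sliding-window request limiter keyed by client IP."""

    def __init__(self):
        self._hits = defaultdict(deque)  # client IP -> recent request timestamps

    def allow(self, client_ip: str, now: float) -> bool:
        """Record a request at time `now`; return False if the client is over the limit."""
        window = self._hits[client_ip]
        # Evict timestamps that have fallen out of the window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_REQUESTS:
            return False
        window.append(now)
        return True
```

Note the limitation this article goes on to describe: a limiter like this stops a single noisy client, but a botnet of hundreds of thousands of machines can stay under any per-IP threshold while still overwhelming the target in aggregate.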
Perhaps even more unsettling is that these malicious bots may currently originate from your own computer, even as you read these words. The bots, and the trojans that operate them, were both designed to be unobtrusive, and as a result the symptoms of an infected computer are barely perceptible. If your computer was indeed compromised, you might notice it running a bit slower at times, but other than that, it will appear to be healthy.
New Study Shows – Bots Are Now a Majority of All Web Traffic
Good bots and malicious bots come in many shapes and sizes, and you should be familiar with the most common types so that you are equipped to handle them if the need arises. DDoS protection firm Incapsula recently published a study detailing bot traffic trends over the past year.
As that study shows, bots – good and bad – now account for 61.5% of all website traffic, a 21% increase from last year. Most of this increase comes from good bots. The jump in good bot activity is likely due to increased usage of existing bots, particularly search engine crawlers, as well as the introduction of new bots, namely those working in search engine optimization (SEO).
Still, while the overall percentage of malicious bot activity has remained mostly unchanged, the trends within this bot demographic have disconcerting implications. Although the percentage of traffic from hacking tools and spam bots has decreased significantly, Incapsula reports that activity from “other impersonators” has increased 8%, accounting for 20.5% of total web activity.
The “other impersonators” are the worst kind of malicious bot – designed to mimic human-like behavior, usually to evade various website security measures. As Incapsula notes, the data illustrates that cyber criminals are becoming more sophisticated and are now adapting to find subtle ways to prey on their targets.
Nowhere is this more evident than in the realm of DDoS, where new and sophisticated threats can now breach many outdated security perimeters. Where DDoS attacks were previously feared only for their high volumes, today’s offenders are feared for their infiltration capabilities and their ability to deploy human-like bots that slip right past outdated network defenses.
And so, the next challenge DDoS protection providers must face is to know how to differentiate humans from bots.
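In practice, that differentiation is done by scoring requests against behavioral signals. The toy heuristic below assumes a simple dict-shaped request record; the field names and signals are illustrative only. Real bot detection combines many more inputs, such as JavaScript challenges, mouse and keystroke telemetry, and IP reputation.

```python
def bot_score(request: dict) -> int:
    """Count suspicious traits; a higher score means more bot-like (toy heuristic)."""
    score = 0
    headers = request.get("headers", {})
    if "Accept-Language" not in headers:   # real browsers almost always send this
        score += 1
    if not request.get("cookies"):         # no cookies retained across requests
        score += 1
    if not request.get("ran_javascript"):  # never executed a JS challenge
        score += 1
    return score
```

The catch, as the article argues, is that “other impersonator” bots are built to pass exactly these kinds of checks, sending browser-like headers and executing scripts, which is what makes telling them apart from humans the hard open problem.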