Bot-generated website traffic rises, again

The good news is that good bot activity is on the rise, with the Facebook mobile feed fetcher being the most active of them

After declining for three consecutive years, the share of website traffic generated by bots rose in 2016, once again overtaking human-generated traffic, according to the Imperva Incapsula Bot Traffic Report, an annual study of the bot traffic landscape. In 2015, for the first time in recent years, human-generated traffic had overtaken bot-generated traffic, according to the company's data. The 2016 report was the fifth annual edition of the study by the cyber security solutions provider, which helps companies protect their websites from attacks.

According to the study, bots generated 51.8% of website traffic in 2016, while humans generated the remaining 48.2%. More than half of the bot traffic (28.9% of total traffic) came from what Imperva calls bad bots. However, the activity of good bots also increased, the company says in its blog.

Bad bots are bots used to attack websites, while good bots are bots that support the business and operational goals of different companies.

The study divides good bots into four major categories, as follows:

  • Feed fetchers – Bots that ferry website content to mobile and web applications, which they then display to users.
  • Search engine bots – Bots that collect information for search engine algorithms, which is then used to make ranking decisions.
  • Commercial crawlers – Spiders used for authorized data extractions, usually on behalf of digital marketing tools.
  • Monitoring bots – Bots that monitor website availability and the proper functioning of various online features.

The Facebook mobile feed fetcher, which fetches website content so it can be viewed in the app's in-app browser, is the most active feed fetcher. Overall, it accounted for 4.4% of all website traffic on the Incapsula network.
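
Self-declared good bots like these typically announce themselves through the User-Agent header of their requests. As a purely illustrative sketch (not Imperva's classification method), the Python snippet below matches a User-Agent string against a few well-known bot signatures; the "facebookexternalhit" token is the one Facebook's link fetcher commonly presents, while the other entries are ordinary examples and the category labels follow the list above.

```python
# Illustrative only: a naive User-Agent check, not Imperva's methodology.
# The signature strings are common examples of how such bots announce themselves.

KNOWN_GOOD_BOTS = {
    "facebookexternalhit": "feed fetcher (Facebook)",
    "Googlebot": "search engine bot",
    "bingbot": "search engine bot",
    "Pingdom": "monitoring bot",  # example of a monitoring service signature
}

def classify_user_agent(user_agent: str) -> str:
    """Return a coarse good-bot label for a User-Agent string, or 'unknown'."""
    for signature, label in KNOWN_GOOD_BOTS.items():
        if signature.lower() in user_agent.lower():
            return label
    return "unknown"

if __name__ == "__main__":
    ua = "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"
    print(classify_user_agent(ua))  # -> feed fetcher (Facebook)
```

In practice, User-Agent strings are trivially spoofed, which is exactly what the impersonator bots described below exploit, so real classification also has to verify the source IP (for example via reverse DNS) and observed behavior.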

The company likewise divides bad bots into four major categories:

  • Impersonators – Bots that assume false identities to bypass security. These are commonly used for DDoS attacks.
  • Scrapers – Bots used for unauthorized data extraction and the reverse engineering of pricing models.
  • Spammers – Bots that inject spam links into forums, discussion threads and comment sections.
  • Hacker tools – Scavengers that look for sites with vulnerabilities to exploit for data theft, malware injection, etc.

Bad bots operate in the shadows, and only a few of the most notorious ones get named. The study identified Nitol, Cyclone and Sentry MBA as the top bad bots.

“The most active bad bots are all impersonators used for DDoS attacks. One of these is the Nitol malware, the single most common bad bot responsible for 0.12% of all website traffic. In 2016 the majority of Nitol assaults were launched by impersonator bots browsing using older versions of Internet Explorer,” says the company in a blog post.
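
One reason such impersonators stand out is that the browser identity they claim is implausible for ordinary visitors. The snippet below is a minimal, hypothetical heuristic along those lines: it flags requests whose User-Agent claims a legacy Internet Explorer version so they can be sent a further challenge. It is not Imperva's detection logic, and the version threshold is an arbitrary assumption; real DDoS mitigation combines many signals such as JavaScript and cookie challenges, IP reputation and rate analysis.

```python
import re

# Illustrative only: flag requests whose User-Agent claims an outdated
# Internet Explorer version, a trait the report associates with Nitol-style
# impersonator bots.

MSIE_PATTERN = re.compile(r"MSIE (\d+)\.\d+")

def looks_like_old_ie(user_agent: str, max_suspect_version: int = 9) -> bool:
    """Return True if the UA claims Internet Explorer <= max_suspect_version."""
    match = MSIE_PATTERN.search(user_agent)
    return bool(match) and int(match.group(1)) <= max_suspect_version

if __name__ == "__main__":
    ua = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"
    print(looks_like_old_ie(ua))  # -> True, would warrant a further challenge
```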

The study was based on an examination of more than 16.7 billion visits to 100,000 randomly selected domains on the Incapsula network.
