The Click Fraud Blog | ClickCease

What is Bot Traffic?

Bot traffic is essentially non-human traffic to a website. Bot traffic is the result of software applications running automated tasks. Bots can perform repetitive tasks very quickly, at a rate that human beings simply can’t manage.

With this ability to perform repetitive tasks quickly, bots can be used for the good, and the bad. “Good” bots can, for example, check websites to ensure that all links work. “Bad” bots on the other hand can be unleashed to target websites with heavy traffic, enough to overwhelm and take down the site.

As bots are just programmed scripts, they can perform any number of functions. Bots are used for example by search engines such as Google to crawl the web to fetch and analyze information, which in turn lets these companies keep search results updated and relevant.

For end users like browsers of websites, bot traffic isn’t really an issue. For site owners, however, bot traffic is critical: whether it’s to ensure that Google is crawling your site properly, to enhance the accuracy of your analytics results, to ensure the health and performance of your website, or to prevent malicious behavior on your website and ads.
Incredibly, more than half of all web traffic is bot traffic. What’s disturbing, however, is that a sizeable share of it, 28.9%, comes from bad bots.

Different Types Of Bot Traffic

To understand bot traffic more fully, it helps to look at its various kinds – from web crawlers run by search engines like Google to malicious bots used to attack websites. Different types of bot traffic include:

“Good Bots”

Not all bots are bad
  • SEO: Search engine crawler bots crawl, catalogue and index web pages, and the results are used by search providers like Google to provide their service
  • Website Monitoring: These bots monitor websites and website health for issues like loading times, down times, and so on
  • Aggregation: These bots gather information from various websites or parts of a website, and collate them into one place
  • Scraping: Within this category, there are both “good” and “bad” bots. These bots “scrape” or “lift” information from websites, such as phone numbers and email addresses. Legal scraping can be used for research, for example, but scraping can also be used to copy information illegally

“Bad Bots”

  • Spam: Bots are used to post spam, often in the “comments” section of websites
  • DDoS: Bots can be used to take down your site with a distributed denial of service attack
  • Ad Fraud: Bots can be programmed to click on your ads automatically
  • Botnets: A botnet is a network of computers infected with malicious software and controlled as a group, often without the owners’ knowledge; botnets are frequently used to power DDoS and other malicious attacks at scale

How To Detect Bot Traffic?

Detecting bot traffic is a great first step in ensuring that you’re getting all the benefits of the good bots (like appearing in Google’s search results) while preventing the bad bots from costing you.
When figuring out how to detect bot traffic, the best place to start is with Google Analytics.

If you have wondered to yourself, “Can I see bot traffic in my analytics account?”, the answer is: you can definitely get an indication of it. You need to know what to look out for, though, and even then you may not find a smoking gun.

The key ratios to keep track of here are:

  • Bounce Rate
  • Page Views
  • Page Load Metrics
  • Avg Session Duration

Bounce rate is expressed as a percentage and shows the visitors to your website who navigate away after viewing only one page. Humans are most likely to arrive on your site (from a search engine result, for example) and then click through to explore your offering. A bot isn’t interested in exploring your site, so it will “hit” one page and leave. A high bounce rate is therefore a strong indicator of bot traffic.


Page views are almost the reverse of this. The average visitor might view a few pages on your site and then move on. If you suddenly see sessions where 50 or 60 pages are being viewed, this is most likely not human traffic.

Page load metrics are also really important to monitor. If load times suddenly slow down and your site feels sluggish, this could indicate a jump in bot traffic, or even a DDoS (Distributed Denial of Service) attack using bots. A tech solution might be required in some cases (more about this below), but monitoring load times is a good first step in detecting bots.

Avg. session duration will tell you a lot about how users from different sources are interacting with the site. A source whose average session lasts around two seconds – as traffic labeled “Microsoft Corp Network”, for example, often does – is most likely non-human: two seconds is classic for bot clicks.
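Taken together, these signals can be combined into a simple heuristic. The sketch below is a minimal illustration in Python, assuming you have exported per-source metrics (for example, from Google Analytics) into plain dictionaries; the field names and thresholds are hypothetical and should be tuned against your own baseline.

```python
# Minimal heuristic for flagging likely bot traffic per source.
# Thresholds and field names are illustrative assumptions, not official values.

def looks_like_bot(source):
    """Return True if a traffic source matches common bot signatures."""
    signals = 0
    if source["bounce_rate"] > 0.90:        # almost every visit is a single page
        signals += 1
    if source["pages_per_session"] > 50:    # far above normal human browsing
        signals += 1
    if source["avg_session_seconds"] < 3:   # e.g. the classic ~2-second session
        signals += 1
    return signals >= 2                     # require two independent signals

sources = [
    {"name": "google / organic", "bounce_rate": 0.45,
     "pages_per_session": 3.2, "avg_session_seconds": 140},
    {"name": "Microsoft Corp Network", "bounce_rate": 0.97,
     "pages_per_session": 1.0, "avg_session_seconds": 2},
]

for s in sources:
    print(s["name"], "-> likely bot" if looks_like_bot(s) else "-> looks human")
```

Requiring two independent signals keeps a single noisy metric (a one-page landing campaign, say) from triggering false positives.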

How To Stop Bots From Crawling My Site

When figuring out how to stop bots from crawling your site, it’s important to keep in mind that some bots are good – you want them crawling your site. While it’s possible to prevent all bots from engaging with your website, doing so also means you’ll fall out of Google’s search results, for instance.

Your first stop is your robots.txt file. This is a simple text file that gives bots visiting your site guidelines on what they can and can’t do. Without a robots.txt file, any bot will be able to visit your page. You can also set up your file so that no bots can visit your page (although see the warning above).
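For example, a minimal robots.txt might let compliant bots crawl most of the site while keeping them out of one directory and shutting out a specific bot entirely (the bot name and path here are purely illustrative):

```txt
# Allow all compliant bots, but keep them out of /private/
User-agent: *
Disallow: /private/

# Block one specific (hypothetical) bot entirely
User-agent: BadBot
Disallow: /
```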

The “middle ground” is to put rules in place, and the good news is that the “good” bots will abide by these. The bad news, however, is that the “bad” bots will disregard these rules entirely.

When it comes to the “bad” bots, you’ll need to engage a tech solution. This is where a CDN (Content Delivery Network) service comes in. One of the advantages of a good CDN is the protection it can provide against malicious bots and DDoS attacks. Two of the most common ones are Cloudflare and Akamai, which can stop some bots from crawling your site. As Cloudflare themselves say, “Cloudflare’s data sources will help reduce the number of bad bots and crawlers hitting your site automatically (not all)”.

There are also purpose-built anti-bot solutions that can be installed, but it’s important to note that most of these can protect your website relatively well, but cannot protect you outside of that – for example your ads on search engines and other properties.

Another, more tedious (and less effective) option is to manually block IPs where you know the traffic is bot-related. One trick is to check the geographic origin of your traffic: if your traffic usually comes from the US and Europe and you suddenly see a lot of IPs from the Philippines, it could be a bot or a click farm.
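The geographic check above can be scripted. The sketch below is a minimal illustration, assuming you have already resolved each IP in your access log to a country code (in practice you would use a GeoIP database such as MaxMind’s for that lookup); the IPs, countries, and expected-audience set are hypothetical.

```python
# Flag IPs whose country falls outside your usual audience.
# All data below is illustrative; resolve real IPs with a GeoIP database.

EXPECTED_COUNTRIES = {"US", "DE", "FR", "GB"}  # your normal audience

# Hypothetical pre-resolved log entries: (ip, country_code)
traffic_log = [
    ("203.0.113.7", "US"),
    ("198.51.100.23", "PH"),
    ("198.51.100.24", "PH"),
    ("192.0.2.55", "DE"),
]

# Collect IPs from unexpected countries as candidates for manual blocking
suspect_ips = sorted(ip for ip, country in traffic_log
                     if country not in EXPECTED_COUNTRIES)
print(suspect_ips)
```

The output is only a list of candidates for review – legitimate visitors can come from anywhere, so blocking should stay a manual decision.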

Why Is It Important To Protect Your Ads?

One of the biggest threats to your ad campaigns – and, by extension, to the future of your business – is bot traffic. Research by CHEQ and the University of Baltimore’s economics department showed that even opportunistic bots were set to cost businesses $10 billion in 2020. Bots can be programmed to click on your ads, leaving chaos in their wake: draining your AdWords account, causing Google to rate your ad’s performance as poor, stopping your ad from being displayed while competitors’ ads are featured prominently, and skewing conversion rates until your analytics are meaningless.

Bots are a major problem in today’s digital advertising industry, and taking a proactive approach to PPC protection is the only way to ensure that your ad campaigns are safe. Ad managers should use third-party software to determine how their traffic is being affected by bot activity. A third-party expert can help you mitigate the problem by tracking and blocking the bad actors and their empty clicks. ClickCease identifies fraudulent activity and provides insight into the location, tactics and user behavior behind every individual click – crucial in combating an issue that is overwhelming the online advertising industry and draining budgets. To protect your ads, look no further than ClickCease’s bot traffic detection tools.

When it comes to doing what’s best for your business, protecting your ads against bot traffic should be one of your highest priorities.

Get Your Ads Protected Now

Ilan Missulawin

Ilan is a co-founder and the CMO of ClickCease since 2015.
When he isn't dreaming about click fraud you can find him writing about it.
Two of his favorite things in life are: Peppa Pig and writing about himself in third person.

