Marketers understand the power of automation: it saves time and resources and helps you achieve better results. But some automation tools can be counterproductive, or even used for fraud. And some put a massive strain on your website.
As a site owner, you should know that these tools can visit your site to scrape information, scan content, interact with your web pages, and sometimes even click on PPC ads. Many automation tools use web crawlers (also called spiders), which are a form of bot. And increasingly, we're becoming aware of the issues bots cause on the internet.
Not all bots are bad, of course. But knowing which ones have a negative effect on your site and what you can do to block them is becoming more important.
What do these automation tools want?
In many cases, automation tools are incredibly useful. As a digital marketer, chances are that you use a number of automation tools that use bots.
Keyword research software, PPC reporting tools, and social media monitoring packages all use bots to collect their raw data. But these very much fall into the good bots category.
So how about those bad bot automation tools? First, let’s summarize automation tools that use good bots vs. bad bots.
Good bots help users all over the internet to interact with your content. They also help you maintain your website. Some examples of good bots include:
- Search engine crawlers: Collecting information for Google, Bing, Yandex, and others, these useful bots ensure the information is on hand when you run a search
- Social network crawlers: Bots on social media can be used to collect information on trending hashtags, repost/retweet content, and even offer useful advice in real time
- Site monitoring bots: These include Uptimebot, WordPress pingbacks, and others. They ping your website to monitor its performance and uptime
- Marketing bots: These bots are used by SEO and content marketing services to crawl your site to deliver information on keyword research and more. This can include PPC, SEO, and social media analysis
- Chat bots: These useful bots automate interaction with your customers and technically don’t crawl the internet. But some of them can collect information from your site and further afield
Bad bots are programmed to perform tasks that hurt your website, visitors, or ad campaigns. These are the ones you need to target when protecting your website from bots.
- Web scraping bots: Using bots to steal content and harvest contact details (to send spam emails to) is one of the most common (and annoying) ways that automation tools affect your site
- Click bots: These fraudulent automation tools click on whatever they're hired to click on. Sometimes that's simply inflating site traffic, but it can also be your paid search or display ads
- Spam bots: Automated spam bots are used to create spammy backlinks or to generate tons of useless content in your comment fields. They can also be used as part of DDoS attacks
- Carding bots: Credit card fraud is often carried out by automated bots – with thousands of transactions processed in seconds
- Account takeover bots: These bots hijack accounts to access personal data and sensitive information like bank accounts and credit cards. Once obtained, the information can be used for identity theft or fraudulent purchases
Do I need to block good bots?
Most automation tools used for genuine and useful purposes need to scan your site to do their job, so you do want to let them in.
As for their impact, Google knows that PPC reporting tools don't represent valid clicks on your ads, so these clicks are not counted. In short, you do not need to block 'good bots', such as those used by Moz or SpyFu.
However, there are also plenty of black hat SEO and PPC tools out there which go beyond mere scans and data collection. Content scraping and link injection are two of the most common tactics.
So although Google may pick up the well-known automation tools, there is a case to be made for monitoring your traffic to watch for black hat tools. And there are plenty of those.
Why are bad bots becoming a greater threat?
Besides stealing information and running up ad budgets, these programs are also becoming increasingly sophisticated. They can now mimic human behavior, making it harder than ever for marketers to block bots by just looking at their actions.
Consider that these bad bots made up 27.7% of all global website traffic in 2021, up from 25.6% in 2020. Additionally, evasive bots – bots that skirt around standard security measures – make up 65% of all bad bot traffic.
So not only is bot activity increasing online, but a large chunk of these bots are also so advanced that simple measures don’t stop them.
The good news is that some strategies still work to block bots with malicious intent.
5 ways to protect your website from bots
1. Ignore bots in Google Analytics
Bots can skew your website's analytics data, making it harder to accurately interpret the numbers and make appropriate decisions. So a great first step is to find out how to exclude bot traffic in Google Analytics.
It's as easy as navigating to Admin > View Settings and ticking the option to "Exclude all hits from known bots and spiders". This isn't a permanent solution because it doesn't keep the bots off your site.
However, filtering bots out of Google Analytics will give you more accurate data and let you make informed decisions.
2. Use CAPTCHA tests
CAPTCHA is a pretty straightforward filter and should be your first line of defense. It will protect websites from bots by presenting simple tests that are easy for humans to respond to (for the most part) and tough for less advanced bots.
CAPTCHA isn't perfect, as you've probably seen from some of those frustrating "are you a robot?" questions. One study has also found that CAPTCHA may lead to lower conversions, so use this approach with caution.
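Whichever CAPTCHA provider you use, the server-side step is the same: the provider verifies the user's challenge token and returns a JSON verdict, and your code decides whether to accept the submission. Here's a minimal Python sketch of that decision logic, assuming a response shaped like Google reCAPTCHA's siteverify endpoint (a boolean `success` key, plus an optional `score` from 0.0 to 1.0 for score-based variants like reCAPTCHA v3); the `min_score` threshold is an illustrative choice, not a recommended value.

```python
import json

def is_human(siteverify_response: str, min_score: float = 0.5) -> bool:
    """Decide whether to accept a form submission based on the JSON body
    returned by a CAPTCHA verification endpoint. Assumes a reCAPTCHA-style
    response: a boolean "success" key and, for score-based variants,
    an optional "score" between 0.0 (likely bot) and 1.0 (likely human)."""
    data = json.loads(siteverify_response)
    if not data.get("success", False):
        return False  # the token failed verification outright
    # Score-based CAPTCHAs grade the request instead of hard-failing it.
    return data.get("score", 1.0) >= min_score

# A checkbox-style response with no score passes on "success" alone:
print(is_human('{"success": true}'))                 # True
# A score-based response below the threshold gets rejected:
print(is_human('{"success": true, "score": 0.2}'))   # False
```

In production you would obtain `siteverify_response` by POSTing the user's token and your secret key to the provider's verification URL; the sketch only covers what to do with the answer.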
3. Use a firewall
Firewalls are great for blocking unknown threats from accessing your website. They usually have a database of known malicious user agents and IP addresses, and those get blocked automatically.
One of the shortcomings of this approach is that some bots use hundreds of IP addresses, and they can rotate them fairly easily. As a result, many will get through. Some bots also use residential IPs with good reputations, which means that blocking those could keep real human users out of your site too.
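The core of this kind of filtering is simple: compare each request's user agent and IP address against blocklists. Here's a minimal Python sketch of that check; the blocklist entries are hypothetical examples (real firewalls ship large, curated, regularly updated databases), and the blocked network uses a reserved documentation range.

```python
import ipaddress

# Hypothetical blocklists for illustration only -- real firewalls maintain
# far larger databases of known malicious user agents and IP ranges.
BLOCKED_AGENT_SUBSTRINGS = {"badbot", "scrapy", "python-requests"}
BLOCKED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]  # TEST-NET-3 range

def should_block(user_agent: str, client_ip: str) -> bool:
    """Return True if the request matches a blocked user agent or IP range."""
    ua = user_agent.lower()
    if any(fragment in ua for fragment in BLOCKED_AGENT_SUBSTRINGS):
        return True
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in BLOCKED_NETWORKS)

print(should_block("Scrapy/2.11 (+https://scrapy.org)", "198.51.100.7"))  # True
print(should_block("Mozilla/5.0 (Windows NT 10.0)", "203.0.113.42"))      # True
print(should_block("Mozilla/5.0 (Windows NT 10.0)", "198.51.100.7"))      # False
```

This also illustrates the weakness described above: a bot that rotates to an unlisted IP and spoofs a mainstream browser's user agent sails straight through, which is why firewalls are one layer of defense rather than a complete solution.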
4. Use hidden fields
Some webmasters have seen success with hidden fields, or honeypots, when trying to block bots. The idea is that these fields are hidden from regular users with CSS but still present in the HTML, so bots that parse your pages will see them.
The field serves as a trap. And since many bots tend to fill all available fields, you can immediately filter out those submissions.
Use this approach with caution because there are two downsides. The first is that sophisticated bots can identify and ignore hidden fields just like humans do. The second is that search engines tend to penalize pages with hidden fields.
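The server-side half of a honeypot is a one-line check: if the hidden field came back non-empty, a bot filled it in. Here's a minimal Python sketch; the field name `website` is a hypothetical example (honeypot fields are often named to look like real inputs), and it's assumed to be hidden from humans via CSS such as `display: none`.

```python
def looks_like_bot(form_data: dict) -> bool:
    """Flag a form submission as bot traffic if the honeypot field was filled.
    'website' is a hypothetical honeypot field name: it is rendered in the
    page's HTML but hidden from human visitors with CSS, so only a bot
    auto-filling every field it finds will put a value in it."""
    return bool(form_data.get("website", "").strip())

# A human leaves the invisible field empty:
print(looks_like_bot({"name": "Ada", "email": "ada@example.com", "website": ""}))        # False
# A bot fills every field it can see in the HTML:
print(looks_like_bot({"name": "Bot", "email": "x@spam.test", "website": "spam.test"}))   # True
```

Note that the check treats a missing field the same as an empty one, so the form still works if the honeypot input is ever removed from the template.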
5. Bot blocking software
An obvious way to keep bots and malicious automation tools off your website is to block bad bots. Software such as Bot Zapping by ClickCease is designed to filter out spam, carding, DDoS, and other fraudulent bots from your WordPress website.
The benefit of using ClickCease is that, as well as protecting your direct traffic, it blocks click fraud bots on your ads.
Let the good guys in, keep the bad guys out
For most marketers, the focus is more often on winning more customers and optimizing conversion costs. The issue of bad traffic on your paid campaigns is one that many choose to overlook, seeing it as a non-issue.
But the world of automation keeps growing, and there are more and more bot-based tools available for all kinds of tasks.
In fact, the majority of web traffic is made up of automated tools such as web crawlers and bots. And many of these are used for doing sneaky stuff such as fraud.
Blocking bot traffic and click fraud is more important than ever. Especially if you’re spending lots of money on your ad campaigns.
If you’ve never taken a closer look at the traffic on your site, you can run a free audit with ClickCease.
Sign up for our free 7-day trial to get a detailed analysis of your paid traffic. You can also use our Bot Zapping tool to view organic and direct traffic on your WordPress sites.