The internet is just full of creepy crawly spider bot traffic, and most of the time I couldn't care less about these types of pests.
Well, for the most part, we don’t see them at all or even know about them.
It’s not like we jump out of our seats when we see bad spiders crawling our sites or anything like that.
So why care about things we can’t even see!?
Recently I was playing around in my Shareaholic account.
They have a tab for Human vs Bot analytics, and I scored a big fat F.
I haven’t had an F in anything since my failed attempt at Biology in my first year of university!
Bad Bot Traffic
Based on the verified quality of the traffic visiting your website, your site will get a letter grade (A+, A, A-, B+…) on this page that corresponds to the level of human traffic on your page. The closer you are to an A+, the better your website and marketing strategy are doing! – source
Negative Effects of Bad Bots
What I started to wonder was how could this be negatively affecting my site?
How much of my resources are just being wasted?
Could it affect my load speeds?
I’m always grinding to improve, especially for my “mobile” readers.
Am I about 5 seconds away from being hacked?
With such a high percentage of these bad spiders, I first had to ask the bad side of me… was I running some crap service that was doing this to myself?
Well, I couldn’t find anything running.
I know I’ve tested some questionable traffic type gigs in the past. But they seemed to have all ended their run months ago.
So I’ll have to dig deeper.
I definitely don’t want to be paying for server resources for traffic from Russia or other countries that doesn’t even have real eyeballs behind it.
Or wake up to find my server has been sending spam e-mail to everyone on the planet after being completely hacked.
So it’s time to get rid of these bots, or so-called “bandit bots”, all while making sure I don’t get rid of any good bots.
Like our friends at Google!
And those other search engines we barely remember the names of anymore.
First, I did some research on the types of bots that are terrorizing me and everyone else. (Yes, I know robots aren’t really bots, but the depiction of robots as something we can’t see is cool.)
Bots vs Humans 2016
According to Incapsula’s 2016 Bot Traffic Report, which examined 16.7+ billion visits to 100,000 randomly selected domains, some interesting numbers emerged.
Only 48.2% of website traffic is actually from humans! Meanwhile, 22.9% comes from “good bots” and 28.9% from those nasty bad bots. Just wait till you see the infographic that displays the breakdown, ranging from monitoring bots at 1.2% to impersonators at 24.3%. A huge number.
These “mofos” are the bots that assume false identities to bypass security.
Then they end up launching DDoS attacks.
Impersonator Bots are the go-to tool for hackers
- From scrapers to spammers
- Vulnerability scanners and DDoS bots
- The only bot type that continues to grow while others are shrinking
- Can solve CAPTCHAs
- Can do things like send tweets, create fake registrations and so on
- Represent 90% of all cyber attacks
The worst of the worst bots are the following:
- Nitol – 0.12% – a trojan that hijacks Windows computers and launches DDoS attacks.
- Cyclone – 0.09% – a DDoS bot that receives its attack orders from an IRC channel.
- Sentry MBA – 0.03% – a popular password-cracking tool that can also be used for DDoS attacks.
Well, alright we established these are bad, and we don’t want them.
But how do we get rid of them?
There is a lot of documentation out there on how to block “some” of them.
Usually with .htaccess files, by user agent, via referrer, by IP, and other similar methods. But I wanted something a bit more bulletproof. Something that would adapt to new spammers, hackers and other mischief going on.
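For example, a typical user-agent block in .htaccess looks something like this (the bot names here are made-up placeholders; real blocklists run to hundreds of entries and go stale fast, which is exactly the problem):

```apache
# Hypothetical example: return 403 Forbidden to any request whose
# User-Agent matches one of these (invented) bad-bot names
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (EvilScraper|SpamHarvester|NastyBot) [NC]
RewriteRule .* - [F,L]
```

The catch is that bad bots change user agents constantly, so a static list like this is always a step behind.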
Also, something any idiot can set up.
Since, let’s face it, the majority of us, even “technical” people, are still morons when it comes to security.
So I decided to set a trap…
Time To Say Goodbye To the Unwanted Bot Guests…
The Bad Bot Trap
Typically, good bots will follow your robots.txt file, which I talk a bit about on my technical SEO audit page. The bad bots don’t care, though. They ignore the “rules” because they think they’re cool, and for the most part they’re after your assets.
Step 1. Install the Blackhole for Bad Bots plugin (free or Pro).
Blackhole for Bad Bots does one thing and does it well: traps bad bots in a virtual blackhole. It’s a lot of fun, and very effective at stopping bad bots from visiting your site.
The idea is simple: add a hidden link to your site, forbid access to that link via robots.txt, and then automatically block any bots that disobey the rules and follow the link.
Both free and pro versions work the same way, but Blackhole Pro gives you some sweet features not included in the free version, like a deluxe bot log and the ability to add bots manually. Check out the Blackhole Pro tutorials at Plugin Planet to learn more.
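The trap’s logic is simple enough to sketch in a few lines of Python. To be clear, this is just an illustration of the idea, not the plugin’s actual (PHP) implementation, and the names here are made up:

```python
# Minimal sketch of the blackhole-trap idea -- NOT the plugin's real code.
# Bots that request the hidden URL (which robots.txt forbids) get their IP banned.

TRAP_PATH = "/?blackhole"  # must match the Disallow rule in robots.txt

blocked_ips = set()  # the plugin persists this; a plain set is enough for the sketch

def handle_request(ip: str, path: str) -> bool:
    """Return True if the request may proceed, False if it is blocked."""
    if ip in blocked_ips:
        return False         # previously trapped bot: everything is denied
    if path == TRAP_PATH:
        blocked_ips.add(ip)  # bot ignored robots.txt and followed the hidden link
        return False         # ...so it gets banned on the spot
    return True              # normal visitor, or a rule-abiding good bot
```

Good bots never request the trap URL because they obey robots.txt, so only the rule-breakers ever end up in the blocklist.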
Step 2. Update Robots.txt file
After installing the plugin, I updated my robots.txt file via FileZilla (because that’s the only way I can do it on WP Engine) and added Disallow: /?blackhole to the file.
It should look something like this.
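For reference, a minimal robots.txt with the trap rule might look like this (your file will likely have other entries too):

```
User-agent: *
Disallow: /?blackhole
```

The Disallow line is what tells well-behaved bots to stay away from the trap URL.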
Step 3. Now we wait!
… and it wasn’t more than 40 minutes later that I had some victims falling into the trap! (While double-checking my IP to make sure one wasn’t me testing… of course.)
Good to see some fellow Canadians in the list!
At least they say “sorry” after they brute force your login.
Hopefully, this will completely get rid of the problem. I won’t know for a few weeks.
I’m guessing that’s because all traps take time to catch their prey, but the evidence is showing that it will work.
Let me run this plugin for a few weeks. Then I’ll do a fresh screenshot of my human vs bot analytics. Then give you the final verdict on getting rid of these pests, and how it affects my time to first byte and all that good stuff.
Here’s that awesome bot infographic I mentioned!
The question I ask you right now is: do you know how much of your traffic is just useless bots? Do tell below!