To SEOs, log files are the go-to tool for understanding Googlebot behavior.  But did you know they can also be used to identify bad bots crawling your site?  Finding out which bots are hitting your site matters: these bots execute JavaScript, inflate your analytics, consume server resources, and scrape and duplicate your content.
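One quick, concrete check along these lines: many bad bots spoof the Googlebot user agent, and Google's documented way to verify a genuine Googlebot is a reverse-then-forward DNS lookup. Here is a minimal Python sketch of that check (the sample IP is purely illustrative):

```python
import socket

def is_genuine_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check for a hit claiming to be Googlebot.

    Google documents that real Googlebot IPs reverse-resolve to a host under
    googlebot.com or google.com, and that host resolves back to the same IP.
    """
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup: IP -> hostname
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False

# Illustrative IP taken from a log line with a Googlebot user agent string.
print(is_genuine_googlebot("66.249.66.1"))
```

Any "Googlebot" that fails this check is a bad bot wearing a disguise.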

A 2014 bot traffic report by Incapsula looked at 20,000 websites of various sizes over a 90-day period and found that bots accounted for over half (56 percent!) of all website traffic.  Of those bots, 29 percent were malicious.  Not surprisingly, the bigger you've built your brand, the bigger a target you become for these bots.

There are a number of services that will automate advanced techniques for dealing with these bad bots, but there are some things you can do yourself to get started eliminating them from your site.  Ben Goodsell has a post on Search Engine Land that is an easy starting point for anyone who wants to get rid of bad bots, and all you need is Excel.  From there, Ben will help you understand the basics of using log files, blocking bad bots at the server level, and cleaning up your analytics reports.
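Ben's walkthrough does the first pass in Excel, but the same triage is easy to script. As a minimal sketch (assuming a combined-format Apache or Nginx access log at the hypothetical path access.log), the following tallies requests per user agent so unusually chatty crawlers stand out:

```python
import re
from collections import Counter

# Combined Log Format: the user agent is the last quoted field on each line.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] ".*?" \d{3} \S+ ".*?" "(.*?)"$')

def top_user_agents(log_path: str, n: int = 20):
    """Count requests per user agent in a combined-format access log."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            match = LINE_RE.match(line.strip())
            if match:
                counts[match.group(2)] += 1
    return counts.most_common(n)

# Hypothetical path; point this at your own server's access log.
for agent, hits in top_user_agents("access.log"):
    print(f"{hits:>8}  {agent}")
```

Any user agent with a suspiciously high hit count is a candidate for the verification and server-level blocking steps Ben covers.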

Check out Ben’s post by following the link below!

Search Engine Land: 3 Steps To Find And Block Bad Bots