Most of us know that spiders and bots visit your site every day, often multiple times a day. For those who don’t know, here is the definition
of a bot according to Wikipedia:
An Internet bot, also known as web robot,
WWW robot or simply bot, is a software application that runs automated
tasks (scripts) over the Internet. Typically, bots perform tasks that are both
simple and structurally repetitive, at a much higher rate than would be
possible for a human alone.
Many people assume that tag-based analytics solutions such as Google Analytics, Adobe Analytics, etc. either stop bots
from executing the script or filter them out before calculating the metrics. Well, this assumption was largely true years ago, when most bots did not execute JavaScript the way they hit log
file based solutions. However, things have changed. The likes
of Google Analytics and Adobe Analytics will filter out the spiders and bots to
an extent, but considering the number of new bots that emerge every day, it is
not an easy task, neither for them nor for you.
So what do you do? Since you are
not going to get a 100% bot-free report, filter out as much as possible so that the
effect on your reporting and analysis is minimal.
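If you have access to raw, hit-level data with user-agent strings, one common first pass is to drop hits whose user agent matches a known bot signature. Here is a minimal sketch of that idea; the signature list is illustrative only, and a real deployment would use a maintained list such as the IAB/ABC International Spiders &amp; Bots List:

```python
# Illustrative bot filtering by user-agent substring matching.
# KNOWN_BOT_SIGNATURES is a small hypothetical sample; production
# filtering should rely on a maintained, regularly updated list.

KNOWN_BOT_SIGNATURES = [
    "googlebot", "bingbot", "slurp", "duckduckbot",
    "baiduspider", "yandexbot", "ahrefsbot", "semrushbot",
    "crawler", "spider",
]

def is_bot(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

def filter_bot_hits(hits):
    """Keep only hits whose user-agent does not look like a bot."""
    return [h for h in hits if not is_bot(h.get("user_agent", ""))]

# Hypothetical hit records, as they might appear in raw analytics data.
hits = [
    {"page": "/home", "user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"},
    {"page": "/home", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"page": "/pricing", "user_agent": "AhrefsBot/7.0; +http://ahrefs.com/robot/"},
]
clean = filter_bot_hits(hits)  # only the real Chrome visit survives
```

Keep in mind this only catches bots that identify themselves honestly; bots that spoof a browser user agent require behavioral filtering (request rate, missing assets, no mouse or scroll events) on top of this.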
An attendee of my Digital Analytics Association (DAA) workshop at Chicago eMetrics told me that she gets a report every
week from her digital analyst. When asked, the digital analyst confirmed that the report does
not filter out any activity from spiders and bots because he does not have time to
remove them. She was wondering whether she should worry about it and push to remove
the bots from the report or just accept it.
What do you think? Do you think the report she is getting is
accurate? Unless they are selling to spiders and bots, she is not
getting an accurate picture of website usage by real customers.
Make sure to ask your analytics team if they are removing
bots from reporting. If not, then do not accept the reports until they have done the
cleanup and are paying attention to it on an ongoing basis.