Tired of search engines and other web spiders showing up in your Google Analytics reports and skewing your data? Google brings an important update to Google Analytics today that finally makes it easy to filter bots and spiders from your user stats.
Fake traffic generated by bots and spiders is bad for two reasons:
- They skew your data, artificially inflating visits and unique visitors, increasing bounce rate, and decreasing pages/visit, average visit duration, goal conversion rate, ecommerce conversion rate, etc.
- They increase the negative side effects of sampled data in Google Analytics. Even though the visits come from bots, they still count toward the visit totals used for sampling.
Google is now using the IAB’s “International Spiders and Bots List,” which is updated monthly, to filter out traffic from search engines and other spiders. If you want to know the names of the bots and spiders on that list, you have to pay somewhere between $4,000 and $14,000 for an annual subscription, depending on whether you are an IAB member.
Once you opt in to filtering this kind of fake traffic, Analytics will automatically start excluding it from your stats by comparing the User-Agent of each visit to your site against the known User-Agents on the list. Until now, you had to filter out this kind of traffic manually.
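The matching Analytics performs can be sketched in a few lines. This is a minimal illustration of User-Agent-based filtering, not Google's actual implementation, and the signature list below is a small hypothetical sample rather than the paid IAB list:

```python
# Hypothetical sample of bot User-Agent signatures (the real IAB list
# is much larger and only available by paid subscription).
BOT_SIGNATURES = [
    "googlebot",
    "bingbot",
    "yandexbot",
    "baiduspider",
]

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# Keep only hits that look like real human traffic.
hits = [
    {"ua": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    {"ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"},
]
human_hits = [h for h in hits if not is_bot(h["ua"])]
```

Here the first hit is dropped as bot traffic and only the second survives; a real filter would also have to handle bots that spoof ordinary browser User-Agents, which simple list matching cannot catch.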
The checkbox for enabling this feature can be found by clicking View Settings under All Website Data on the Admin page in Google Analytics.
Depending on your site, you may see a decrease in your traffic once the filter is on. That’s to be expected: the new, lower numbers reflect only your real human visitors.
Via: TechCrunch