The Cookiebot scanner crawls your website – typically over a period of up to 24 hours – to find all cookies and tracking technologies in use.
The scanner simulates a number of regular users visiting your website. You can think of it as roughly five simultaneous visitors going through all the subpages, clicking every link, menu item and button, playing embedded videos and taking every action a visitor could take on the website being scanned, all while registering the cookies and trackers in use. The only things the scanner does not do are fill in forms (such as actually subscribing to your newsletter) and pay for goods in a shopping cart (if you run a webshop, the scanner will place items in the cart but will not proceed to checkout and pay for them).
The scans are done this way to ensure that every cookie and tracker a regular user could possibly encounter on your website is found.
Our scanner does not cache any resources, so on every page load all resources on your website are fetched again. This can cause more load than you would otherwise expect from around 5 concurrent visitors.
What can I do to prevent this spike in traffic?
There is no other way to perform the scans, so the traffic itself cannot be prevented. You can, however, exclude it from your regular website statistics: see Exclude Cookiebot from Google Analytics data.
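The linked article covers filtering the scanner's traffic inside Google Analytics itself. As an alternative, the check can also be done client-side before the analytics snippet loads. The sketch below assumes the scanner identifies itself with a user-agent string containing "Cookiebot" – verify this against your own server logs, and adjust the pattern if yours shows a different identifier:

```javascript
// Sketch: skip loading analytics for scanner traffic.
// ASSUMPTION: the Cookiebot scanner's user-agent string contains
// "Cookiebot". Check your server logs before relying on this.
function isCookiebotScanner(userAgent) {
  return /cookiebot/i.test(userAgent || "");
}

function loadAnalytics() {
  if (isCookiebotScanner(navigator.userAgent)) {
    return; // do not record the scanner's page views
  }
  // ...load your regular Google Analytics snippet here...
}
```

Note that a server-side or Analytics-side filter (as described in the linked article) is more robust, since a client-side check only helps with scripts you control.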
Comments
Hello, how often is the scan repeated? Or is it once every 24 hours? TIA!