Our sites automatically protect against "rogue" web crawlers that crawl aggressively, except for those with known IP ranges and known user-agents (and sometimes a combination of the two, e.g. Googlebot is allowed when it comes from the correct IP ranges).
Hence when a cookie scan is run, an HTTP status code 429 is returned some of the time: we throttle the access because too many pages have been requested in a short time, i.e. we assume it's a bot and purposefully slow it down.
I can see Cookiebot is using Microsoft as its hosting platform; i.e. examples from
The issue is that the User-Agent supplied, i.e. "Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/66.0.3359.117+Safari/537.36", doesn't identify Cookiebot.
PLEASE.... Can you
a) Add the phrase "Cookiebot" to the User-Agent, or
b) Allow us to define, per web site/account in Cookiebot, the user-agent that is used?
Either way we then know "who" the scanner is and can "un-throttle" it.
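For illustration, here's a minimal sketch (in Python; the function name and the "Cookiebot" token are my assumptions, since neither option has been implemented) of how our throttling layer could allowlist a scanner that identifies itself in the User-Agent header:

```python
def is_allowlisted_scanner(user_agent: str) -> bool:
    """Return True when the request self-identifies as the cookie scanner.

    Assumes option (a) above: the scanner appends the token "Cookiebot"
    to its User-Agent string. Matching is case-insensitive.
    """
    return "cookiebot" in user_agent.lower()


# The generic Chrome UA the scanner currently sends is NOT recognised,
# so it gets throttled (429) like any other anonymous bot:
generic_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/66.0.3359.117 Safari/537.36")

# A UA carrying the proposed token would pass the check and skip throttling:
tagged_ua = generic_ua + " Cookiebot/1.0"
```

The same substring check works just as well in a WAF rule or web-server config; the point is simply that an identifying token gives us something deterministic to match on, ideally combined with an IP-range check as we already do for Googlebot.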
I've posted this under Bugs and Errors, as the lack of an identifying User-Agent is causing incorrect scan results.