Cookiebot scan lists the wrong number of pages, which can put my website over 5,000 pages

Answered


3 comments

  • Kenan

    Hi Ahgjobapplication,

    Your tag pages are real pages that can set cookies and other forms of online trackers; technically, they are unique pages that return different content based on the tag.

    Similar questions have already been answered here:

    https://wordpress.org/support/topic/not-interesting-for-bloggers/

    https://support.cookiebot.com/hc/en-us/articles/360003773214-How-does-the-Cookiebot-scanner-define-pages-on-a-website-

    https://wordpress.org/plugins/cookiebot/#what%20does%20cookiebot%20count%20as%20pages%3F

  • Ahgjobapplication

    For me this is technically one page that shows different results based on keywords. That said, Cookiebot may not be the solution for several of our websites, as we have a similar feature on those sites where keywords are linked to a search page to give users an option to quickly search by keyword. As I said in my comment, we have 300-400 articles, and each article has a set of 15-20 linked keywords. Cookiebot lists each of these as a separate page, which puts us on a monthly subscription of $50/month, which is expensive over a year. We could spend the same money to have a custom plugin developed by a third party, which we could then use for all of our other websites and which would be much cheaper in the long run.

    Yes, this is a nice plugin, but it still does not automatically make our website GDPR compliant; we still have to make changes to get things working, as each site is different. You are justifying the cost of scanning in one of the blog links. We don't want Cookiebot to do the scanning; we could add these cookies manually. This could be a feature you add, so that the service becomes cheaper for websites that have a large number of pages but only use 5-8 different types of cookies.

    Regards

  • Kenan

    You may consider your search page to be a single page because it uses the same template for every keyword. However, when a search engine such as Googlebot crawls your site, it will index each of those keyword pages as a unique page, unless you have filtered them out through robots.txt.
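For example, if your keyword pages live under a common path, a robots.txt rule along these lines would keep search engines from indexing them (the /tag/ path here is an assumption; adjust it to your actual URL structure):

```
User-agent: *
Disallow: /tag/
```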

    Cookiebot does not take your sitemap or robots.txt into consideration, though, as these unique pages can still set trackers and contain different content based on the keyword.

    One scenario where we do filter certain pages is when a page shows the same content in different ways. This could be a webshop where you can set different "view" modes: one mode shows the products in a grid, the other shows them in a list. In that case you are viewing the same content, just displayed differently.
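A minimal sketch of that kind of filtering, assuming the view mode is carried in a query parameter named "view" (the parameter name and URLs are hypothetical): URLs that differ only by the presentation parameter are collapsed into one canonical page, while URLs with different content parameters stay distinct.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed name of the purely presentational query parameter.
PRESENTATION_PARAMS = {"view"}

def canonicalize(url: str) -> str:
    """Strip presentation-only parameters so view variants map to one page."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in PRESENTATION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

urls = [
    "https://shop.example/products?category=shoes&view=grid",
    "https://shop.example/products?category=shoes&view=list",
    "https://shop.example/products?category=hats&view=grid",
]
unique_pages = {canonicalize(u) for u in urls}
print(len(unique_pages))  # 2: the two shoe URLs collapse into one page
```

Pages reached via different keywords, by contrast, carry different content and would not be merged by a scheme like this.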

    "Yes, this is a nice plugin, but it still does not automatically make our website GDPR compliant; we still have to make changes to get things working, as each site is different. You are justifying the cost of scanning in one of the blog links. We don't want Cookiebot to do the scanning; we could add these cookies manually. This could be a feature you add, so that the service becomes cheaper for websites that have a large number of pages but only use 5-8 different types of cookies."

    The in-depth scans of your website are an integrated feature of the Cookiebot service. The scanner has been developed and refined over the past 6 years to ensure that it finds not only the types of cookies you could detect manually yourself, but also dynamic cookies, as well as trackers set by third-party services (often unknown to the website owner and undetectable in the browser). We need to ensure that a full overview of all the tracking taking place is presented to users when they are asked for their consent. Our reputation rests on this, and so does your legal responsibility as a website owner, so this is not something we can compromise on.

