Cookiebot scan lists the wrong number of pages for my site, which can put my website over 5,000 pages
I am a paid user, and one of my websites has around 5-6 general pages and about 400-500 article pages. I started the scan and noticed in Google Analytics that it is listing a lot of pages for the keywords that are highlighted as links on the article pages.
For example, on an article page we have text of about 500 words plus about 10-15 tags/keywords that are linked to the site's search functionality. The Cookiebot scan lists each of these as a separate page, so if I have 10 keywords in an article linked to the search page, it counts them as 10 different pages, while to me it is just one page.
Sample text
This is sample text TagOne This is sample text TagTwo, This is sample text Tagthree This is sample text TagFour This is sample text TagSeven This is sample text TagEight
Example
- www.domain.com/Search/TagOne
- www.domain.com/Search/TagTwo
- www.domain.com/Search/TagThree
- www.domain.com/Search/TagFour
- www.domain.com/Search/TagFive
- www.domain.com/Search/TagSix
- .......
- ......
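Presumably the tag links in the article body are marked up roughly like the hypothetical sketch below (this is not the site's actual code); each keyword is a separate anchor pointing to its own /Search/ URL, which is why a crawler treats each one as a distinct page:

    <p>
      This is sample text <a href="/Search/TagOne">TagOne</a>
      This is sample text <a href="/Search/TagTwo">TagTwo</a>
      This is sample text <a href="/Search/TagThree">TagThree</a>
      <!-- ...one anchor per tag/keyword, each pointing to its own /Search/ URL... -->
    </p>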
What happens in this scenario is that the scan lists each tag as a separate page, while technically it is one page. My website may not have more than 500 pages, but I am afraid that the way the scan works it will list my website at over 5,000 pages, which will put me on a higher bill. How can I limit the scanner to scan only 100, 200, or 500 pages so that I always stay within my subscription plan?
Otherwise I am afraid I will not be able to use Cookiebot and will have to move to other options.
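As a rough, illustrative calculation only (the page counts below are the poster's own estimates, the number of distinct tags is an assumption, and it is assumed the scanner counts each unique URL once), the total number of unique URLs a crawler can discover grows with the number of distinct /Search/ tag pages, not with tags-per-article multiplied by the number of articles:

    # Illustrative arithmetic only -- not Cookiebot's actual counting logic.
    general_pages = 6        # poster's estimate: 5-6 general pages
    article_pages = 450      # poster's estimate: 400-500 article pages
    tags_per_article = 15    # poster's estimate: 10-15 tags per article
    distinct_tags = 3000     # assumption: number of *different* tags used across the whole site

    link_instances = article_pages * tags_per_article             # 6,750 tag links in total
    unique_pages = general_pages + article_pages + distinct_tags  # 3,456 unique URLs to crawl
    print(link_instances, unique_pages)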
-
For me this is technically one page that shows different results based on the keyword. Having said that, Cookiebot may not be a solution for several of our websites, as we have a similar feature on them where keywords are linked to a search page to give users an option to quickly search by keyword. As I said in my comment, we have 300-400 articles, and each article has a set of 15-20 keywords which are linked, and Cookiebot lists each of these as a separate page. That puts us on a monthly subscription of over $50/month, which is expensive over a year. We could spend the same money to have a custom plugin developed by a third party, which we could use for all of our other websites and which would be much cheaper in the long run.
Yes, this is a nice plugin, but it still does not automatically make our website GDPR compliant; we still have to make changes to make things work, as each site is different. You are justifying the cost of scanning in one of the blog links. We don't want Cookiebot to do the scanning; we can add these cookies manually. This could be a feature you add so that it becomes cheaper for websites that have a large number of pages but only use 5-8 different types of cookies.
Regards
-
Hi Ahgjobapplication,
Your tag pages are real pages that can set cookies and other forms of online trackers, and technically they are real unique pages that return different content based on the tag.
Similar questions have already been answered here:
https://wordpress.org/support/topic/not-interesting-for-bloggers/
https://wordpress.org/plugins/cookiebot/#what%20does%20cookiebot%20count%20as%20pages%3F
-
You may consider your search page to be a single page because it uses the same template for all keywords; however, when a search engine such as Googlebot passes by, it will index each one of those pages as a unique page, unless you have filtered them out through robots.txt.
Cookiebot does not take your sitemap or robots.txt into consideration though, as these UNIQUE pages can still set trackers, and contain different content based on the keyword.
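For reference, keeping search-engine crawlers out of the search pages with robots.txt would look roughly like the sketch below (the /Search/ path is taken from the example URLs earlier in the thread); as noted above, this only affects crawlers that honour robots.txt, such as Googlebot, and does not change what Cookiebot scans:

    # Hypothetical robots.txt sketch -- blocks the keyword search pages for
    # crawlers that honour robots.txt; it does not affect the Cookiebot scanner.
    User-agent: *
    Disallow: /Search/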
One scenario where we do filter certain pages is when the page shows the SAME content in different ways. This could be a webshop where you set different "view" modes, e.g. one mode shows the products in a grid, the other mode shows them in a list. In this case, you are viewing the same content, but displayed differently.
You wrote: "Yes, this is a nice plugin, but it still does not automatically make our website GDPR compliant; we still have to make changes to make things work, as each site is different. You are justifying the cost of scanning in one of the blog links. We don't want Cookiebot to do the scanning; we can add these cookies manually. This could be a feature you add so that it becomes cheaper for websites that have a large number of pages but only use 5-8 different types of cookies."
The in-depth scans of your website are an integrated feature of the Cookiebot service. The scanner has been developed and refined over the past 6 years to ensure that it finds not only the types of cookies you could detect manually yourself, but also dynamic cookies and trackers set by third-party services (often unknown to the website owner and undetectable in the browser). We need to ensure that a full overview of all the tracking going on is presented to the users when they are asked for their consent. Our reputation rests on this, and so does your legal responsibility as a website owner, so this is not something we can compromise on.