Prioritize scanned URLs

Comments

  • Hi Harry,

We do not limit the crawl based on robots.txt or sitemaps, as doing so could cause accidental noncompliance. Instead, we make sure we scan all unique content that visitors can navigate to and access on the website.

We have added a filter and started a scan, which completed at just over 1,500 pages.

    Regards,
    Elina
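
    The behavior Elina describes, visiting every unique URL a visitor can reach while scoping the scan with an explicit filter rather than robots.txt or a sitemap, can be sketched roughly as follows. This is a minimal illustration, not the vendor's actual implementation; the site graph and the excluded prefix are hypothetical stand-ins.

    ```python
    from collections import deque

    # Hypothetical link map standing in for real fetched pages;
    # in a real scan these links would come from parsing each page.
    SITE_LINKS = {
        "https://example.com/": ["https://example.com/a", "https://example.com/admin/x"],
        "https://example.com/a": ["https://example.com/b"],
        "https://example.com/b": [],
        "https://example.com/admin/x": [],
    }

    def crawl(start, exclude_prefixes=()):
        """Breadth-first crawl of all unique reachable URLs.
        Scope is narrowed only by an explicit filter (exclude_prefixes),
        not by robots.txt or a sitemap."""
        seen, queue, scanned = set(), deque([start]), []
        while queue:
            url = queue.popleft()
            if url in seen or any(url.startswith(p) for p in exclude_prefixes):
                continue
            seen.add(url)
            scanned.append(url)
            queue.extend(SITE_LINKS.get(url, []))
        return scanned

    if __name__ == "__main__":
        # A filter added to scope the scan, analogous to the one
        # mentioned above (the prefix here is hypothetical).
        print(crawl("https://example.com/",
                    exclude_prefixes=("https://example.com/admin",)))
    ```

    Deduplicating on the full URL is what keeps the final page count (e.g. "just over 1,500 pages") bounded to unique content rather than every link occurrence.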
