Prioritize scanned URLs

Comments


  • Hi Harry,

    We do not limit the crawl based on robots.txt or sitemaps, as doing so could cause accidental noncompliance. Instead, we make sure we scan all unique content that visitors can navigate to and access on the website.

    We have added a filter and started a scan, and it completed at just over 1,500 pages.

    Regards,
    Elina

