It gathers key log file data to allow SEOs to make informed decisions. Some of the common uses include:

Identify Crawled URLs: View and analyse exactly which URLs Googlebot and other search bots are able to crawl, when and how frequently.

Discover Crawl Frequency: Gain insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.

Find Broken Links & Errors: Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.

Audit Redirects: Find temporary and permanent redirects encountered by search bots, which may differ from those seen in a browser or simulated crawl.

Improve Crawl Budget: Analyse the most and least crawled URLs and directories of the site to identify waste and improve crawl efficiency.

Identify Large & Slow Pages: Review the average bytes downloaded and time taken to identify large pages or performance issues.

Find Uncrawled & Orphan Pages: Import a list of URLs and match it against log file data to identify orphan or unknown pages, or URLs which Googlebot hasn't crawled.

Combine & Compare Any Data: Import and match any data with a 'URLs' column against log file data, such as crawls, directives or external link data, for advanced analysis.
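The core ideas behind several of these uses (filtering log lines by bot user agent, counting crawl events and errors per URL, and matching a crawled URL list against log data to find orphan and uncrawled pages) can be sketched in a few lines of code. This is not how the Log File Analyser itself is implemented; it is a minimal illustration assuming a standard combined access log format, with a hypothetical sample log and URL list.

```python
import re
from collections import Counter

# Regex for the common combined access log format (an assumption; real
# server configurations vary). Captures request path, status and user agent.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-).*"(?P<agent>[^"]*)"$'
)

def bot_hits(log_lines, bot_token="Googlebot"):
    """Count crawl events per URL for one bot, plus 4xx/5xx error responses."""
    hits, errors = Counter(), Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m or bot_token not in m.group("agent"):
            continue  # skip unparseable lines and other user agents
        hits[m.group("path")] += 1
        if m.group("status").startswith(("4", "5")):
            errors[m.group("path")] += 1
    return hits, errors

def compare_url_lists(known_paths, hits):
    """Match an imported URL list against log data.

    Returns (orphans, uncrawled): URLs bots requested but the crawl didn't
    find, and crawled-site URLs the bot never requested.
    """
    logged, known = set(hits), set(known_paths)
    return logged - known, known - logged

# Hypothetical log lines for illustration only.
sample = [
    '66.249.66.1 - - [10/Jan/2024:10:00:00 +0000] "GET /home HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jan/2024:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 320 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Jan/2024:10:00:07 +0000] "GET /home HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

hits, errors = bot_hits(sample)
orphans, uncrawled = compare_url_lists(["/home", "/contact"], hits)
```

On this sample, only the two Googlebot requests are counted; `/old-page` surfaces as a 404 error and as an orphan (seen in the logs but absent from the imported crawl), while `/contact` is uncrawled (in the crawl but never requested by the bot). A real tool would also parse timestamps, bytes and response times per event, and verify bot IPs rather than trusting the user-agent string.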