The Log File Analyser is light, but extremely powerful - able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions. Some of the common uses include:

- Identify crawled URLs - view and analyse exactly which URLs Googlebot & other search bots are able to crawl, when and how frequently.
- Analyse crawl frequency - get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
- Find broken links & errors - discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
- Audit redirects - find temporary and permanent redirects encountered by search bots, which might be different to those seen in a browser or simulated crawl.
- Improve crawl budget - analyse your most and least crawled URLs & directories of the site, to identify waste and improve crawl efficiency.
- Find orphan pages - import a list of URLs and match against log file data, to identify orphan or unknown pages, or URLs which Googlebot hasn't crawled.
- Combine & compare any data - import and match any data with a 'URLs' column against log file data, such as crawls, directives or external link data, for advanced analysis.
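To make the underlying idea concrete, here is a minimal sketch of the kind of analysis described above - parsing raw access log lines, filtering Googlebot events, counting crawl frequency and response codes, and matching a known URL list to spot orphans. This is not the Log File Analyser's implementation; it assumes the common Apache/Nginx "combined" log format, and the sample lines and URL set are invented for illustration.

```python
import re
from collections import Counter

# Regex for the Apache/Nginx "combined" log format (an assumption -
# real log file analysers support many formats; this sketch handles one).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Parse one combined-format log line into a dict, or None if malformed."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def googlebot_events(lines):
    """Yield parsed events whose user-agent string claims to be Googlebot."""
    for line in lines:
        event = parse_line(line)
        if event and "Googlebot" in event["agent"]:
            yield event

# Invented sample log lines for demonstration.
sample = [
    '66.249.66.1 - - [10/Jan/2024:12:00:01 +0000] "GET /page-a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jan/2024:12:00:05 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Jan/2024:12:00:09 +0000] "GET /page-a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

events = list(googlebot_events(sample))
crawled_urls = Counter(e["url"] for e in events)    # crawl frequency per URL
status_codes = Counter(e["status"] for e in events) # response codes seen by the bot

# Orphan check: URLs known from a crawl or sitemap (hypothetical set)
# that never appear in the bot's log events.
known_urls = {"/page-a", "/old-page", "/orphan"}
orphans = known_urls - set(crawled_urls)
```

Running this, the browser hit from `203.0.113.9` is filtered out, the two Googlebot events yield one `200` and one `301`, and `/orphan` is flagged as never crawled - the same logic, at toy scale, as matching an imported URL list against millions of stored log events.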