Key features:
- Crawls all pages of a site, including images, scripts, and documents.
- Retrieves the server response code for each page (200, 301, 302, 404, 500, 503).
- Detects the presence and content of Title, Keywords, Description, and H1-H6 tags.
- Finds and displays "duplicate" pages, meta tags, and headings.
- Checks each page for a rel="canonical" attribute.
- Follows the directives of the "robots.txt" file and the "robots" meta tag.
- Honors rel="nofollow" attributes when crawling the site.
- Link analysis: identifies internal and external links for each page (within the site).
- Counts the redirects issued from each page.
- Determines the nesting level of each page relative to the home page.
- Generates a "sitemap.xml" file (with the option to split it into several files).
- Filters URLs by any parameter.
- Exports reports to CSV and Excel (full report in Excel format).
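The sitemap generation mentioned above can be illustrated with a minimal sketch using only Python's standard library. The function name `build_sitemaps` and the per-file limit parameter are hypothetical, not part of this tool; the 50,000-URL cap comes from the sitemap protocol and is the usual reason crawlers offer splitting into several files.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, max_per_file=50000):
    """Return one sitemap XML string per output file.

    The sitemap protocol caps each file at 50,000 URLs, so the URL
    list is split into chunks of at most max_per_file entries.
    """
    sitemaps = []
    for start in range(0, len(urls), max_per_file):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for url in urls[start:start + max_per_file]:
            # each URL becomes <url><loc>...</loc></url>
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    return sitemaps

# Example: three URLs with a limit of two per file yield two sitemap files
files = build_sitemaps(
    ["https://example.com/", "https://example.com/a", "https://example.com/b"],
    max_per_file=2,
)
```

A real crawler would feed this function the list of URLs collected during the scan and write each returned string to its own file (sitemap1.xml, sitemap2.xml, ...).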
Differences from similar tools:
- Low demands on computer resources and low RAM consumption.
- Data is stored in a local database noted for its performance and reliability.
- Can scan websites of any size thanks to its low resource requirements.
- Portable format: runs without installation on a PC, or directly from a removable device.