• Improve your website's quality, SEO and user experience
  • Checks internal and external links as well as images
  • Fast scan with many options; progress is displayed via the Dock icon
  • Highlights problems in colour, double-click for further information
  • Many export options, including a full report, XML sitemap and graphic visualisation
  • Checks your HTML using the W3C HTML validator; use the public instance or your own installation
  • SEO analysis of each page, including occurrences of keywords and missing page titles, meta descriptions, headings and keywords
  • Manages as many sites as you like with different settings for each

    Feed Scrutiny your homepage URL and watch it follow internal links to find all of your pages and carry out these checks from the same viewpoint as a search engine robot.

    Scrutiny stores a lot of data and can only use the memory available to it. If your site is big enough (several thousand pages) Scrutiny may not be able to crawl it in one go. For tips on crawling it in sections, please see Scrutiny's FAQ page.
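    As a rough illustration of the crawl described above — start at the homepage and follow internal links only, the way a search engine robot would — here is a minimal sketch in Python. This is not Scrutiny's actual implementation; the page limit and URL handling are illustrative assumptions:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def is_internal(start_url, link_url):
    """A link is internal if it resolves to the same host as the start URL."""
    return urlparse(link_url).netloc == urlparse(start_url).netloc

def crawl(start_url, max_pages=100):
    """Breadth-first crawl from start_url, following internal links only."""
    seen, queue, pages = {start_url}, [start_url], []
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable/broken link: a real checker records this
        pages.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if is_internal(start_url, absolute) and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages
```

    A real checker would also record the HTTP status of every link (internal and external) and check images, but the same queue-and-visit loop is at the heart of it.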

  • What's New
  • Major improvements to the engine and data storage: even small sites crawl more quickly, and large sites crawl very much more quickly without slowing down or losing responsiveness.
  • When the stop button is pressed, all open threads are abandoned, then recreated if 'continue' is pressed. This gives a much better user experience.
  • Blacklist and whitelist boxes replaced by a more user-friendly table of rules (existing data will be preserved and presented in the new way)
  • Adds a 'By page' links view. If 'bad links only' is selected, the view shows a list of pages requiring attention, expanding to show the bad links on each page.
  • Adds new limit settings to Preferences (default 200,000 links). This offers the option of limiting the crawl of a large site (though that may be better achieved using blacklist / whitelist rules) and also acts as a safety valve against crashing through running out of resources when crawling very large sites.
  • If the crawl starts within a directory, it is limited to that directory, i.e. it will go down the directory structure but not up. This matches users' expectations; previously the crawl extended to all pages in the same domain.
  • Fixes a problem with robots.txt when more than one user-agent is specified. Scrutiny now only uses the exclusion list for user-agent = all (*) or Google (i.e. it respects the file as if it were Googlebot)
  • Moves 'check links on custom error pages' into per-site settings rather than global preferences, and moves the 'labels' preferences from the General tab to the View tab of the Preferences window
  • Adds Help contents to the Help menu, linking to the manual's index page
  • Increases the maximum number of threads from 30 to 40 (will improve crawling for some sites), with the default now 12 rather than 7. The extreme left of the slider (labelled 'fewer') is still a single thread.
  • Updated application icon
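    The robots.txt behaviour described above — honouring only the rule groups for user-agent '*' or Googlebot and ignoring groups aimed at other crawlers — can be sketched as follows. This is a simplified illustrative parser, not Scrutiny's own code:

```python
def exclusions_for_googlebot(robots_txt):
    """Collect Disallow rules from the robots.txt groups whose
    User-agent line is '*' or 'googlebot'; groups aimed at other
    crawlers are skipped entirely (a simplified sketch)."""
    disallowed = []
    applies = False          # does the current group apply to us?
    in_group_header = False  # are we inside a run of User-agent lines?
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not in_group_header:
                applies = False  # a new group starts; reset
            in_group_header = True
            applies = applies or value.lower() in ("*", "googlebot")
        else:
            in_group_header = False
            if field == "disallow" and applies and value:
                disallowed.append(value)
    return disallowed
```

    Consecutive User-agent lines form a single group, so a group headed by both 'SomeBot' and 'Googlebot' is still honoured; the Allow directive and wildcard paths are left out of this sketch.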

  • Scrutiny (com.peacockmedia.scrutiny) is a Mac software application that has been discovered and submitted by users. The latest version that our users have reported running on their systems is Scrutiny 4.0.3; the most popular version among our users is Scrutiny 4.0.4.

    Operating System: Mac

    Default Install Path: /Applications/