How to get a list of all URLs scanned by WebInspect - both blacklist and whitelist



The WebInspect report lists all the visited URLs that have potential security issues, i.e., the blacklist. How do I get the whitelist URLs - i.e., the URLs WI went to but found nothing on? I'm not interested in the directory structure in the WI dashboard view; it's a pictorial view. I need it in a downloadable / printable format.


Any assistance is greatly appreciated.






  • It sounds like you are seeking the Crawled URLs output.


    • Available as a sub-report option: Reports > Crawled URLs  (PDF, RTF, XLS, HTML)
    • There are also selections for Offsite Links, Web Crawl Dump (raw site copy), and Site Tree Dump (raw site copy)
    • Other options you may like are under the QA Summary report (Broken Links, et al.)
    • Available as an export: File menu > Export > Scan Details > URLs  (TXT or XML)
    • Export the Site Tree to XML via the right-click menu in the Site Tree pane
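    Once you have both exports, the "clean" whitelist is simply the set difference between everything crawled and everything flagged. A minimal sketch, assuming each export is a plain-text file with one URL per line (the file names `crawled_urls.txt` and `flagged_urls.txt` are illustrative, not WebInspect defaults):

    ```python
    def url_set(lines):
        # Normalize a line-per-URL export into a set, skipping blank lines.
        return {line.strip() for line in lines if line.strip()}

    def clean_urls(crawled_lines, flagged_lines):
        # Whitelist = every URL crawled minus every URL with a finding.
        return sorted(url_set(crawled_lines) - url_set(flagged_lines))

    if __name__ == "__main__":
        # Assumed file names from the TXT exports described above.
        with open("crawled_urls.txt", encoding="utf-8") as crawled, \
             open("flagged_urls.txt", encoding="utf-8") as flagged:
            for url in clean_urls(crawled, flagged):
                print(url)
    ```

    The same diff works against the XML exports if you first pull the URL values out of them; TXT is just the least effort.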




    What is not normally included in the "scan results" are the 404 responses and any pages you specifically placed in Session Exclusions or similar restrictions.  If you want those included in future scans for your "whitelist" export, you will need to adjust the Session Storage scan settings, specifically the items below.  Be warned that enabling or editing these will likely increase the size of your scan data set, and possibly the scan duration.


    • Invalid Host
    • Excluded File Extension
    • Excluded URL
    • Outside Root URL
    • Maximum Folder Depth Exceeded  (see Crawl Details on the General settings panel)
    • Maximum URL Hits  (see General panel)
    • 404 Response Code  (see File Not Found panel)
    • Solicited File Not Found
    • Custom File Not Found
    • Rejected Response
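    If you'd rather not rescan, you can approximate some of these exclusions locally against a crawled-URL export. A rough sketch covering two of the categories above; the extension list and root URL here are illustrative assumptions, not values read from your WebInspect settings:

    ```python
    from urllib.parse import urlparse

    # Assumed example values -- substitute your own scan settings.
    EXCLUDED_EXTENSIONS = {".jpg", ".png", ".css"}
    ROOT = "http://example.com/app/"

    def is_excluded(url):
        path = urlparse(url).path.lower()
        # Excluded File Extension
        if any(path.endswith(ext) for ext in EXCLUDED_EXTENSIONS):
            return True
        # Outside Root URL
        if not url.startswith(ROOT):
            return True
        return False
    ```

    This only mirrors the static rules (extensions, root scope); categories that depend on server responses, such as 404 Response Code or Rejected Response, can only come from the scan itself.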