
WebInspect Scan Configuration Tricks and Best Practices

Before answering many small queries in the forums, let's drop in a big notepad of ideas here.  Most configurations are only required after a particular scenario is encountered.

 Updates:

  1. SEPT 2019 - added Logout tricks
  2. MAY 2019 (minor)
  3. AUG 2017
  4. JULY 2017
  5. NOV 2016

 

 

 ================================================ 

General Practice:

 

Although running WebInspect with ‘out of the box’ scan settings might be the easiest way to start a scan, it is almost sure to produce unexpected results. Configuring any web application scanner is tricky, but following these simple steps to fine-tune the scan will generate more accurate results.

 

Know your website

Performing a manual assessment of your website (before using any tools) will help you quickly spot mis-configured scans, tweak scan configuration parameters, and ensure more consistent results.

 

The first step is to become familiar with the site topology - directory structure, the number of pages, submission forms, etc. Perform a manual site survey and take notes. If you have access to the source, look at the file structure. If not, hover over the menu links and notice the site structure of the URLs. Are URL parameters used to drive the site navigation? If so, record them and use them to drive WebInspect.

 

It is also important to have some understanding of how the site operates behind the scenes. Different websites tend to handle common administrative tasks in unique and unexpected ways. For example, some websites require users to re-enter their passwords and pass a 'CAPTCHA' test before assigning a new password. Other sites allow a password to be changed simply by entering a new one.

 

Knowing the basics of the site mechanics will go a long way toward heading off mis-configured scans, and getting familiar with the layout of your site is the best way to help WebInspect cover the entire site.

 

Protect your data

Web application security tools try to force websites to accept input data that they may not have been designed to handle. Therefore one side effect of auditing a website for vulnerabilities is that ‘garbage data’ can be injected into the database. On a database-driven site (like most blogs or CMS systems), this junk data will show up in other unexpected and very visible ways. After a scan, you may find that your default website language has been changed to Farsi, test files have been uploaded, the new blog color theme has been set to ‘Early 80s Disco’, or 13 new users have been added – complete with nonsense test Posts.

 

To minimize these risks, scan a non-production version of the target website whenever possible. Sometimes audits are necessary on a complicated production server setup.  If this is the case, make a backup of the entire production database and verify the ability to successfully restore it before a scan.

 

Limit the scan

If the local server hosts multiple web applications, it is important to restrict the audit to the application of interest. For example, my local Apache installation hosts 12 web applications in the htdocs root folder. When I want to scan “Wordpress”, I often forget to restrict the audit, and end up with a noticeably longer scan time and an unusually large number of vulnerabilities. A quick glance at the “site” tree in WebInspect will show whether the scan has started crawling into folders that were not intended.

 

To prevent the scan from ‘running away’ (taking too long to complete), open the scan settings before the scan is launched, check the “Restrict to folder” option, and select “Directory and subdirectories”. Take note: this option is not enabled by default, so it is worth remembering. Also, make sure the start URL either contains a start page or the initial directory ends with a trailing “slash”.
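As a quick sanity check outside WebInspect, the trailing-slash rule above can be verified in a few lines of Python. This is only a sketch; the URLs are made-up examples:

```python
from urllib.parse import urlparse

def start_url_ok(url: str) -> bool:
    """A start URL should end with a page name or a trailing slash,
    so the scanner can tell the directory scope apart from a file."""
    path = urlparse(url).path
    # An empty path ("http://host") has no explicit directory; flag it.
    if not path:
        return False
    last = path.rsplit("/", 1)[-1]
    # OK if the path ends in "/" (directory) or the last segment looks like a page.
    return path.endswith("/") or "." in last

# Hypothetical examples:
print(start_url_ok("http://zero.webappsecurity.com/banking/"))            # directory form
print(start_url_ok("http://zero.webappsecurity.com/banking/login.html"))  # page form
print(start_url_ok("http://zero.webappsecurity.com/banking"))             # ambiguous - fix before scanning
```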

 

Login Macros

Login Macros are essential to correctly scanning a website, yet may unknowingly be the root of many failed scans. Before creating a new login macro to allow WebInspect to successfully gain entry to the actual site, choose a user with limited ability to modify the site. If one is not available, create a new user with the lowest role possible. For example, Wordpress allows 4 roles with varying degrees of ‘power’: Administrator, Editor, Author and Contributor. Scanning Wordpress as the Administrator user may result in any of several undesirable scenarios, including the destruction of the entire blog, while scanning as a ‘Contributor’ should only result in a few extra unpublished blog entries.

 

Check your login macros for errors during the scan. A login macro that is incorrectly recorded may fail to log in to the site, which causes the scan to produce invalid results. Symptoms include abnormally short scan times, a lack of vulnerabilities, or large numbers of errors in the error log.

 

Other times the macro may not be able to log back into the website during a scan – even after the first login has succeeded. For example, a login macro tied to a user account that is able to change its own password will prevent the macro from logging back into the site once that password changes. When this happens, the error log may fill up with errors and the scan may stop. It is important to monitor the scan periodically and assess the scan ‘health’.

 

 

Conclusion

Some users might be unaware of the unintended consequences of web application vulnerability scanning, while other users might need help understanding their scan process. Although these simple steps will not remedy every issue with scanning complex sites, they will help rudimentary scans produce more valuable results. The more information that is provided to WebInspect through the scan configuration settings, the better the outcome of the scan will be.

 

 

 ================================================ 

Profiling the Site:  

  • This section is for tips on deciding how to configure the scan based on what you see. The Server Profiler tool is nice, but what can the human do?
  • 3 manual pre-scan tests
    1. Log into the site, then paste in a known Bad URL, e.g. /hans111.html
      • If you get anything other than 404, add a Custom File Not Found signature.
    2. Log into the site, then paste in a known, Good, deeper URL
      • If you are prevented from jumping to a known, authenticated page, then the site may require specific click paths.  You may need to enable the Depth First crawler scan feature, and don’t neglect to also enable its sub-box to Follow the Paths for Audits.
    3. Log into the site, then attempt a secondary login in a completely separate browser
      • If it blocks your attempt saying you are logged on elsewhere, then you may need to resort to Single Shared Requestors.  If you are lucky, the app will permit you to increase that Single Shared Requestor count.
      • (Recall that you can make a copy of this folder and run browser.exe as a 2nd browser, regardless of the client's ability to install or use any non-approved browsers.  😉  I suggest making a copy of C:\Program Files\HP\HP WebInspect\browser\ moved to here:  C:\Program Files\HP\browser\ )
  • Try an extensive recording with the Web Form Editor tool to see what is exposed.  (Maybe keep that running during all your manual probes.)
  • If the scan keeps re-authenticating (watch the Logout Count), but it is definitely getting deeper and deeper within the site, then just run with it.
  • If a lot of the pages are similar except for having different page text or product images, enable the Redundant Page Detection (RPD) and re-run the scan.
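Pre-scan test 1 above can be scripted. A minimal sketch using only the Python standard library; the target host in the usage comment is hypothetical:

```python
import urllib.error
import urllib.request

def probe_bad_url(base, bogus_path="/hans111.html"):
    """Request a URL that should not exist and return the HTTP status.
    Anything other than 404 suggests custom error pages, meaning the
    scan will need a Custom File Not Found signature."""
    req = urllib.request.Request(base.rstrip("/") + bogus_path,
                                 headers={"User-Agent": "pre-scan-probe"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Hypothetical usage against a test target:
# status = probe_bad_url("http://testsite.example.com")
# if status != 404:
#     print(f"Got {status}: configure a Custom FNF signature first")
```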

 

 

 

 ================================================

 ================================================

REMAINING ARTICLE IN ALPHABETICAL ORDER BY TOPIC

 

 ================================================

Non-English Target
      (Foreign language web site)

Many times, web sites in foreign languages still use English names for fields and parameters. This section suggests edits to the WebInspect scan settings for when the targeted web site has other languages built into it. Note that the installed language or release of WebInspect does not have to match the language of the target web site!

When a customer mentions “language” for dynamic testing, it usually means the target web site is in a non-English spoken language, not that it uses a particular programming language. If this is indeed your concern, you may find your scans of non-English web sites will operate a little better with some adjustments. A few scan settings in WebInspect naturally lean towards English, and you may wish to adjust for these. Whatever you set, you may want to save it as a New Default, set it as the Default Scan Settings, and then review/update it with each WebInspect install/update. WebInspect updates will not forcibly change your custom settings, so if a setting you have customized is later enhanced or improved, your saved scan settings will be "stale".


1. Method scan settings - Customize the Web Forms input file. The dummy form names that the WebInspect Crawler seeks to fill may not precisely match the form names found on your non-English targets. Or your nation may have select forms that the U.S. has not accounted for, such as National ID fields. If you find many forms in a completed scan are all showing the Default Value (“12345”), then you might expand the attack surface area in subsequent scans by customizing this input file. Use the Web forms Editor tool to augment the defaults, and then specify your custom *.webforms file in the Methods scan settings panel of WebInspect.

2. HTTP Parsing and charset – You may want to alter the charset used by WebInspect at the bottom of the HTTP Parsing scan settings panel. It defaults to “Western European (Windows)”.

3. Session Exclusions – Reviewing the defaults here, you will note that by default WebInspect avoids accidentally logging out by avoiding pages with keywords such as Logoff, Signout, and Logout. Your targets may or may not use these keywords or names for the button/link that logs the user off from the site. Customizing/augmenting these entries may help prevent WebInspect from triggering those logouts in your nation/language.

4. Attack Expressions - CultureInfo – You may want to modify WebInspect’s management of CultureInfo to match the target’s nationality. This helps it alter the behavior for its (internal) regular expressions to some degree.

5. Attack Exclusions - Audit Inputs file – The Audit Inputs Editor tool is rarely used, but there are select checks in WebInspect that permit user alteration, primarily in their behavior and matching but not their attack logic. This tool is hidden on the Attack Exclusions scan settings panel and also within the Tools menu of the Policy Manager tool. The Audit Inputs Editor permits you to fine-tune select check behaviors. Non-English users may want to review this and create a custom Audit Inputs file for use in the Default Scan Settings file. Bear in mind that, just as with Login Macros, Web Forms inputs, and other secondary artifact files, this secondary inputs file is absorbed into the saved Scan Settings file as-is. These referenced inputs are not live links, so the saved scan settings file does not automatically update its internal copy when the named input file is edited elsewhere later. To update those inputs, the user must re-edit the saved scan settings file and re-import the specific, updated input file, even if it has the same file name.
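For point 2, a quick way to see which charset the target actually declares (and therefore what to pick on the HTTP Parsing panel) is to look at the Content-Type header and any <meta charset> tag. A small parsing sketch, with made-up example values:

```python
import re

def detect_charset(content_type, html_head):
    """Pick the charset a page declares: the Content-Type header first,
    then a <meta charset> tag, else the HTTP/1.1 default ISO-8859-1."""
    for source in (content_type, html_head):
        m = re.search(r'charset=["\']?([\w-]+)', source or "", re.IGNORECASE)
        if m:
            return m.group(1).lower()
    return "iso-8859-1"

# Hypothetical examples:
detect_charset("text/html; charset=Shift_JIS", "")            # -> "shift_jis"
detect_charset("text/html", '<meta charset="windows-1256">')  # -> "windows-1256"
```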

 

 

  ================================================

Post-Scan Analysis (PSA):  

  • Review the Site Tree to see if there are "crazy" findings, or if absolutely every page request thrown at the site is sticking all over the place.  Non-existent pages, or more than the normal quantity of "Low" findings (blue icons)?  This can indicate you have custom error pages and need a Custom FNF (File Not Found) signature.

 

  • Does it appear to have folders within the URI that are actually parameters instead, i.e. URL Rewriting?  Look into the Custom Parameters scan settings, read that panel's Help as well.

 

  • Check the Scan Log pane for any oddities.

 

  • Review a live or finished scan to see if the Default Crawler input value of "12345" is showing up "too much".
    • Consider augmenting the default Web Forms input (see WFE tool) by using its Record button to walk the target application and store valid field names and their matching values prior to using this input file in a scan.
    • This effort helps expose more attack surface area in subsequent scans.

 

  • Look for custom state-keeping parameters, which will have values like random GUIDs.
    • May need to add these to HTTP Parsing to help manage state and not fuzz these fields too badly.
    • The Server Profiler may not authenticate, so it may have missed some of these initially.

 

  • Look for custom navigational parameters such as "ReturnURL" or various menuing and locator items.
    • May need to add these to HTTP Parsing to help maneuver the site intelligently.

 

  • Did you rediscover issues that you previously spent time marking as False Positives?  Use the False Positives panel to import that prior work from the earlier scan to filter out those same issues in today's scan.  This FP Import is also available in the Scan Wizard, but it can be done after the scan as well.
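The GUID hunt above can be partly automated. A sketch that scans a list of observed URLs (the sample URLs are invented) for GUID-valued parameters worth adding to the HTTP Parsing settings:

```python
import re
from urllib.parse import parse_qsl, urlparse

# Standard 8-4-4-4-12 hex GUID shape.
GUID_RE = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
    re.IGNORECASE)

def guid_params(urls):
    """Return parameter names whose values look like GUIDs - likely
    state-keeping fields to list on the HTTP Parsing settings panel."""
    found = set()
    for url in urls:
        for name, value in parse_qsl(urlparse(url).query):
            if GUID_RE.match(value):
                found.add(name)
    return sorted(found)

# Hypothetical captured URLs:
sample = [
    "http://site.example/view?sid=3f2b1c9a-0d4e-4a61-9b7f-2c5e8d1a6f30&page=2",
    "http://site.example/cart?sid=8a1e2d3c-4b5f-4678-9abc-def012345678",
]
guid_params(sample)  # -> ["sid"]
```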

 

 ================================================

RESTful API web services scanning:

  • First, please refer to the section "Scanning a REST API Definition" found in the WebInspect User Guide for complete details on scanning REST API definitions.

 

  • Pre-visit the site with the included WISwag.exe CLI tool. You may want to fetch the latest copy from the Fortify Ecosystem.
    Use the output (-s is the custom scan settings file, XML) from the WISwag tool to drive WebInspect (Workflow-driven scan).
    Requires a Swagger-based API target.

    General Ideas:
    If the service truly is REST then you will need to scan it as if it were a web site rather than a web service. You should use either the Guided Scan wizard or the Basic Scan Wizard, but not the Web Service Scan Wizard (SOAP/WSDL only).

    One old, fool-proof method is to capture traffic between a client and the service, then use that in a Workflow-driven scan. Some suggestions on how to accomplish that follow.

    Using WISwag (optimal for 16.20+)
    The WISwag CLI tool is used to preview a Swagger-based REST API. The output from this review is either a WebInspect scan settings file or a Workflow Macro. The user would then use that output in the configuration of the WebInspect scan, preferably a Workflow-driven scan.
    In 17.10, WISwag endpoints were added to the WebInspect API, so the user is no longer limited to running it only from their local command line.
    WISwag may be included with the WebInspect installation, but (at least for 2017) the development team is enhancing this and other secondary tools continuously. During this time, the smart user may want to download its latest release from the Fortify Marketplace (https://marketplace.saas.hpe.com/fortify).

    Using a Client Application (preferred)
    If a client application is available, or if you are able to create one:
    Configure your client application to use the Traffic Viewer tool as a proxy
    Use your client application to send all combinations of requests with JSON or XML payloads to your RESTful service (thereby capturing them in the Traffic Viewer)
    Stop the Traffic Viewer and save all your requests into a *.webmacro file: Save -> Save as Macro
    Start a workflow scan using the saved macro.

    Using a Browser
    Configure your browser to use the Traffic Viewer tool as a proxy
    Use your browser to exercise the RESTful service (thereby capturing the related traffic in the Traffic Viewer). If your RESTful service accepts only GET requests then you can use the browser as-is, but a service that accepts other methods (POST, PUT, DELETE, etc.) will require the installation of a RESTful client extension such as “Postman” for Google Chrome or “RESTED” for Firefox (just two examples - there are others).
    Stop the Traffic Viewer and save all your requests into a *.webmacro file: Save -> Save as Macro
    Start a workflow scan using the saved macro.

    Using the HTTP Editor Tool
    Configure the HTTP Editor to use the Traffic Viewer tool as a proxy
    Use the tool to exercise the RESTful service (thereby capturing the related traffic in the Traffic Viewer).
    Stop the Traffic Viewer and save all your requests into a *.webmacro file: Save -> Save as Macro
    Start a workflow scan using the saved macro.

    Using a Guided Scan
    This option can only be used if the RESTful service is limited to accepting GET requests.
    Start a Guided Scan in WebInspect and work through the steps until you reach “Enhance coverage of your web site”.
    Use the built-in browser here to exercise the RESTful service. (You will see “Explored Locations” being captured below the browser pane).
    Proceed through the rest of the Scan Wizard and run the scan. 
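The "Using a Client Application" approach above boils down to replaying every request shape through the Traffic Viewer proxy. A minimal sketch of such a client in Python; the endpoint, payloads, and proxy address are all placeholders, not real defaults:

```python
import json
import urllib.request

def make_opener(proxy_url):
    """Build an opener that routes plain-HTTP requests through a proxy
    (e.g. the Traffic Viewer listener - address is a placeholder)."""
    return urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_url}))

def send(opener, method, url, payload=None):
    """Send one request through the proxy so Traffic Viewer captures it."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Content-Type": "application/json"})
    with opener.open(req, timeout=10) as resp:
        return resp.status

# Hypothetical service and proxy address - adjust both to your setup:
# opener = make_opener("http://127.0.0.1:8080")
# base = "http://api.example.com/v1/items"
# send(opener, "GET", base)
# send(opener, "POST", base, {"name": "test"})
# send(opener, "PUT", base + "/1", {"name": "renamed"})
# send(opener, "DELETE", base + "/1")
```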

 

  ================================================

Scan Policies (attack templates):

 

  • Use the Standard unless:
    • You want only half of it - select Application or Platform policy
    • You want only the important items - select Criticals & Highs policy
    • You are scanning a Production system for the first time ever - select Passive policy
    • You want to use all available checks and are not worried about detrimental checks - select All Checks

 

  • Keep the Smart Scan feature enabled to save scan time.

 

  • Scan storage - If using a full-sized SQL Server (not SQL Express), move the database to a separate machine from WebInspect.  This speeds up the scanner machine's performance.

 

  • Enable the Traffic Monitor only when trouble-shooting issues.  Or during a POV in order to show off the product.
    • If drive space is not an issue, consider adding it to the Default Scan Settings.  You can't turn it on after the fact.

 

  • For the Content Analyzers, consider enabling the DOM event parsing to show off the product.
    • Unless scan length will be a client concern.
    • May require this for sites with lots of script, to show the best results to the client.

 

  • Vulnerability Filtering - Consider enabling all the filters except the 403 Blocker, or just the Param Vuln Roll-up if nothing else.

 

 ================================================ 

SPA - Single-Page Application scanning:

 Ref:  https://community.saas.hpe.com/t5/Fortify-Discussions/DAST-scans-for-SPA-using-WebInspect-17-10/td-p/1506951

 

We released a new scan setting for Single Page Applications (SPA) in WebInspect 17.10 (spring 2017). This setting is found under the Default Scan Settings > Content Analyzers panel > Javascript/VBScript option > "Enable SPA support". It is still a Technical Preview, so it is disabled by default.

This setting is a culmination of research from our DOM Explorer tool (released with WebInspect 16.10, spring 2016) and our FOD SaaS team's actual field use, although the DOM Explorer tool will probably continue to be developed. With WebInspect 16.x, the user had to pre-visit the SPA site with DOM Explorer, then use the saved output (*.webmacro) from that to drive a Workflow-driven scan in WebInspect. With this new setting enabled, you should not need to pre-visit the application with DOM Explorer.

DOM Explorer is shown on the Tools menu in WebInspect, but it can also be downloaded from our Fortify Marketplace/Ecosystem: https://marketplace.saas.hpe.com/fortify/content/dom-explorer-for-advanced-js-applications

Separate from this SPA setting development, you may also find it useful to review the Link Sources scan settings panel. This is a change in the link parsing engine for WebInspect, but it is also still in a state of Tech Preview, i.e. disabled by default. The user can opt to turn off the older Pattern-based link parsing engine and enable the newer DOM-based parsing engine. This new engine offers the user sub-options to fine tune how aggressive the link parser will operate.

From the Help, <<Using DOM-based parsing, Fortify WebInspect parses HTML pages into a DOM tree and uses the detailed parsed structure to identify the sources of hyperlinks with higher fidelity and greater confidence. DOM-based parsing can reduce false positives and may also reduce the degree of 'aggressive link discovery.' >>

Additionally, there is an option under the Content Analyzers scan settings panel to "Create script event sessions". This will permit WebInspect to audit individual JS/script events, and these will appear on the Site Tree alongside the normal GETs and POSTs. From the Help, <<Fortify WebInspect creates and saves a session for each change to the Document Object Model (DOM).>>

This setting is disabled by default as it will make the scan much more thorough for sites heavy in such scripting, so those scans will therefore take a little longer to complete. If you enable this setting, I would also recommend lowering the nearby "Max script events per page" setting from 1,000 to something around 200 or 250.

 

  ================================================

Standard Web Site Scans:

 

  • Always browse the site prior to scanning, and generate some 404 page errors to understand if the site uses custom errors and may need a Custom FNF signature or two.

 

  • Always run the Server Profiler tool against the site, or at least ensure it is included in the Scan Wizard.

 

  • Always see if the site bridges multiple ports (80 & 443).  You may need to add these as Allowed Hosts.
    • Can bloat some scans - consider running those separately and combining in Reports later.
    • If client says they are identical, focus on just one.
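Whether the two ports really are identical, as the client claims, can be spot-checked by fetching the same paths on both schemes and comparing digests. A rough sketch; the host is a placeholder, and certificate verification is deliberately disabled for lab rigs with self-signed certs:

```python
import hashlib
import ssl
import urllib.request

def fetch_digest(url):
    """Fetch a URL and return (status, SHA-256 of the body) so the same
    path on port 80 and 443 can be compared.  Certificate checks are
    disabled because test rigs often use self-signed certs - lab use only."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with urllib.request.urlopen(url, timeout=10, context=ctx) as resp:
        return resp.status, hashlib.sha256(resp.read()).hexdigest()

# Hypothetical spot check of a few paths on both schemes:
# for path in ("/", "/login", "/about"):
#     same = fetch_digest("http://site.example.com" + path) == \
#            fetch_digest("https://site.example.com" + path)
#     print(path, "identical" if same else "DIFFERS")
```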

 

  • If the Server Profiler suggests lowering the Thread Count, I would ignore this, as the default timeout and Request Retry Count generally take care of slow-ish targets.
    • The pain of dropping to 1 Thread is probably not worth it unless you have already knocked this target off-line before.

 

  • If the Server Profiler suggests adding state-keeping or navigational items to HTTP Parsing, just accept them all.
    • They will still get tested, just not as brutally as ordinary parameters.

 

  • Using a user-level test account for authentication will provide better results than using no authentication.
    • Running a comparison report on these two scans can be interesting later.
    • Administrator-level accounts should be avoided until the site is better understood and the scan configuration properly prepped to minimize collateral damage.
    • Refer to the Help guide's entry under Getting Started > "Prepare Your System For Audit"  (Yes, WebInspect can "damage" a server/app, but mostly it will be the Crawler that is to blame.)

 

  • Use a Direct Connection proxy setting whenever possible, to avoid any proxy latency issues.

 

  • Always consider a Crawl-Only before performing any normal scans.
    • Helps identify authentication issues, and a host of others listed in the Post Scan section below.

 

  • Is there a RESTful API (Swagger-based)?  Run the CLI tool, WISwag.exe, against it ahead of time.  Save the output from that tool and use it to configure or drive the WebInspect scan against that API.

 

  • Is there a REST API?  Ask the developers to generate its WADL file, then import that into your Custom Parameters scan setting before the next scan.

 

  ================================================

Web Services Scans (SOAP):

 

  • The Web Services Scan wizard is meant only for SOAP and WSDL-based web services, not for RESTful web services.

 

  • Always run the Web Service Test Designer tool (formerly the SOAP Editor) against the WSDL first.  Configure everything needed for a good test.

 

  • Always save the output (*.wsd) from the Web Service Test Designer tool and include that file in the scan's settings.
    • This is akin to using the Web Forms input for a standard scan, except more important here.

 

 

  ================================================

User Account will Lock Out:

  • For the scenario where the test account can get locked out from too many false login attempts, first verify you have handled the risk of resetting the user's password with the dummy values.  You may need to use Session Exclusions for this, and perhaps the Web Form Editor tool.

 

  • See if the user or customer can have that lock-out feature disabled for that test account!!
    1. WebInspect's goal is to login and test forms, not to validate the login system nor the perimeter defenses.
    2. Keep asking for this special test account even after you have some workarounds later.

 

  • Common Alternatives:
  • Add a Session Exclusion for the logon page, e.g. “\/login\.aspx”.
    • Most common first idea on this issue.  This method unfortunately means that the forms on that page will never be Audited, so vulnerabilities in them will not be discovered.  If that is a concern, you might try a targeted scan of only the login page at a later time.
    • This configuration assumes that the scan's Starting URL is not the login page.
    • Secret on why this works - The Login Macro is permitted to go anywhere it needs to in order to log on, and yet its sessions will not be added to the scan scope, so this Exclusion should not affect the logon process overall.  If it does, fine-tune your Session Exclusion so that it still permits Crawl (Exclude) but not Audit (Reject).  See the WebInspect Help page (F1) for that particular scan settings panel for details and differences between those two terms.
  • Add the username/passwords forms to the Attack Exclusions scan settings panel.  This should permit the forms on that page to be audited while only excluding the two fields that would lock the account if they were fuzzed.
  • You might need to drop the Requestor threads to 1, "Single Shared Requestor".  The normal multiple threading could be aggravating the session state management.  However, be warned that this setting change can lengthen the scan time dramatically.
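Before trusting a Session Exclusion pattern such as “\/login\.aspx”, it is worth testing it offline against URLs from the site, since an over-broad pattern can silently exclude pages you meant to audit. A quick check; the URLs below are made up:

```python
import re

# The same pattern you would paste into the Session Exclusions panel.
exclusion = re.compile(r"\/login\.aspx", re.IGNORECASE)

# Hypothetical URLs seen during a crawl:
urls = [
    "http://site.example/login.aspx",        # should be excluded
    "http://site.example/account/settings",  # must stay in scope
    "http://site.example/blogin.aspx",       # similar name - verify it stays in scope
]
for u in urls:
    print(u, "EXCLUDED" if exclusion.search(u) else "in scope")
```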

 

  • What if there is a concern that it did not properly logout when session state is lost?

                    (Very rare scenario, where re-attempting the login will lock the account if they did not officially logout.)

    1. When recording the Login Macro, start with it NOT in Record mode, only browsing.
    2. Once you have logged in, then start the Record function.  Now record the logout and then the login process, and save the macro per normal use.
    3. On Replay, this custom macro will attempt to log you out, or simply play some useless steps. You will want to mark those logout steps as Optional so they do not hold up the macro playback if they are unneeded later.
    4. Revisit your Logout Conditions, to ensure they are as accurate as you can identify.  The scanner should normally be logged out before the macro is called.
  • Try the additional stuff below...
  • If the logout is triggered by a specific value in a script or menu, review the Filters scan settings.  You may be able to foil the Logout link's action by dynamically replacing that value with a useless value.
  • Change the Requestor threads to a "Single Shared Requestor".
    • Guarantees that the scanner will browse like a single user rather than several.
    • Slowest possible scan.
  • Using a Login Macro can help identify a locked-out user account.  Look in the Scan Log screen for any instances of "Expected vs Actual", which will highlight when the account gets locked out and the Macro Replay is failing.  This trick cannot fix the situation, but it can alert the scan user.
  • Correct or modify all of the Username/Password variation entries in the Web Forms input file to have only the valid values for the current test account rather than the default dummy values.
  • You probably do not need to omit the username/password fields as Attack Exclusions, unless they are also accessible or can be manipulated after the user is logged in.
    • i.e. it is OK to attack them as an anonymous user, but not while logged on as our test account.
  • Enable the Traffic Monitor logging scan setting to assist with trouble-shooting the scan attempts until you get the settings right.  It is disabled by default to save on scan space/time.
  • Perhaps make a Custom Check (Keyword Search, in the Policy Manager tool) that identifies and flags the page that announces, "dude, you have locked out your fricking account, please contact your administrator".
    • This will help notify the scan user when the scan begins getting those logged out (locked out) responses rather than the normal login responses.
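That custom check amounts to a keyword/regex match on response bodies. A sketch of the kind of expression to build into the Keyword Search check; the lockout wording here is a placeholder for your application's actual message:

```python
import re

# Placeholder wording - replace with your application's real lockout text.
LOCKOUT_RE = re.compile(
    r"account\s+(?:is|has\s+been)\s+locked|too\s+many\s+failed\s+login",
    re.IGNORECASE)

def is_lockout(body):
    """True if a response body looks like the account-lockout page."""
    return bool(LOCKOUT_RE.search(body))

# Hypothetical responses:
is_lockout("<p>Your account has been locked. Contact your administrator.</p>")  # True
is_lockout("<p>Welcome back, testuser!</p>")                                    # False
```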

 

  ================================================

(eof)


-- Habeas Data
Micro Focus Fortify Customers-Only Forums – https://community.softwaregrp.com/t5/Fortify/ct-p/fortify