Scan of hybrid Mobile Application
I am trying to scan an application that has a fixed URL. For example, the URL is www.myapp.com/app
The problem is that the URL stays the same no matter which page I browse to in the application.
The URL remains the same whether I browse to Settings, Contacts, etc. So when I try to scan the application in manual mode, WebInspect is not able to record the URLs.
I also tried using a Workflow macro and ran into issues there too.
Please let me know how to scan such an application.
This sounds like the entire web application consists of a single web page, with a large variety of parameters providing the screens and data. Under most conditions WebInspect will only hit one unique Session (Session = page plus parameters) a handful of times, so as the parameters change, the Crawler can "revisit" that page only a handful of times across all the combinations. Note that this limit applies to the Crawl phase only; it should not restrict the Audit phase at all in terms of revisits and re-requests.
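To make the "Session = page plus parameters" idea concrete, here is a toy sketch (my own illustration, not WebInspect's actual internals) of how two requests to the same fixed URL can still count as distinct sessions once the parameters are taken into account:

```python
from urllib.parse import urlparse, parse_qs

def session_key(url):
    """Hypothetical helper: identify a 'session' by its path plus
    its parameter names and values, as the answer describes
    (Session = page plus parameters)."""
    parts = urlparse(url)
    params = tuple(sorted((k, tuple(v))
                          for k, v in parse_qs(parts.query).items()))
    return (parts.path, params)

# Same path, different parameters -> distinct sessions:
a = session_key("http://www.myapp.com/app?screen=settings")
b = session_key("http://www.myapp.com/app?screen=contacts")
print(a == b)  # False: the parameters differ, so the sessions differ
```

So even though the browser address bar never changes, each parameter combination is a separate "page" from the scanner's point of view.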
If you find the Crawler is not covering this site fully, i.e. it is not getting all the parameter permutations it needs, then you will want to modify the General scan settings panel. In particular, increase the value of "Limit maximum single URL hits" from the default 5 to something much higher, such as 1,000, 2,500, or 3,000. (I am just guessing at working values.) Leave the sub-option "Include parameters in hit count" at its default (True).
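Here is a toy model (my own sketch, not the product's code) of why that cap matters for a single-page app: with the default of 5 hits per URL, only the first few parameter permutations of the one page ever get crawled.

```python
from collections import Counter

def crawl(urls, limit=5):
    """Toy per-URL hit cap: once a path has been visited `limit`
    times, further parameter permutations of it are skipped.
    `limit=5` mirrors the default the answer mentions raising."""
    hits = Counter()
    visited = []
    for url in urls:
        path = url.split("?")[0]
        if hits[path] >= limit:
            continue  # cap reached: this permutation is never crawled
        hits[path] += 1
        visited.append(url)
    return visited

# 20 permutations of the single /app page:
urls = [f"http://www.myapp.com/app?screen={i}" for i in range(20)]
print(len(crawl(urls)))              # 5  -> most screens are missed
print(len(crawl(urls, limit=1000)))  # 20 -> all permutations crawled
```

Raising the limit is what lets the Crawler reach every screen hiding behind that one URL.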
Also bear in mind that the targeted host may offer more web applications than this single page. You may need to verify this and then enable the Restrict To Folder (RtF) option, found in the Scan Wizard right next to the Starting URL field. When using this feature, the restricted folder is understood as the final item in the URI surrounded by forward slashes ("/"), so be sure the Starting URL reads as (example) "//host/restricted folder/" rather than (example) "//host/restricted folder". The second URI, lacking the trailing slash, would permit the Crawler to climb all the way up the web server and out of that folder.
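The trailing slash matters because of how relative links resolve: without it, the last path segment is treated as a file and links resolve against its parent. Python's standard urljoin demonstrates the same resolution rules:

```python
from urllib.parse import urljoin

# With the trailing slash, relative links stay inside the folder:
print(urljoin("http://host/restricted/", "page.html"))
# -> http://host/restricted/page.html

# Without it, "restricted" is treated as a file, so relative links
# resolve against the parent -- the crawler climbs out of the folder:
print(urljoin("http://host/restricted", "page.html"))
# -> http://host/page.html
```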
Lastly, with a single-page application, it is sometimes necessary to slow down the requests to put them in a "proper" click order. If your initial scan (with the Traffic Monitor scan setting enabled) shows the request flows are dropping all over the place, you may need to enable the Depth First crawler setting found under the General panel, plus its Audit Details sub-option (bottom of the screen). This will add time to the scan as dynamically generated "mini-macros" run and rerun in the background.
If session state is also being dropped, then you may need to set the Requestor scan settings to a Single Shared Requestor. This will make for a much longer scan, as it becomes a single-thread scan instead of the default 11, but some web apps will tolerate increasing the number of threads under this setting. Test and see.
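Conceptually, a single shared requestor serializes the scan through one worker, so every request reuses the same state and arrives in strict order. A minimal sketch of that idea (my own illustration, with a made-up scan() helper, not the product's API):

```python
from concurrent.futures import ThreadPoolExecutor

def scan(requests_to_send, threads):
    """Toy model of the Requestor setting: a 'single shared
    requestor' is effectively a pool of one thread, so requests
    are issued strictly one after another, in click order."""
    order = []
    def send(req):
        order.append(req)  # stand-in for issuing the HTTP request
        return req
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(send, requests_to_send))
    return order

# With one thread the click order is preserved exactly:
print(scan(["login", "settings", "contacts"], threads=1))
# -> ['login', 'settings', 'contacts']
```

With multiple workers the requests can interleave, which is exactly what breaks fragile single-page-app session state; one thread trades speed for ordering.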
The Traffic Monitor feature can bloat the scan size with its logs, so it is normally disabled, but I would enable it while you are performing these tests. You might also change the scan type from Crawl-and-Audit to Crawl-Only until you are sure the Crawler is discovering and parsing the site well.
-- Habeas Data
Micro Focus Fortify Customers-Only Forums – https://community.softwaregrp.com/t5/Fortify/ct-p/fortify