Skip functionality non-clickable

Dear Team

In the dashboard, there is a feature called Skip.

However, I have never been able to use it: the Skip button is not clickable at all.

My question is: how can it be used?

  • When conducting a sequential crawl and audit, you can skip processing by whichever engine is running (if you selected Test each engine type per session), or you can skip processing of the current session (if you selected Test each session per engine type). See the Help Guide (F1) description of "Scan Settings: Method (sequential behavior)" for more information.

    The default crawl and audit setting is simultaneous, not sequential. This is most likely why the Skip button is not currently available for your scan. Can you verify this under the Current Scan Settings?

     

    Crawl and Audit Mode

    The following options are available:

     

    1. Simultaneously - As WebInspect maps the site's hierarchical data structure, it audits each resource (page) as it is discovered (rather than crawling the entire site and then conducting an audit). This option is most useful for extremely large sites where the content may change before the crawl can be completed.

     

    2. Sequentially - In this mode, WebInspect crawls the entire site, mapping the site's hierarchical data structure, and then conducts a sequential audit, beginning at the site's root.

    If you select Sequentially, you can specify the order in which the crawl and audit should be conducted:

    • Test each engine type per session (engine driven): WebInspect audits all sessions using the first audit engine, then audits all sessions using the second audit engine, continuing in sequence until all engine types have been deployed.

    • Test each session per engine type (session driven): WebInspect runs all audit engines against the first session, then runs all audit engines against the second session, continuing in sequence until all sessions are audited.
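    To make the difference between the two orders concrete, here is a minimal sketch (illustrative only, not WebInspect internals; the engine and session names are hypothetical): the only difference between the two settings is which loop is nested inside the other.

    ```python
    # Hypothetical engine types and crawled sessions, for illustration only.
    engines = ["SQLInjection", "CrossSiteScripting"]
    sessions = ["/", "/login", "/search"]

    # Engine driven ("Test each engine type per session"):
    # the first engine audits every session before the second engine starts.
    engine_driven = [(e, s) for e in engines for s in sessions]

    # Session driven ("Test each session per engine type"):
    # every engine audits the first session before any engine moves to the next.
    session_driven = [(e, s) for s in sessions for e in engines]
    ```

    Both orders cover the same engine/session pairs; they differ only in sequencing, which is what the Skip button steps through.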

     

  • Hi HansEnders,

    Thanks for the response. I scanned some applications using this setting, and it worked fine for several of them.
    However, for big applications I ran into problems. My objective in using Skip was to reduce the scan duration, especially for big applications.

    For an application with 2k crawled sessions (and counting), I tried to skip the crawl sessions and force my scan into audit mode. But every time I clicked Skip, it only skipped to the next crawl session.

    Since the crawl was taking too long, I had to restart the scan using the simultaneous crawl and audit method.

    My questions are:

    1. How could I have skipped the crawl entirely and jumped straight to the audit phase?

    2. What is the best strategy for big applications? Should I go for a List-Driven scan? The challenge I am facing is that the scans are not completing. I had thought of crawling the application using Burp, making a list of the crawled URLs, and running 2-3 scans by dividing the list of URLs. Would this lead to an effective scan, and would it cover the whole application?
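    The URL-splitting step I have in mind could look something like this (a minimal sketch; the chunking helper and sample URLs are hypothetical, not a WebInspect or Burp feature):

    ```python
    def split_url_list(urls, parts):
        """Divide a crawled URL list into `parts` roughly equal,
        order-preserving chunks, one per List-Driven scan."""
        chunk = -(-len(urls) // parts)  # ceiling division
        return [urls[i:i + chunk] for i in range(0, len(urls), chunk)]

    # Stand-in for URLs exported from Burp's site map.
    urls = [f"https://example.com/page{i}" for i in range(7)]
    chunks = split_url_list(urls, 3)
    ```

    Each chunk would then be saved as a separate list file and fed to its own List-Driven scan.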

     

