The Target tool gives you an overview of your target application's content and functionality, and lets you drive key parts of your testing workflow. The key steps that are typically involved in using the Target tool are described below.
First, map the target application manually. To do this, carry out the following steps:
This manual mapping process populates the Target site map with all of the content requested via the Proxy, together with (via passive spidering) any further content that can be inferred from application responses (links, forms, etc.). It builds up a fairly complete record of all the visible application content in the site map, and also fully familiarizes you with the application.
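The idea behind passive spidering can be sketched in a few lines: parse each response body and record the link and form targets it refers to, without requesting them. This is a simplified illustration using Python's standard library, not Burp's actual implementation.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects link and form targets from an HTML response body."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.found = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Anchors and forms are the most common sources of inferred content.
        if tag == "a" and attrs.get("href"):
            self.found.add(urljoin(self.base_url, attrs["href"]))
        elif tag == "form" and attrs.get("action"):
            self.found.add(urljoin(self.base_url, attrs["action"]))

def infer_content(base_url, html):
    """Return the set of absolute URLs inferred from one response."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.found
```

URLs gathered this way appear in the site map as discovered-but-unrequested content until you actually visit them.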
Note: Provided you have sufficient time, mapping applications manually in this way is generally much safer and more effective than moving directly to automated crawling techniques. It gives you direct control over what areas of the application are mapped, and what requests are made. If you encounter some "dangerous" administrative functionality (for example, functionality that deletes user accounts or modifies critical application settings), you can proceed with caution and avoid any unintended consequences. Further, working manually enables you to sense-check your progress within the browser, to ensure that multi-stage processes are correctly completed, that input validation routines are satisfied, etc.
When the initial application mapping is complete, this is a good time to define your Target scope, by selecting branches within the site map and using the "Add to scope" / "Remove from scope" commands on the context menu. You can then configure suitable display filters on the site map and Proxy history, to hide from view any items that you are not currently interested in.
Review the site map for any items in your target that have been detected via passive spidering but have not yet been requested. These items are shown in gray in the site map. You can also quickly locate unrequested items by selecting the whole application in the tree view, and sorting the table view on the "Time requested" column (by clicking the column header) - unrequested items will then be grouped together. You should manually review these items (for example, by copying each URL into your browser) to confirm whether they contain any further interesting content.
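The effect of sorting on the "Time requested" column can be modelled as follows: entries with no request time (found only via passive spidering) sort together ahead of everything else. This is a conceptual sketch, not Burp's internal data model.

```python
def group_unrequested(site_map_entries):
    """Sort site-map entries so unrequested items group together.

    Each entry is a (url, time_requested) pair, where time_requested
    is None for items that were discovered but never requested.
    Unrequested items sort first, then requested items by time.
    """
    return sorted(site_map_entries,
                  key=lambda e: (e[1] is not None, e[1] or 0))
```

Working through the unrequested group first is a quick way to make sure no inferred content goes unexamined.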
If the application is very large, or you want to speed up the mapping process, you can perform automated spidering to fill out the site map's content. You should use this technique with caution, as it gives you less direct control over what gets requested.
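At its core, automated spidering is a breadth-first crawl with safeguards. The sketch below shows the minimum controls worth having — a scope check before queueing any URL, and a hard cap on request count. The `fetch` and `extract_links` callables are placeholders for real HTTP and HTML-parsing code.

```python
from collections import deque

def crawl(start_url, fetch, extract_links, in_scope, max_requests=100):
    """Breadth-first crawl sketch.

    fetch(url) -> response body; extract_links(url, body) -> iterable
    of URLs; in_scope(url) -> bool. The scope check and the request
    cap are the two safeguards that limit what gets requested.
    """
    seen = {start_url}
    queue = deque([start_url])
    visited = []
    while queue and len(visited) < max_requests:
        url = queue.popleft()
        body = fetch(url)
        visited.append(url)
        for link in extract_links(url, body):
            if link not in seen and in_scope(link):
                seen.add(link)
                queue.append(link)
    return visited
```

Even with these controls, a crawler will blindly submit forms and follow state-changing links, which is exactly why the manual approach above is preferred when time allows.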
Having mapped all of the application's visible content (i.e. that which can be observed by browsing the application and following all links), you can optionally carry out some automated actions to identify further "hidden" content that is not linked from visible content:
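One common technique for finding unlinked content is probing a wordlist of likely file and directory names and flagging anything that does not return "not found". A minimal sketch, with `fetch_status` standing in for real HTTP requests:

```python
def discover_hidden(base_url, wordlist, fetch_status):
    """Probe common file/directory names under the base URL.

    fetch_status(url) -> HTTP status code. Anything other than 404 is
    worth a closer look; real tools also fingerprint the application's
    actual "not found" response, since some applications return 200
    even for missing resources.
    """
    hits = []
    for name in wordlist:
        url = base_url.rstrip("/") + "/" + name
        if fetch_status(url) != 404:
            hits.append(url)
    return hits
```

Other approaches in the same spirit include mutating observed filenames (e.g. probing for `.bak` copies) and following clues in robots.txt or HTML comments.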
When you are satisfied that you have mapped all of the application's content and functionality, you should review the contents of the site map (together with the Proxy history) to understand the attack surface that the application exposes. You can use the following site map features to support this task:
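A useful way to think about the attack surface is as a mapping from each endpoint to the distinct input points it accepts, since every parameter is a potential injection point. The sketch below does this for query-string parameters only; a real assessment would also count body parameters, cookies and headers.

```python
from urllib.parse import urlparse, parse_qsl

def attack_surface(urls):
    """Summarise observed URLs as {path: set of parameter names}.

    Gives a rough per-endpoint view of how many distinct inputs
    the application accepts.
    """
    surface = {}
    for url in urls:
        parts = urlparse(url)
        params = surface.setdefault(parts.path, set())
        params.update(name for name, _ in parse_qsl(parts.query))
    return surface
```

Sorting this summary by parameter count is one quick heuristic for deciding where to focus detailed testing first.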
Having fully mapped the application and assessed its attack surface, you can drive your detailed vulnerability testing workflow from the site map:
This release introduces a new scan check for second-order SQL injection vulnerabilities. In situations where Burp observes stored user input being returned in a response, Burp Scanner now performs its usual logic for detecting SQL injection, with payloads supplied at the input submission point, and evidence for a vulnerability detected at the input retrieval point.
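The separation of submission point and retrieval point is what distinguishes second-order testing from ordinary injection testing. A boolean-based version of the logic can be sketched as below; the `submit`/`retrieve` callables and the fake application in the usage example are illustrative stand-ins, not Burp Scanner's actual mechanism.

```python
def check_second_order_sqli(submit, retrieve, true_payload, false_payload):
    """Boolean-based sketch of a second-order SQL injection check.

    submit(value) stores attacker input (the submission point);
    retrieve() returns the response where that stored input is later
    used (the retrieval point). If two payloads that differ only in a
    SQL truth condition produce different retrieval responses, the
    stored input is likely reaching a SQL query.
    """
    submit(true_payload)       # e.g. "x' AND '1'='1"
    resp_true = retrieve()
    submit(false_payload)      # e.g. "x' AND '1'='2"
    resp_false = retrieve()
    return resp_true != resp_false
```

The key point the sketch captures is that the payload and the evidence appear at different points in the application, so both must be exercised in sequence for the vulnerability to be observable.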