Burp Spider Options

This tab contains settings for the basic crawler, passive spidering, form submission, application login, the Spider engine, and HTTP request headers.

Crawler Settings

These settings control the way the Spider crawls for basic web content:
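As a rough illustration of one typical crawler control, the sketch below bounds how far a crawl follows links from its starting page, in the spirit of a maximum link depth setting. This is an illustrative sketch only, not Burp's implementation; the depth limit and target URL are hypothetical, and the third-party requests and beautifulsoup4 packages are assumed.

    # Illustrative depth-limited crawl (not Burp's implementation).
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    MAX_DEPTH = 2  # hypothetical analogue of a "maximum link depth" setting

    def crawl(url, depth=0, seen=None):
        seen = set() if seen is None else seen
        if depth > MAX_DEPTH or url in seen:
            return
        seen.add(url)
        resp = requests.get(url, timeout=10)
        print("  " * depth + url, resp.status_code)
        # Follow each discovered link one level deeper.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            crawl(urljoin(url, a["href"]), depth + 1, seen)

    crawl("http://example.test/")  # hypothetical start URL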

Passive Spidering

Passive spidering monitors traffic through Burp Proxy to update the site map without making any new requests. This enables you to map out an application's content and functionality in a very controlled way using your browser.
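Passive spidering depends on your browser's traffic being routed through Burp Proxy, which listens on 127.0.0.1:8080 by default. As a minimal sketch, the example below configures an HTTP client the same way, so that every request it makes becomes visible to the passive spider; the target URL is hypothetical and the third-party requests package is assumed.

    import requests

    # Burp Proxy's default listener; passive spidering observes whatever
    # traffic passes through it.
    burp_proxy = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }

    # verify=False is only needed for HTTPS targets, because Burp re-signs
    # TLS traffic; in practice you would trust Burp's CA certificate instead.
    resp = requests.get("http://example.test/", proxies=burp_proxy, verify=False)
    print(resp.status_code, len(resp.content))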

The following options are used to control passive spidering:

Form Submission

These settings control whether and how the Spider submits HTML forms. Simply following linked URLs achieves only limited coverage of most applications; to discover all of an application's content and functionality, it is generally necessary to submit forms using realistic inputs.
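As an illustration of the kind of input guessing that such rules automate, here is a minimal sketch that fills a form's text fields with plausible values chosen by field name. The URL, field-name heuristics, and the third-party requests and beautifulsoup4 packages are assumptions for the example, not Burp's own matching logic.

    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical realistic defaults, keyed on common field-name fragments.
    GUESSES = {"email": "test@example.com", "name": "Test User", "phone": "5551234567"}

    url = "http://example.test/contact"  # hypothetical page containing a form
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    form = soup.find("form")  # assumes the page has at least one form

    data = {}
    for field in form.find_all("input", attrs={"name": True}):
        name = field["name"]
        # Pick a plausible value based on the field name, else a generic token.
        data[name] = next((v for k, v in GUESSES.items() if k in name.lower()), "555")

    # Submit to the form's action URL (assuming a POST form).
    print(requests.post(urljoin(url, form.get("action", url)), data=data).status_code)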

The following options are available:

Application Login

These settings control how the Spider submits login forms.

Because of the role that authentication plays in web applications, you will often want Burp to handle login forms differently from ordinary forms. Using this configuration, you can tell the Spider to perform one of four different actions when a login form is encountered:

Note that you can also use the suite-wide session handling rules to deal with authentication while performing automated spidering. If you use session handling rules to maintain a valid session with the application, then you should configure the Spider not to submit login forms, to avoid disrupting your session. 
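As a sketch of that approach, the example below logs in once and then reuses the authenticated session for subsequent requests instead of re-submitting the login form. The URL, field names, and credentials are hypothetical, and the third-party requests package is assumed.

    import requests

    session = requests.Session()

    # Submit the login form once with known-good (hypothetical) credentials;
    # the session object stores any cookies the application sets.
    session.post(
        "http://example.test/login",
        data={"username": "tester", "password": "s3cret"},
    )

    # Later requests reuse the session cookie, so crawling proceeds without
    # re-submitting the login form and disrupting the session.
    resp = session.get("http://example.test/account")
    print(resp.status_code)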

Spider Engine

These settings control the engine used for making HTTP requests when spidering. The following options are available:

Careful use of these options lets you fine-tune the Spider engine, depending on the performance impact on the application and on your own processing power and bandwidth. If you find that the Spider is running slowly, but the application is performing well and your own CPU utilization is low, you can increase the number of threads to make spidering proceed faster. If you find that connection errors are occurring, that the application is slowing down, or that your own computer is locking up, you should reduce the thread count, and perhaps increase the number of retries on network failure and the pause between retries.
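The sketch below mimics the knobs just described (a thread count, a retry limit for network failures, and a pause between retries) using a simple thread pool. It illustrates the trade-offs rather than Burp's actual engine; the URLs are hypothetical and the third-party requests package is assumed.

    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    THREADS = 10         # raise if the target and your own machine have headroom
    RETRIES = 3          # retries on network failure
    PAUSE_SECONDS = 2.0  # pause between retries

    def fetch(url):
        for attempt in range(1 + RETRIES):
            try:
                return url, requests.get(url, timeout=10).status_code
            except requests.RequestException:
                if attempt < RETRIES:
                    time.sleep(PAUSE_SECONDS)  # back off before retrying
        return url, None  # gave up after exhausting all retries

    urls = ["http://example.test/page%d" % i for i in range(50)]  # hypothetical
    with ThreadPoolExecutor(max_workers=THREADS) as pool:
        for url, status in pool.map(fetch, urls):
            print(url, status)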

Request Headers

These settings control the request headers used in HTTP requests made by the Spider.

You can configure a custom list of headers to be used in Spider requests. This may be useful to meet the specific requirements of individual applications: for example, to emulate the expected User-Agent when testing applications designed for mobile devices.
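For illustration, the sketch below sends a request with a custom header list that emulates a mobile Safari User-Agent. The header values and URL are examples only, and the third-party requests package is assumed.

    import requests

    # Example headers emulating an iOS Safari browser; substitute whatever
    # the target application expects.
    mobile_headers = {
        "User-Agent": (
            "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) "
            "AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 "
            "Mobile/11A465 Safari/9537.53"
        ),
        "Accept-Language": "en-GB,en;q=0.8",
    }

    resp = requests.get("http://example.test/", headers=mobile_headers)
    print(resp.status_code)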

The following options are also available:
