Alternative for Burp Spider


I was going through the Burp lab in the PTS course and saw that some features have been removed in newer versions of Burp Suite, particularly the ‘Spider’ feature. Rather than installing an older version of Burp, what alternatives can I use?

Yes, rather than installing an older version you can use one of the many tools available on the internet. I would personally recommend OWASP ZAP. You can find more information at the links below.

OWASP ZAP - https://www.zaproxy.org/
Spider feature - see the OWASP ZAP documentation
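If you prefer something scriptable, ZAP can also be driven headlessly through its REST API. This is just a minimal sketch, assuming a local ZAP daemon on port 8090 with the API key set to changeme (the host, port, key and target are all placeholders; adjust them to your setup):

# start ZAP in daemon (headless) mode
zap.sh -daemon -port 8090 -config api.key=changeme
# kick off the spider against the target
curl "http://127.0.0.1:8090/JSON/spider/action/scan/?apikey=changeme&url=http://www.domaina.com"
# poll progress (0-100) using the scan id returned by the call above
curl "http://127.0.0.1:8090/JSON/spider/view/status/?apikey=changeme&scanId=0"
# list the URLs the spider has found
curl "http://127.0.0.1:8090/JSON/spider/view/results/?apikey=changeme&scanId=0"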


Hi!
There is already a topic in the forum that could help you: Completing the exam without Burp? :grin:


I completely forgot to search the forum first. Thank you for your replies @parzival & @Z3r0n37. I will make sure not to create duplicate topics next time.


No problem!
If you want, you can flag my response as the solution; that way the post will be labeled as solved! :grin:

~Z3r0n37


Hi Arche,

If you do want to use Burp Suite, then you could try spidering with skipfish. Below I’ve copied the notes I made on this previously.

Spidering / Crawling a Website Through Burp Suite

Crawling is now a Pro-version feature only, so you need to use a separate crawler; skipfish works well for this. Since skipfish does not support proxies, you must configure Burp as an invisible proxy.

Then force skipfish to resolve the host to the proxy address.

skipfish -F www.domaina.com=127.0.0.1 -O -o outputdir http://www.domaina.com

(Note: skipfish will submit LOTS of form requests by default; the -O flag in the command above suppresses form submission, which is what you want for plain spidering.)

If the website URL uses an IP address, you will also need to configure the Burp proxy listener to redirect the traffic to the correct host.

[Screenshot: Burp proxy listener redirect configuration]

skipfish -W -Y -O -o outputdir http://127.0.0.1

Additional skipfish notes
Throttling requests: if you are worried about overloading the server, you can throttle the number of requests per second using the -l option. For example, “-l 10” forces skipfish to send no more than 10 requests per second.
In addition, you can lower the number of simultaneous connections using the -m parameter.
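Putting those together with the proxied scan from earlier, a throttled run might look like this (the limits of 10 requests per second and 5 simultaneous connections are just example values):

# same proxied spider as above, but rate-limited to be gentler on the server
skipfish -F www.domaina.com=127.0.0.1 -O -l 10 -m 5 -o outputdir http://www.domaina.com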

Limiting Scope of Spidering
-I string - only follow URLs matching ‘string’
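As a rough example, to keep the crawl inside a single application path (the /app/ path here is purely illustrative):

# only follow URLs containing the given string
skipfish -I http://www.domaina.com/app/ -O -o outputdir http://www.domaina.com/app/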

Dictionary management options:
-W wordlist - use a specified read-write wordlist (-W on its own disables use of the dictionary attack)
-S wordlist - load a supplemental read-only wordlist
-L - do not auto-learn new keywords for the site (disables bruteforcing… I think)
-Y - do not fuzz extensions in directory brute-force
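As a sketch of how these combine for a quieter spidering run (the dictionary path is an assumption based on where Kali typically installs the bundled skipfish wordlists; adjust it for your system):

# load the bundled minimal list read-only, save newly learned keywords to a fresh file,
# and skip extension fuzzing during the directory brute-force
skipfish -S /usr/share/skipfish/dictionaries/minimal.wl -W new-keywords.wl -Y -O -o outputdir http://www.domaina.com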
