URL crawling tool

I am looking for a tool to crawl a given URL for security vulnerabilities. I have done a bit of searching and found a few, but most of them either require my dev environment to be publicly reachable (which it is not) or are expensive solutions bigger than what I need right now. I don't need anything super powerful: I'm just doing light testing, and the QA people will run a more complex battery of tests later.

EDIT: Use case to clarify

  • I give the scanning tool a URL, e.g. http://www.host.com/path/to/page.asp
  • It runs a series of tests on these pages to see whether they expose any possible security vulnerabilities. Examples include, but are not limited to, SQL injection, cross-site scripting, etc.


3 answers


Assuming you want to scan your web application by pointing a penetration testing tool at the application's "base" URL, you will find the OWASP Live CD useful. Grendel-Scan, available on the CD, may be the most useful, as it seems to be the most mature of the penetration testing tools on the list; Nikto and the OWASP Wapiti Project are other penetration testing tools on the Live CD.



In addition, the Watcher plugin for Fiddler can also detect certain vulnerabilities in an application, although this requires visiting the individual pages of the application with Fiddler running as a proxy.



There are two kinds of tools for this. One kind carries a list of known problems (a bug in IIS version 5.34, or whatever) and works through that list, probing the site for each one. Tools of this kind also check for common filenames like robots.txt, web.config, etc. Nikto is an example of this type.
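The known-filename part of that first kind of scanner can be sketched in a few lines. This is a minimal illustration, not a real scanner: the path list here is a tiny hypothetical sample, where Nikto ships with thousands of entries.

```python
from urllib.parse import urljoin

# Hypothetical sample of well-known paths a Nikto-style scanner probes for;
# a real tool ships a much larger database.
COMMON_PATHS = ["robots.txt", "web.config", "backup.zip", "admin/"]

def candidate_urls(base_url):
    """Build the URLs a known-file scanner would request against a site."""
    # Ensure the base ends with a slash so urljoin appends instead of replacing.
    if not base_url.endswith("/"):
        base_url += "/"
    return [urljoin(base_url, path) for path in COMMON_PATHS]
```

The scanner would then request each candidate URL and flag any that return something other than a 404.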



There is also a kind that looks at all the querystring / cookie / form parameters and tampers with them to try to trigger errors. I believe this is what will serve you best, and for that I recommend the Burp proxy: http://portswigger.net/proxy/ There is a free version and a pro version. Also in this category are expensive tools like IBM's AppScan and HP's WebInspect.
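To make the second approach concrete, here is a rough sketch of the parameter-tampering idea: take a URL, and for each querystring parameter, produce variants where that one parameter is replaced by a probe payload. The two payloads are token examples; tools like Burp use far larger, smarter lists and also inspect the responses.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative probe payloads: a SQL-injection tickler and an XSS tickler.
PAYLOADS = ["'", "<script>alert(1)</script>"]

def tampered_urls(url):
    """For each querystring parameter, yield a copy of the URL with that
    single parameter's value replaced by each probe payload."""
    scheme, netloc, path, query, frag = urlsplit(url)
    params = parse_qsl(query, keep_blank_values=True)
    for i, (name, _value) in enumerate(params):
        for payload in PAYLOADS:
            mutated = params[:i] + [(name, payload)] + params[i + 1:]
            yield urlunsplit((scheme, netloc, path, urlencode(mutated), frag))
```

A fuzzer would request each mutated URL and look for database error messages or the payload echoed back unescaped.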



Are you talking about scanning a URI that someone has requested from your site?

If so, you can use the .htaccess file to simply redirect to the 404 page any URI that doesn't exist or is not found in the database (depending on how you are building your site).
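A minimal sketch of that .htaccess idea, assuming Apache with mod_rewrite enabled (the 404 page path is a placeholder):

```apache
# Serve our canned 404 page for anything that isn't a real file or directory.
ErrorDocument 404 /404.html
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ - [R=404,L]
```

Checking against a database instead of the filesystem would mean routing these requests to a front controller script that does the lookup itself.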

This way you can force requests to flow in a certain way, and anything that doesn't will automatically receive the canned 404.







