Nmap Development mailing list archives

Re: Web App Scanner - GSoC 2009


From: Fyodor <fyodor () insecure org>
Date: Mon, 30 Mar 2009 00:05:19 -0700

On Sat, Mar 28, 2009 at 02:31:56PM -0000, Rob Nicholls wrote:
> > This sounds like it would make a couple good NSE scripts.
> 
> I've attached a script of mine that only checks a few files and folders at
> the moment. It would need to be expanded over time, especially with more
> advanced checks for servers that return 200 for non-existent files (although
> that might just be a "nice to have") and for identifying folders that allow
> directory listings. I thought I'd share it, and perhaps others will have
> some suggestions/improvements in mind.

Nice.  Thanks Rob.
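
For anyone who hasn't written one, here is a minimal sketch of the
path-probing idea as an NSE script. The path list, output format, and
structure are illustrative only (this is not Rob's attached script); it
assumes the stock http and shortport NSE libraries.

-- Minimal illustrative sketch of a path-probing NSE script (not Rob's script).
-- Requests a short list of common paths and reports any that return 200 OK.
local http = require "http"
local shortport = require "shortport"

description = [[Probes a handful of common files and folders over HTTP.]]
categories = {"discovery", "safe"}

-- Run against any port that looks like an HTTP service.
portrule = shortport.http

-- Hypothetical sample paths; a real script would use a much larger database.
local paths = { "/robots.txt", "/admin/", "/backup/", "/test/" }

action = function(host, port)
  local found = {}
  for _, path in ipairs(paths) do
    local response = http.get(host, port, path)
    if response and response.status == 200 then
      table.insert(found, path .. " (200 OK)")
    end
  end
  if #found > 0 then
    return table.concat(found, "\n")
  end
end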

> I've used quite a few web application tools over the last few years, and
> I'm not sure that a command line tool like Nmap would be the best place to
> add some of the suggested functionality.
> 
> Tools like Nikto certainly have their use, and perhaps we can produce a
> similar NSE script so Nmap can match most of the functionality (and
> possibly licence CIRT's database?). But for "serious" web application
> testing I'd personally want a dedicated tool that lets me script logins,
> complete forms interactively in a browser, follow JavaScript links, test
> sites that use multiple servers/subdomains, and manually crawl a website
> in a browser; that doesn't produce too many false positives; and (most
> importantly) that lets me see the request and response when I'm writing
> a report.

I think both tools are extremely valuable.  Experts with time on their
hands will want a tool like the one you mentioned, with all that
interactive browsing functionality.  But there is also a lot of value
in an automated tool (Nmap+NSE) which can learn and report on as much
interesting stuff as it can without requiring any extra work on your
part beyond reading the report at the end and maybe (for advanced
users) specifying some script arguments when it starts.

> Being able to compare the results against some form of vulnerability
> database sounds good, but medium-large sites often return several
> gigabytes' worth of traffic, which might not be healthy for Nmap (or the
> host) to hold in memory and could be tricky to store in XML files etc.

Yeah, that is something to deal with.  I think it may work like the
Nmap brute force scripts: we set reasonably small default resource
limits, but the user can increase them if he wants a more intense
scan.
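
As a rough sketch of that approach (the argument name and default below
are hypothetical, not existing Nmap options), a crawling script could
enforce a modest byte budget that the user can raise via --script-args:

-- Sketch of a user-overridable resource cap for a hypothetical crawling script.
-- The argument name "webcrawl.maxbytes" is illustrative, not a real Nmap option.
local nmap = require "nmap"

local DEFAULT_MAX_BYTES = 1024 * 1024  -- keep the default budget small (1 MB)

local function max_bytes()
  local arg = nmap.registry.args["webcrawl.maxbytes"]
  return tonumber(arg) or DEFAULT_MAX_BYTES
end

-- Inside the fetch loop, stop once the budget is exhausted:
--   fetched = fetched + #response.body
--   if fetched >= max_bytes() then break end

A user wanting a more intense scan could then raise the limit with
something like: nmap --script webcrawl --script-args webcrawl.maxbytes=104857600 <target>
(the script name here is also hypothetical).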

> A separate tool, similar to how Zenmap and Ncat have been developed by Nmap
> developers, might be better than trying to extend Nmap itself, so results
> can be managed with a GUI and data stored in a database.

Separate tools are great too.  But I don't think they obviate the need
for better web functionality in NSE.

Cheers,
-F

_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://SecLists.Org

