Nmap Development mailing list archives

RE: Storing scan results
From: "Grodås, Ole Morten" <omgrodaas () fih mil no>
Date: Mon, 20 Jun 2005 20:25:22 +0200


I have to agree that adding db support to nmap would be a problem. After reading your mail I got another idea that might 
work around the db problem. 

Having the GUI frontend read XML output from nmap and save it in a db is a good solution, except when you want to run 
regular scans from cron.d. This problem can be solved by making a small nmap2db command line tool, using the same 
XML-to-db code that is used in the GUI frontend.
You could then run nmap commands like this:

nmap [scan options] target | nmap2db -localdb
nmap [scan options] target | nmap2db -h [hostname] -u [username] -p [password] -d[database]
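The nmap2db tool proposed above does not exist; a minimal sketch of what the "-localdb" case could look like is below, in Python for illustration (the perl/C++ choice is debated later in the thread). The table layout and filenames are assumptions, not anything nmap defines:

```python
import sqlite3
import xml.etree.ElementTree as ET

def store_scan(xml_text, conn):
    """Parse nmap XML output (-oX -) and insert one row per (host, port).
    The 'results' table layout is a hypothetical example, not a standard."""
    conn.execute("""CREATE TABLE IF NOT EXISTS results
                    (ip TEXT, port INTEGER, proto TEXT, state TEXT)""")
    root = ET.fromstring(xml_text)
    for host in root.findall("host"):
        addr = host.find("address").get("addr")
        ports = host.find("ports")
        if ports is None:
            continue
        for port in ports.findall("port"):
            conn.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
                         (addr, int(port.get("portid")),
                          port.get("protocol"),
                          port.find("state").get("state")))
    conn.commit()

# Usage from a pipe would look like (commented out so the sketch imports cleanly):
#   nmap -oX - target | python nmap2db.py
# i.e.  store_scan(sys.stdin.read(), sqlite3.connect("scans.db"))
```

Reading from stdin is what makes the pipe-style invocation above possible; any real tool would also need the remote-db flags (-h/-u/-p/-d) mapped onto a proper DB driver.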

My opinion is that if you want the GUI to have advanced search, sort and compare functions, the best solution will be to 
save results in a database.

You talk about the problems with a static database structure. In my opinion this is not a big problem. Adding new 
metadata will of course be more problematic, but this is not something that happens very often in nmap. I have created 
a suggestion for the database design; a UML diagram can be found here: 

-----Original Message-----
From: Adam Jones [mailto:ajones1 () gmail com]
Sent: 17 June 2005 15:53
Cc: nmap-dev () insecure org
Subject: Re: Storing scan results


I think the better way to handle it is to make sure that any changes
to nmap assist in the XML-to-database conversion. Although I like the
idea of a database as an output format, unless you standardize on one
table structure you will be going beyond the scope of what (IMO) nmap
does. A database upload tool that allows arbitrary structure and
supports whatever metadata or key relationships the user desires is a
huge project in and of itself. There is no easy way to allow the user
to tell nmap exactly how to store the data in tables, maintain the
relationships correctly, and collect whatever other data is necessary
to perform the database work.

The other end of the spectrum is using a standard table structure,
much the way that nmap XML output is standardized. The problem here is
that XML can be said to be inherently transformable, whereas a database
is a more rigid structuring of data. A standard XML document is no big
deal; you are generally a small matter of XSL work away from the
correct document, especially when you consider that another XML file
can be used to supply metadata.

To that end I think the best way to handle this is to make nmap as
database friendly as possible. I would vote against having the tool
perform uploads to the database, as on one end we could not provide
enough support, and on the other there are better tools for the job.

Possibly nmap could have flags added to specify xsl translations after
the scan has finished. This would let nmap spit out data pre-formatted
into the most convenient structure for whatever database people are
working with.
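No such translation flag exists in nmap today; the post-processing step it describes can be sketched outside nmap. The example below uses Python's stdlib ElementTree (rather than actual XSL, which the stdlib lacks) to flatten nmap XML into row-oriented CSV, the kind of shape a bulk loader expects; the column choice is an assumption:

```python
import csv
import io
import xml.etree.ElementTree as ET

def flatten_scan(xml_text):
    """Flatten nmap XML into one (ip, port, proto, state) tuple per port,
    a row-oriented shape convenient for a database bulk import."""
    rows = []
    for host in ET.fromstring(xml_text).findall("host"):
        addr = host.find("address").get("addr")
        ports = host.find("ports")
        if ports is None:
            continue
        for port in ports.findall("port"):
            rows.append((addr, port.get("portid"), port.get("protocol"),
                         port.find("state").get("state")))
    return rows

def to_csv(rows):
    """Serialize the flattened rows as CSV text for a loader like
    SQLite's .import or PostgreSQL's COPY."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()
```

An actual XSL stylesheet driven by xsltproc would do the same job declaratively, which is closer to what the paragraph above proposes.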

On 6/15/05, "Grodås, Ole Morten" <omgrodaas () fih mil no> wrote:
Hi Anthony

While I do see that perl is a well suited language to use for this XML-to-database conversion, it is my opinion that 
this should be handled by nmap and written in C++. I think it would be a mistake to make nmap dependent on perl only to 
simplify this conversion. Including this as a function in nmap will give us a standardized way to save scan results 
in databases. You then have the opportunity to use whatever tool you would like to do analysis on these results.

-----Original Message-----
From: Anthony Persaud [mailto:apersaud () gmail com]
Sent: 15 June 2005 17:59
To: soc () insecure org; nmap-dev () insecure org; Grodås, Ole Morten
Subject: RE: Storing scan results

For administrators who end up writing scripts in perl (for
crons etc.) to store and analyze nmap scans, there is an Nmap::Parser
Perl module that will either parse the XML output file from nmap
(parsefile()) or perform a scan of its own and parse it.

You can use this module to manipulate the data however you want. I
have written an initial script (quick n' dirty) that will be
available in the next release of the module (fully documented, with
more features etc.).

It uses SQLite as the database interface and stores (or updates) the
entries of the table with IP, MAC, Status, Hostname, Open TCP Ports,
Filtered TCP Ports, OS Family, OS Gen and time of scan.
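The script itself is Perl; an illustrative mirror of its store-or-update behavior in Python/sqlite3 is below. The column names are assumptions based on the fields listed above, not the module's actual schema:

```python
import sqlite3
import time

# Hypothetical table mirroring the fields the script is described as storing.
SCHEMA = """CREATE TABLE IF NOT EXISTS hosts (
    ip TEXT PRIMARY KEY, mac TEXT, status TEXT, hostname TEXT,
    open_tcp TEXT, filtered_tcp TEXT, os_family TEXT, os_gen TEXT,
    scan_time INTEGER)"""

def upsert_host(conn, host):
    """Insert a host row, or replace it if this IP was seen in an
    earlier scan (the 'stores or updates' behavior described above)."""
    conn.execute(SCHEMA)
    conn.execute("INSERT OR REPLACE INTO hosts VALUES (?,?,?,?,?,?,?,?,?)",
                 (host["ip"], host.get("mac"), host["status"],
                  host.get("hostname"), host.get("open_tcp"),
                  host.get("filtered_tcp"), host.get("os_family"),
                  host.get("os_gen"), int(time.time())))
    conn.commit()
```

Keying on IP means repeated cron-driven scans keep one current row per host rather than accumulating duplicates, which matches the update behavior described.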

Hopefully others can find this useful. (Note that this is the initial
version of the script, but it should work.) For more info:
http://www.nmapparser.com or


Anthony Persaud

Sent through the nmap-dev mailing list

