Nmap Development mailing list archives

Re: [NSE] http matching library
From: Patrik Karlsson <patrik () cqure net>
Date: Thu, 16 Jun 2011 07:35:15 +0200


On Jun 16, 2011, at 12:39 AM, Ron wrote:

> hey,
>
> this looks really cool! I disagreed with the concept initially, but I've seen limitations in http-enum.nse myself,
> and I like what you're doing.

Thanks!

> I have two questions...
>
> First, can we in some way combine this with http-enum.nse? If they could share configuration files, that'd be handy.

I think that's most likely possible.

> Second, at some point I'd like to finish writing http-spider.nse. I think this could benefit from a two-way
> relationship - giving http-spider places to look, and using the results from http-spider. Scripts can't have two-way
> dependencies (and shouldn't), though, so I'm not sure what the best conceptual way of doing that is.

Have you read the RFC on "Improving NSE HTTP architecture" that Djalal posted the other day?
If not, I think you should. It contains some great ideas, including some parts about the crawler/spider.
It's over here: http://seclists.org/nmap-dev/2011/q2/967

//Patrik


Ron

On Sat, 30 Apr 2011 08:49:54 +0200 Patrik Karlsson <patrik () cqure net> wrote:
Given the recent increase in http/web scripts, I thought I would
share some work I did a while back. I started working on it
right about the time Ron did his big overhaul of the http-enum
script. [1] My idea was to implement what the http-enum script does
today, but decouple the probes and matches from each other. The
response wasn't very positive at the time.

Anyway, I did some more work on it and ended up creating an http-match
library which pretty much does regexp matching based on rules created
on the fly or loaded from a file. There's a script called http-fp
that implements the decoupled probe-and-match approach on something
similar to http-enum. It does so by loading all the rules (probes and
matches) from a file (nselib/data/urls.txt). Once the probes have run,
matchers are used to process the responses. Not every matcher has to
run for every probe, as matchers can be restricted by url or
category. In addition to regexps, a match can contain Lua code
that will be executed on the http response received from the server.
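To illustrate the decoupled rule idea, here is a standalone sketch in plain Lua. This is not the actual httpmatch API; the field names and the #field_1# placeholder substitution are inferred from the sample rules further down, and the response is a hand-built table rather than a real http response.

```lua
-- Standalone sketch of the decoupled matching idea; NOT the httpmatch
-- API, just an illustration of rules as data separate from probes.
local function escape_pattern(s)
  return (s:gsub("%W", "%%%0"))  -- escape all pattern magic characters
end

-- Evaluate one rule against a response table; returns the expanded
-- desc template on a match, or nil if any condition fails.
local function run_match(rule, response)
  if tostring(response.status) ~= rule.status then return nil end
  local out = rule.desc
  for field, pattern in pairs(rule) do
    local hdr = field:match("^header%.(.+)$")
    if hdr then
      local capture = (response.header[hdr] or ""):match(pattern)
      if not capture then return nil end
      out = out:gsub(escape_pattern("#" .. field .. "_1#"), capture)
    end
  end
  return out
end

local rule = {
  status = "200",
  ["header.x-powered-by"] = "(ASP%.NET)",
  desc = "#header.x-powered-by_1#",
}
local response = { status = 200, header = { ["x-powered-by"] = "ASP.NET" } }
print(run_match(rule, response))  --> ASP.NET
```

The point of the sketch is that the rule is pure data: it can just as well be read from a file like urls.txt as constructed in code.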

To be clear, I'm not suggesting we add this script, but given the
recent increase in small scripts that do different types of matching,
I think someone may find the library useful. Also, please take the
code for what it is (maybe something useful), as it is not as well
documented as what I usually put out to the list, nor as finished as I
would like. In order to make some of the matches I needed, I factored
the cookie code out of http.lua into cookie.lua; I'm attaching this
as well. To better understand how it all works, the http-fp script
may be useful. In order to get it to run, you need to drop the url.txt
into nselib/data/urls.txt and copy both libraries (httpmatch.lua and
cookie.lua) into nselib.

Here are a few sample matches to give you an idea of how it
works:

-- Detect .NET applications
match { status="200", ['header.x-powered-by']="(ASP.NET)",
['header.x-aspnet-version']="(.*)", type="framework",
desc="#header.x-powered-by_1# #header.x-aspnet-version_1#" }

-- Output any cookies set by the application
match { status="200", ['header.set-cookie']="(.*)", type="cookie",
desc="#header.set-cookie_1#" }

-- Detect WordPress (note: "\<" is an invalid Lua escape; "<" needs no
-- escaping in a string literal)
match { status="200", body="<meta name=\"generator\" content=\"(WordPress.-)\"",
	type="app", desc="#body_1#" }

-- Output contents of robots file
match { path="^/robots.txt$", status="200", type="additional", desc=
	function(r)
		local tbl = stdnse.strsplit("\r?\n", r.body)
		tbl.name = "Robots content"
		return tbl
	end
}

-- Check whether the session cookie is assigned as HttpOnly
match { status="200", type="debug", ['header.set-cookie']='.*', desc=
	function(r)
		local cookies = cookie.parse(r.header['set-cookie'])
		local result = {}

		for _, cookie in ipairs(cookies) do
			if ( cookie:isSessionCookie() and not( cookie:isHttpOnly() ) ) then
				table.insert( result, ("OWASP-SM-002: Cookie (%s) is not set as HttpOnly"):format( cookie:getName() ) )
			end
		end
		return result
	end
}

-- Check if the cookie was assigned with the secure attribute
match { status="200", type="debug", ['header.set-cookie']='.*', desc=
	function(r)
		local cookies = cookie.parse(r.header['set-cookie'])
		local result = {}

		for _, cookie in ipairs(cookies) do
			if ( options.ssl and not( cookie:isSecure() ) ) then
				table.insert( result, ("OWASP-SM-002: Cookie (%s) is not set as secure"):format( cookie:getName() ) )
			end
		end
		return result
	end
}


-- Calculate SHA1 hash of the response body
match { status="200", type="additional", desc=
	function(r)
		return "SHA1 hash: " .. select(2, bin.unpack("H20", openssl.sha1(r.body)))
	end
}
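The samples above restrict matches by path, status, and headers, but none shows the category restriction mentioned earlier. Assuming the rule file accepts a category field (the field name here is my guess at the syntax, not confirmed by the library), a restricted rule might look like:

```lua
-- Hypothetical syntax: the 'category' field name is an assumption,
-- limiting this match to responses from probes in that category.
match { status="200", category="cms",
	body="<meta name=\"generator\" content=\"(WordPress.-)\"",
	type="app", desc="#body_1#" }
```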

//Patrik

[1] http://seclists.org/nmap-dev/2010/q4/112


_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/

--
Patrik Karlsson
http://www.cqure.net
http://www.twitter.com/nevdull77


