From: George Chatzisofroniou <sophron () latthi com>
Date: Mon, 26 Aug 2013 15:48:36 +0300
The attached script tries to detect the web development framework behind the
target website. Currently supported tools are: Django, Ruby on Rails, ASP.NET,
CodeIgniter, CakePHP, Symfony, WordPress, Joomla, Drupal and MediaWiki.
The script checks for certain defaults that might not have been changed, such
as common headers, URLs, or HTML content. The fingerprint data lives in a
separate file; each record in it defines two callbacks:
rapidDetect - Called by the main script at the beginning of the detection
process. It takes the host and port of the target website as arguments.
consumingDetect - Called for each spidered page. It takes the body of the
response (the HTML code) and the requested path as arguments.
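As a rough sketch of what such a record might look like (the checked URL,
patterns, and return values here are illustrative assumptions, not copied from
the actual fingerprint file; `http.get` is Nmap's NSE HTTP library call):

```lua
-- Hypothetical WordPress fingerprint record; paths and patterns
-- are illustrative only.
local wordpress = {
  rapidDetect = function(host, port)
    -- Phase 1: one cheap, targeted request made before any crawling.
    local response = http.get(host, port, "/wp-login.php")
    if response and response.status == 200 and response.body and
       response.body:find("wp-submit", 1, true) then
      return "WordPress"
    end
    return nil
  end,
  consumingDetect = function(body, path)
    -- Phase 2: called once per spidered page with its HTML body.
    if body and body:find('content="WordPress', 1, true) then
      return "WordPress (found on " .. path .. ")"
    end
    return nil
  end,
}
```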
The idea is to split the detection process into two phases. The first runs at
the beginning, where the script performs specific checks (for example, whether
an 'anti-csrf' header exists for Django, or whether a /wp-admin/ page exists
for WordPress). The second runs afterwards, where the script crawls the
website and searches for specific patterns in the headers or the HTML code.
Both phases are enabled by default; turning the 'rapid' option on disables the
crawling phase.
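The two-phase flow described above could be sketched like this (illustrative
only, not the script's actual code; `crawl` is a hypothetical stand-in for the
spidering logic):

```lua
-- Illustrative driver: run every record's rapidDetect first,
-- then crawl only if the 'rapid' option is off.
local function detect(host, port, fingerprints, rapid)
  -- Phase 1: cheap, targeted checks against known defaults.
  for name, record in pairs(fingerprints) do
    local found = record.rapidDetect(host, port)
    if found then return found end
  end

  if rapid then return nil end  -- 'rapid' set: skip the crawl entirely

  -- Phase 2: spider the site and pattern-match each page's body.
  for _, page in ipairs(crawl(host, port)) do
    for name, record in pairs(fingerprints) do
      local found = record.consumingDetect(page.body, page.path)
      if found then return found end
    end
  end
  return nil
end
```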
I've tested it against a lot of websites and it seems to work well. It won't
always identify the underlying tool, but it often will; this mostly depends on
the framework in use. For example, Django or RoR is easier to detect than
ASP.NET or CakePHP.
Also note that, while the script does some educated guessing, there is no
reliable way to determine every technology a given site is using.
You can try it by yourself like this:
./nmap -p80 -n -Pn --script http-devframework some-random-page.com -d1
Let me know what you think,
Sent through the dev mailing list
Archived at http://seclists.org/nmap-dev/