Nmap Development mailing list archives

Re: [NSE] script to measure the time a website takes to deliver its pages
From: Patrik Karlsson <patrik () cqure net>
Date: Fri, 23 Mar 2012 15:35:00 -0400

On Thu, Mar 8, 2012 at 3:21 AM, Gutek <ange.gutek () gmail com> wrote:


On 08/03/2012 02:01, Fyodor wrote:
On Tue, Mar 06, 2012 at 02:03:57PM +0100, Gutek wrote:

It first uses httpspider to take an initial measurement on each page, then
queries each URL 5 times with an anti-cache trick to compute an average.
I'm not sure about a script name that would be consistent with Nmap's
script terminology, and about the categories as well: it's not DoS as
defined in the script categories, but it's DoS-related...
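The measurement described above can be sketched as follows. This is a minimal Python illustration, not the script's actual (Lua) code; in particular, the assumption that the anti-cache trick is a random query-string parameter is mine:

```python
import random
import string
from urllib.parse import urlsplit, urlunsplit

def cache_bust(url):
    """Append a random query parameter so intermediate caches treat each
    request as a fresh resource (one common form of the trick)."""
    token = "".join(random.choices(string.ascii_lowercase, k=8))
    scheme, netloc, path, query, frag = urlsplit(url)
    extra = "nocache=" + token
    query = query + "&" + extra if query else extra
    return urlunsplit((scheme, netloc, path, query, frag))

def average_time(timings):
    """Average of the per-request timings, in seconds."""
    return sum(timings) / len(timings)

# Each of the 5 requests would go to a distinct cache-busted URL:
print(cache_bust("http://example.com/forum/memberlist.php"))
print(average_time([0.21, 0.19, 0.25, 0.22, 0.23]))
```

In practice the script would time each cache-busted request and feed the elapsed values into the average.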

Hi Gutek, I think this is a clever and useful script!  But I'm
wondering if it would make sense to just test the given page (e.g. the
default of "/" or the one specified by http-chrono.url) by default,
and then do the spidering by request.  Perhaps this could be done by
setting http-chrono.maxpagecount to 1 by default rather than 20.

That way the script gives an estimate of the web server's overall
speed by default (useful for comparing multiple web servers to find
the slow ones), but a user can easily specify a wider scan of many
paths on an individual webserver if desired.


Hi, thanks for the smart advice!

Here is an updated version along those lines, with a better-documented @usage block:
---
-- @usage
-- Without any optional arguments, the script times only the first
-- page, which is root ("/") by default. With this in mind:
-- o If you want a quick test of a bunch of webservers' overall speed
--   without making much noise, try:
--   nmap --script=http-chrono <target1> <target2> <target..>
-- o You may already suspect a resource-intensive page on your targets,
--   say /forum/memberlist.php, because it probably stresses the
--   database:
--   nmap --script=http-chrono --script-args='http-chrono.url="/forum/memberlist.php"' <target1> <target2> <target..>
-- o Once you have identified the slowest one, you may want to analyse
--   all known pages:
--   nmap --script=http-chrono --script-args='http-chrono.maxpagecount=[max number of links to chrono|-1 for any link found within maxdepth]' <target>
--




Sent through the nmap-dev mailing list
Archived at http://seclists.org/nmap-dev/

Nice work. Sorry for taking so long to get around to reviewing it.
I ended up making a few adjustments to this script, adding two different
output formats: a short one used when only a single page is measured, and
a longer tab-based one used when more than one page is tested.
Also, the number of iterations performed while measuring is now exposed
as a script argument, and the script now reports min, max, and average
times. I've committed it as r28333. Let me know what you think.
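The min/max/average reporting with a configurable iteration count can be sketched like this; a hedged Python illustration only, with a made-up `measure` helper rather than the actual r28333 code, and the argument name used there may differ:

```python
def measure(fetch, tries=5):
    """Call fetch() `tries` times; fetch() is assumed to return the
    elapsed time of one request in seconds. Returns (min, max, avg)."""
    samples = [fetch() for _ in range(tries)]
    return min(samples), max(samples), sum(samples) / len(samples)

# Hypothetical example with canned timings instead of real HTTP requests:
timings = iter([0.30, 0.25, 0.35, 0.28, 0.32])
lo, hi, avg = measure(lambda: next(timings), tries=5)
print("min=%.2fs max=%.2fs avg=%.2fs" % (lo, hi, avg))
# prints min=0.25s max=0.35s avg=0.30s
```

The short output form would report just these three numbers for the single page; the tab-based form would print one such row per page.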


Patrik Karlsson
