Nmap Development mailing list archives

Re: NSE: http-phpself-xss - Finds PHP files with reflected cross site scripting vulns due to unsafe use of the variable $_SERVER[PHP_SELF]
From: Paulino Calderon <paulino () calderonpale com>
Date: Tue, 05 Jun 2012 10:35:13 -0500

I've asked Patrik, and this discussion has come up several times. The decision was to keep separate crawlers and rely on the HTTP caching system. The main reasons were:

-Each spidering script is likely to need a different subset of pages.
-With a single crawler we would have to crawl everything, which might take a long time; limiting the crawl could also mean losing data.

It is worth noting that the HTTP caching system does not work for pipelined requests, so when several spidering scripts run we should expect some requests to be repeated.
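The per-script crawler pattern described above can be sketched roughly as follows. This is an illustrative outline only, using names from the httpspider NSE library (Crawler:new, crawl, set_timeout); the body-matching check and option values are made up for the example, not taken from any shipped script:

```lua
-- Minimal sketch of a spidering script with its own crawler instance.
local httpspider = require "httpspider"
local shortport = require "shortport"
local stdnse = require "stdnse"

portrule = shortport.http

action = function(host, port)
  -- Each script creates a separate crawler; repeated fetches across
  -- scripts are expected to hit nmap's HTTP cache, not the network.
  local crawler = httpspider.Crawler:new(host, port, nil,
                                         { scriptname = SCRIPT_NAME })
  crawler:set_timeout(10000)

  local results = {}
  while true do
    local status, r = crawler:crawl()
    if not status then break end
    -- Per-page check goes here (hypothetical pattern for illustration).
    if r.response and r.response.body
       and r.response.body:match("PHP_SELF") then
      table.insert(results, tostring(r.url))
    end
  end
  return stdnse.format_output(true, results)
end
```

Because each script walks the site independently, the cache is what keeps N spidering scripts from multiplying network traffic by N, except for pipelined requests as noted above.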

Cheers.

On 01/06/2012 02:11 p.m., King Thorin wrote:
Would there be a way or would it make sense to implement a method by which HTTP scripts can hook into a single crawler 
and test things page by page in order to avoid crawling/spidering the same content for all (or each selected) HTTP 
script over and over again?
                                                                                
_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/


--
Paulino Calderón Pale
Website: http://calderonpale.com
Twitter: http://twitter.com/calderpwn
