
Nmap Development mailing list archives

RE: NSE: http-phpself-xss - Finds PHP files with reflected cross site scripting vulns due to unsafe use of the variable $_SERVER[PHP_SELF]
From: King Thorin <kingthorin () hotmail com>
Date: Fri, 1 Jun 2012 15:11:28 -0400

To: nmap-dev () insecure org
From: Paulino Calderon <paulino () calderonpale com>

Date: Thu, 31 May 2012 01:19:56 -0500

Hi list,

Here is a script that detects reflected XSS in PHP files that don't
sanitize the variable $_SERVER["PHP_SELF"]. It crawls the web server to
build a list of PHP files and then sends an attack vector/probe to each
of them to identify PHP_SELF cross site scripting vulnerabilities.
PHP_SELF XSS refers to reflected cross site scripting vulnerabilities
caused by the lack of sanitization of the variable
<code>$_SERVER["PHP_SELF"]</code> in PHP scripts. This variable is
commonly used in PHP scripts that render forms and whenever the current URI is needed.

A typical example of a Cross Site Scripting vulnerability in this variable
is a form that echoes it unescaped:
<code><form method="post" action="<?php echo $_SERVER["PHP_SELF"]; ?>"></code>
The attack vector/probe used is: <code>/'"/><script>alert(1)</script></code>
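For illustration, the core check can be sketched outside NSE. The following is a minimal Python sketch (the function names, the URL-encoding of the probe, and the reflection test are my own assumptions, not the script's actual implementation): it appends the attack vector to a crawled PHP file's path and flags the page if the payload comes back unescaped.

```python
# Hypothetical sketch of the PHP_SELF reflection check -- not the
# script's actual code. The probe below is the URL-encoded form of
# /'"/><script>alert(1)</script>
PROBE = "/%27%22/%3E%3Cscript%3Ealert(1)%3C/script%3E"
PAYLOAD = "'\"/><script>alert(1)</script>"


def build_probe_url(base_url: str, php_path: str) -> str:
    """Append the attack vector to the path of a crawled PHP file."""
    return base_url.rstrip("/") + "/" + php_path.lstrip("/") + PROBE


def is_reflected(body: str) -> bool:
    """Flag the page when the payload is echoed back unescaped,
    i.e. PHP_SELF was printed without htmlspecialchars()."""
    return PAYLOAD in body
```

A page that HTML-encodes PHP_SELF (e.g. returns <code>&lt;</code> as <code>&amp;lt;</code>) would not match and would not be flagged.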
-- @usage
-- nmap --script=http-phpself-xss -p80 <target>
-- nmap -sV --script http-phpself-xss <target>
-- @output
-- 80/tcp open  http    syn-ack
-- | http-phpself-xss:
-- |   Unsafe use of $_SERVER["PHP_SELF"] in PHP files
-- |     State: VULNERABLE (Exploitable)
-- |     Description:
-- |       PHP files do not safely handle the variable
-- |       $_SERVER["PHP_SELF"], causing reflected Cross Site Scripting
-- |       vulnerabilities.
-- |
-- |     Extra information:
-- |
-- |   Vulnerable files with proof of concept:
-- |     
-- |     
-- |     
-- |     
-- |   Spidering limited to: maxdepth=3; maxpagecount=20; 
-- |     References:
-- |       https://www.owasp.org/index.php/Cross-site_Scripting_(XSS)
-- |_      http://php.net/manual/en/reserved.variables.server.php
-- @args http-phpself-xss.uri URI. Default: /
-- @args http-phpself-xss.timeout Spidering timeout. Default: 10000

Paulino Calderón Pale
Website: http://calderonpale.com
Twitter: http://twitter.com/calderpwn



Would there be a way, or would it make sense, to implement a method by which HTTP scripts can hook into a single crawler 
and test things page by page, in order to avoid crawling/spidering the same content over and over again for all (or each 
selected) HTTP script?
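One way to realize that idea can be sketched as follows. This is a hedged illustration only (the class, callback, and crawl interfaces are invented for this sketch and are not nmap's actual httpspider API): each host is spidered once, the pages are cached, and every HTTP script registers a per-page callback instead of running its own crawl.

```python
# Illustrative sketch of a shared crawler: one crawl per host, many
# per-page checks. All names here are hypothetical, not NSE's API.
from typing import Callable, Dict, Iterable, List, Tuple

Page = Tuple[str, str]            # (url, body)
Check = Callable[[str, str], None]


class SharedCrawler:
    def __init__(self, fetch_site: Callable[[str], Iterable[Page]]):
        self._fetch_site = fetch_site          # spiders a host, yields pages
        self._checks: List[Check] = []
        self._cache: Dict[str, List[Page]] = {}

    def register(self, check: Check) -> None:
        """Each HTTP script hooks in one callback instead of re-spidering."""
        self._checks.append(check)

    def run(self, host: str) -> None:
        # Spider the host only once; later runs reuse the cached pages.
        if host not in self._cache:
            self._cache[host] = list(self._fetch_site(host))
        for url, body in self._cache[host]:
            for check in self._checks:
                check(url, body)
```

With this shape, adding another HTTP script costs one extra callback per page rather than one extra full crawl per host.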
Sent through the nmap-dev mailing list
Archived at http://seclists.org/nmap-dev/

