Nmap Development mailing list archives

Re: Fwd: hadoop and hbase information gathering
From: David Fifield <david () bamsoftware com>
Date: Tue, 8 Nov 2011 08:03:44 -0800

On Mon, Oct 31, 2011 at 08:52:00PM -0700, David Fifield wrote:
On Sun, Oct 30, 2011 at 10:46:33AM +0100, John Bond wrote:
On 14 October 2011 00:14, John Bond <john.r.bond () gmail com> wrote:
Can you send me the output of the script with nmap -ddd? (This will
produce a lot of output, some of which you may want to scrub.)

@david it is not a simple task to set up Hadoop; I'm not sure you can
even run everything on the same box. I'll try to build a VM or write some
guidelines this weekend. In the meantime the Cloudera docs are good.

After some feedback from Patrick I have updated the portrule to
trigger on shortport.http.

Okay. I can see the reason for this. All these different scripts run
against different ports, but they are all HTTP. Patrick found that his
university's Hadoop ran on different ports than the default.

Using shortport.http should take these scripts out of default, I think,
because they will only get a response from a minority of web servers. I
might even modify the rule to be "got a service match for HTTP, but it
is *not* running on a common HTTP port." Then it could be default again.
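For context, the rule described here could be sketched in NSE Lua roughly as follows. shortport.http is the stock library function; the list of "common HTTP ports" below is illustrative, not the committed rule:

```lua
local shortport = require "shortport"

-- Sketch only: fire when service detection says the port is HTTP,
-- but it is *not* one of the well-known HTTP ports. The port list
-- here is an illustrative assumption, not Nmap's actual default set.
local COMMON_HTTP_PORTS = { 80, 443, 631, 8080, 8443 }

portrule = function(host, port)
  if not shortport.http(host, port) then
    return false
  end
  for _, p in ipairs(COMMON_HTTP_PORTS) do
    if port.number == p then
      -- Common HTTP port: leave it to the generic HTTP scripts.
      return false
    end
  end
  -- HTTP on an uncommon port: worth probing for Hadoop/HBase.
  return true
end
```

Since such a rule only fires on atypical HTTP ports, it would skip ordinary web servers, which is the argument for letting the scripts stay in the default category.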

I have committed all the scripts. What I have done is restore the
original targeted portrules and leave the scripts in the "default"
category. Unfortunately this means that they won't work for environments
like Patrick's where the ports aren't the default. I'm open to ideas to
fix this.
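For reference, a "targeted portrule" of the kind being restored is typically a one-liner bound to a service's stock port. The port number below (50070, the stock Hadoop NameNode web UI port) is used only as an illustration; the committed scripts each use their own service's default:

```lua
local shortport = require "shortport"

-- Sketch: a targeted portrule bound to one default port.
-- 50070 is the stock Hadoop NameNode web UI port (illustrative choice).
-- A deployment on non-default ports, like Patrick's, won't match this.
portrule = shortport.port_or_service(50070, "http")
```

The trade-off described above is visible here: the rule is precise on default installs but blind to anything relocated to another port.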

I'm still interested in finding out what plain -sV reports for these
Hadoop HTTP servers.

David Fifield
Sent through the nmap-dev mailing list
Archived at http://seclists.org/nmap-dev/

