
Nmap Development mailing list archives

Re: crash report
From: "ITS" <ITS () ConniesPizza com>
Date: Sun, 25 Aug 2013 01:24:30 -0500

The scan was retried with nothing else running and after a fresh boot. RAM use never went above 6G on a 32G system. 
Attached is the screenshot of the crash.

I can break the scan into 4 parts of a /14 each; however, when it comes time to run the full scan, 
breaking it into /14 units will take forever in terms of human monitoring and intervention.
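The splitting described above could also be scripted rather than started by hand each time. A minimal sketch of the idea, splitting one /12 into its four /14 chunks and printing one nmap command per chunk with -oX output per file (the 10.16.0.0/12 range and the scan_*.xml file names are hypothetical examples, not taken from this thread):

```shell
#!/bin/sh
# Sketch: a /12 contains four /14 blocks; the second octet steps by 4
# (2^(16-14) = 4) between consecutive /14s. Network below is an
# example only. Replace "echo" with the real command to execute.
base=10
start=16          # second octet of the example /12
for off in 0 4 8 12; do
    net="$base.$((start + off)).0.0/14"
    echo nmap -oX "scan_$((start + off)).xml" "$net"
done
```

Running the scans sequentially like this (without `echo`) would also avoid the manual step of starting the next chunk when one finishes.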

Let me know if I can enable any debugging or logging features to assist in resolving this. I am happy to help in any way I can.



  ----- Original Message ----- 
  From: ITS 
  To: Henri Doreau 
  Cc: Nmap dev 
  Sent: Monday, August 19, 2013 10:50 PM
  Subject: Re: crash report

  2013/8/16 ITS <ITS () conniespizza com>:
  > Version: 6.40
  > Traceback (most recent call last):
  >   File "zenmapGUI\ScanNmapOutputPage.pyo", line 239, in _selection_changed
  >   File "zenmapGUI\ScanNmapOutputPage.pyo", line 270, in _update
  >   File "zenmapGUI\NmapOutputViewer.pyo", line 303, in set_command_execution
  >   File "zenmapGUI\NmapOutputViewer.pyo", line 328, in refresh_output
  > MemoryError

  Looks like your machine ran out of memory. Were you conducting a large
  scan? If so you might want to split your scan into several smaller
  ones (see -oX to save the result to a file) and process the output
  afterwards. Alternatively, you can switch to a machine that has more
  memory.



  Thanks for the advice. I split the large scan into 3 smaller units and monitored RAM use. The 32G machine never used 
more than 6G for all processes combined at any time, but the smaller scan stopped at about the same point again, this time 
without an error report I could send you. MS Windows just reported that the nmap front end had a problem and had to close, 
and offered the option to report the issue to MS, which I did.

  Not sure what to do now. I can split into even smaller units, but given the point where the problem arose this time, it 
would appear my original scan would need to be divided into 100 or more smaller sections. That gets tedious and 
requires a lot of human monitoring to start a new scan whenever one finishes.

  Anything I can do to provide some debugging info for you to help solve this?



Sent through the dev mailing list
Archived at http://seclists.org/nmap-dev/

