Intrusion Detection Systems mailing list archives
The CVE (WAS: RE: RE: Ramping up for another review)
From: gshipley () neohapsis com (Greg Shipley)
Date: Thu, 13 Jul 2000 12:24:18 -1000
Archive: http://msgs.securepoint.com/ids
FAQ: http://www.ticm.com/kb/faq/idsfaq.html
IDS: http://www-rnks.informatik.tu-cottbus.de/~sobirey/ids.html
UNSUBSCRIBE: email "unsubscribe ids" to majordomo () uow edu au

At 06:37 PM 7/11/00 -0400, Dug Song wrote:
> > I would expect to see CVE or something equivalent over time to be
> > extended to the IDS area as well.
>
> it has, and i still believe it is a misguided effort, as it seeks to
> provide a nomenclature in the absence of any taxonomy - CVE participants
> vote upon what attack names should be in the database, but are left to
> their own devices in applying them.
Yeah, but the CVE is a start - and it is also my understanding that they are agreeing that the apple is indeed the apple, and not the orange. As silly as this sounds, that's a lot of progress. Think about where the IDS and vulnerability identification scene was just one year ago. I think the CVE initiative is great - but I never expected it to provide anything more than a dictionary.

Now, it remains to be seen how long it takes before vendors do anything USEFUL with the CVE. While I've been following the IETF working group a little bit (casually browsing the postings, not much more), just having a standardized and agreed-upon method of saying "Hey, that's the BIND NXT attack!" is a good foundation. However, I think we are probably a few years out from seeing vendor X's vulnerability scanner interoperate with vendor Y's IDS (and Marty, yes, I know Hiverworld is trying to do this WITH ITS OWN PRODUCTS *grin*) and be able to pass information back and forth. But alas, I digress.... I guess my long-winded point is that I see the CVE as being a really good (and necessary) first step.

And while I'm on a tangent, I've always found the stuff Max Vision was (is?) working on of interest concerning the public posting of basic signatures. You want to talk about hard-core 3rd-party evaluation? Look at evaluating the accuracy of vendor signatures.

RFP pointed something out to me last year that I found kind of amusing: someone released a CGI scanner that had a typo in its check list. This one vulnerable CGI (I can't remember which one it was, I'll have to go dig) was botched in that one scanner, and the botched check has since shown up in several IDS products as well as vulnerability scanning products. (RFP being the king of exploiting CGI code - he catches these things.) Anyway, this draws attention to two interesting points:

1. That (obviously) some vendors are horking things haphazardly
2. That the signature itself won't catch what it was supposed to catch

Now, before someone points out "Hey, maybe they put the botched sig in to catch the botched scanner" - well, hey, then there should have been two sigs: one for the botched CGI scanner, and one for the real vulnerable CGI. But there wasn't - just the botched CGI. Hmmmmm.....

"Shipley, ok, shut up already, what's your point?" My point is that even the vendors mess up their own sigs. But who the hell is going to test that? Certainly not I - I've got my hands full just trying to get through basic testing. Not to mention that many of these signatures are vulnerable to exploit mutation, but again, I digress... (This is one of those few times I'm sympathetic with the vendors - what a nightmare.)
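To make the botched-signature and exploit-mutation points concrete, here is a minimal sketch in Python. Everything in it is invented for illustration - the CGI name, the typo, and the substring-matching "engine" are not taken from the scanner or from any IDS product mentioned above:

# Hypothetical sketch: a naive substring signature for a made-up vulnerable
# CGI, a copy of it with a typo (the kind of botch described above), and a
# trivially URL-encoded request that slips past plain substring matching.
from urllib.parse import unquote

SIG_CORRECT = "/cgi-bin/vulnerable.cgi"   # made-up CGI name for illustration
SIG_BOTCHED = "/cgi-bin/vulnerible.cgi"   # same check, copied with a typo

def naive_match(signature: str, request_line: str) -> bool:
    """Plain substring match, as a simplistic misuse-detection engine might do."""
    return signature in request_line

real_attack    = "GET /cgi-bin/vulnerable.cgi?cmd=id HTTP/1.0"
encoded_attack = "GET /cgi-bin/vulnerab%6ce.cgi?cmd=id HTTP/1.0"  # %6c is 'l'

print(naive_match(SIG_BOTCHED, real_attack))              # False: typo'd sig misses the real attack
print(naive_match(SIG_CORRECT, real_attack))              # True
print(naive_match(SIG_CORRECT, encoded_attack))           # False: trivial mutation evades the match
print(naive_match(SIG_CORRECT, unquote(encoded_attack)))  # True once the request is normalized first

The typo'd pattern never fires on the real request, and even the correct pattern misses a trivially encoded variant unless the engine normalizes the request first - which is exactly the mutation problem mentioned above.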
> > The IDS industry needs a standard for benchmarking the performance
> > of IDS.
>
> we've been over this before - see Roy Maxion's excellent RAID
> presentations on IDS measurement and testing requirements for some idea
> of what's actually needed, as opposed to the usual benchmarketing...
Is this on-line anywhere? I'd love to read it....
> > Also, coming up with a common group of signatures that are turned on
> > for performance testing for all IDS sensors can be tricky...
>
> ...or invalid, in the case of IDSs which do more than misuse detection.
Another place where the CVE can help - if all IDS vendors become CVE compliant, you can make sure to turn on sigs X, Y, and Z and know that those are the same across all products (or at least, that they are looking for the same attack) while you test. The CVE could also be another point of easy comparison: see how many entries vendor X has covered compared to vendor Y. Not something you want to base your complete analysis around, but it could serve as another point of differentiation.

Just some thoughts....

-Greg

P.S. Anyone who is interested in the CVE or wants to know what I'm babbling about, check out: http://cve.mitre.org
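For anyone who wants to see what the "turn on the same sigs and compare coverage" idea above could look like in practice, here is a rough Python sketch. The vendor signature names and the sig-to-CVE assignments are made up for illustration and have not been checked against the actual CVE list - a real mapping would come from each vendor's own CVE-compliance data:

# Hypothetical mapping of each vendor's signature names to CVE entries.
# Signature names and CVE assignments below are illustrative placeholders.
vendor_x_sigs = {
    "BIND-NXT-OVERFLOW": "CVE-1999-0833",
    "PHF-REMOTE-EXEC":   "CVE-1999-0067",
    "TEST-CGI-PROBE":    "CVE-1999-0070",
}
vendor_y_sigs = {
    "sig_1042": "CVE-1999-0833",
    "sig_2077": "CVE-1999-0067",
    "sig_3001": "CVE-1999-0107",
}

x_coverage = set(vendor_x_sigs.values())
y_coverage = set(vendor_y_sigs.values())

# Signatures to enable on both sensors for an apples-to-apples test run:
print("Enable on both:", sorted(x_coverage & y_coverage))

# A crude coverage comparison - one data point, not a complete analysis:
print("Only vendor X:", sorted(x_coverage - y_coverage))
print("Only vendor Y:", sorted(y_coverage - x_coverage))

The point of the sketch is just that a shared dictionary makes this kind of comparison mechanical; without the CVE, lining up "BIND-NXT-OVERFLOW" against "sig_1042" is guesswork.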
