mailing list archives
Re: The Static Analysis Market and You
From: "Dave Hull" <dphull () trustedsignal com>
Date: Tue, 14 Oct 2008 21:25:32 -0500
On Tue, Oct 14, 2008 at 9:53 AM, Dave Aitel <dave () immunityinc com> wrote:
-----BEGIN PGP SIGNED MESSAGE-----
One of the major problems with the technology is that you have to be a super
genius code auditor to decide if the vulnerabilities are real or not.
For the most part, I agree with what you're saying, but I disagree
with this statement. I work with Fortify. Maybe it's the code we're
producing, but thus far most of the false positives have been easy to
spot. What bothers me more is when developers follow Fortify's
recommendations to correct issues, re-scan the code, and Fortify
continues to flag it as vulnerable. We end up suppressing the issue,
and in the developers' minds the tool's credibility is called into
question.
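One way this can happen (an illustrative sketch, not our actual code) is when the fix routes the tainted value through a custom validator the tool doesn't recognize as a cleanser, so the taint still appears to flow to the sink after remediation:

```python
def sanitize_hostname(value):
    """Whitelist validator: allow only letters, digits, dots, hyphens."""
    if value and all(c.isalnum() or c in ".-" for c in value):
        return value
    raise ValueError("invalid hostname: %r" % value)

def build_ping_command(user_input):
    # After remediation the input is validated, but a taint-tracking
    # tool that doesn't model sanitize_hostname() as a sanitizer will
    # still report user_input flowing into the command.
    host = sanitize_hostname(user_input)
    return ["ping", "-c", "1", host]
```

Teaching the tool about custom cleansing functions (most SCA products support some form of rule customization) avoids suppressing the finding outright.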
I believe SCA tools should be considered another layer in the
defense-in-depth strategy. Train developers. Build security into the
development process, beginning with requirements gathering. Have
security participate in the architecture and planning of the project.
Do code reviews for security issues, both manually and with automated
tools. Pen test the final product with automated tools and manually.
I'm currently reading Brian Chess and Jacob West's _Secure
Programming with Static Analysis_. The authors are from Fortify, and
they lay out some ground rules early in the book. They refer to
something they call the "exploitability trap," wherein the SCA tool
flags a given piece of code as vulnerable and the developer says, "I
won't fix that unless you can prove it's exploitable." The authors
recommend avoiding this trap by categorizing vulnerable code as:
1. Obviously exploitable
2. Obviously secure
If something isn't obviously secure, it should be refactored until it
is. As a developer, that's annoying. As a security person, it seems
reasonable. It's more fun to refactor code than to send breach
notifications to millions of customers.
What keeps me up at night is not the false positives, but the false
negatives. I can't find it at the moment, but one review of the big
three products in this space said that they caught around 40% of the
vulnerabilities. Another concern is what Gary McGraw has referred to
as the Bug Parade. SCA tools may be useful for finding bugs, but they
don't help us find logic flaws.
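A hypothetical illustration of that distinction: taint-style analysis can flag data flowing from an untrusted source to a dangerous sink, but nothing in the data flow below reveals that the function never checks the account belongs to the requesting user:

```python
def withdraw(accounts, session_user, account_id, amount):
    acct = accounts[account_id]
    # Logic flaw: nothing verifies acct["owner"] == session_user.
    # There is no tainted sink here, so a taint-based SCA tool
    # has nothing to flag; only a human who knows the intended
    # authorization rule can spot the problem.
    if acct["balance"] >= amount:
        acct["balance"] -= amount
        return True
    return False
```

The missing check is a property of the application's design, not of any one statement, which is why it stays invisible to the tools.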
The NIST Survey
Thanks for the synopsis. The web site says the results won't be
published until Dec. I can't wait to see them.
Those are not good signs for the technology field as a whole. One
possibility is that more research dollars will flood into the space
and the technology will get better and live up to its marketing.
Does anything ever live up to its marketing? I'd settle for continuous
product improvement and honest claims.
Public key: http://trustedsignal.com/pubkey.txt
Fingerprint: 4B2B F3AD A9C2 B4E1 CBDF B86F D360 D00F C18D C71B
Dailydave mailing list
Dailydave () lists immunitysec com