Full Disclosure mailing list archives

Re: Static Code Analysis - Nuts and Bolts
From: "Debasis Mohanty" <debasis.mohanty.listmails () gmail com>
Date: Wed, 27 Jun 2007 23:23:05 +0530

What program(s) do you use in static code analysis? It doesn't matter if
you are a hardcore grep+editor researcher or if you use complex
frameworks: Tell me (and also the rest of the list) about it.

Secure code review is one of the most misguided fields: many security
folks talk only about grepping for threat patterns. Of course I do not
rule that out as a starting point, but there is a lot more to it.

I have my own approach to secure code review; a simple, easy-to-follow
version of it is -

a) Build up a Taxonomy of security coding errors specific to various platforms

The taxonomy of coding errors defined by Gary McGraw (cigital.com) in
his famous book "Software Security: Building Security In" is a good
starting point to baseline against.

There are several such taxonomies of coding errors floating around,
but most of them seem to be flawed in one way or another. I found
McGraw's classification of errors (i.e. security flaws) useful, and it
can be made part of both manual and automated code review.

A nice write-up on various such taxonomies can be found here -
http://securesoftware.blogspot.com/2005/12/threat-vulnerabilities-classification.html

Moving further, you can refer to the CWE
(http://cwe.mitre.org/data/dictionary.html), which presently seems to
be a superset of all common software weaknesses.
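Such a taxonomy can live as plain data from day one, which makes it directly usable by the scripts in the later steps. A minimal Python sketch - the platform keys, error classes and CWE mappings below are illustrative examples, not McGraw's actual classification:

```python
# Illustrative skeleton only: the platform keys, error classes and CWE
# mappings are examples, not a complete taxonomy.
TAXONOMY = {
    "c/c++": {
        "unbounded buffer copy": ["CWE-120"],
        "null pointer dereference": ["CWE-476"],
    },
    "java/web": {
        "cross-site scripting": ["CWE-79"],
    },
}

def lookup(platform, error_class):
    """Return the CWE ids recorded for an error class on a platform."""
    return TAXONOMY.get(platform, {}).get(error_class, [])
```

Keeping it as data rather than prose means the same table can drive both the cheat sheet in step "b" and the grep automation in step "c".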


b) Create a set of secure coding anti-patterns specific to various platforms

Secure coding anti-patterns are commonly used poor solutions to
recurring security problems. A cheat sheet of them comes in handy for
getting more accurate results when you run it against the code.

For example:
    *  Use of an unbounded copy
        char buf[1024];
        strcpy(buf, s);        /* no bound at all: overflows buf when strlen(s) > 1023 */

    * Use of a bounded copy with an incorrect size calculation
        char buf[1024];
        strncpy(buf, s, 1025); /* bound exceeds sizeof(buf); may also leave buf unterminated */

Both of the above code snippets are vulnerable. One more example: a
check for a NULL value will not always take care of a NP (null
pointer) dereference bug; i.e. there are many weak ways of guarding
against NP errors that do not amount to *hard to break* protection.
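As a small illustration in Python (hypothetical names; the same pattern shows up with pointers in C and references in Java): a guard that checks one reference but then dereferences another is exactly such a weak check.

```python
class Profile:
    def __init__(self, name):
        self.name = name

class User:
    def __init__(self, profile=None):
        self.profile = profile

def display_name_weak(user):
    # Weak guard: checks 'user' but then dereferences 'user.profile',
    # which can itself be None -- the check does not cover what is used.
    if user is not None:
        return user.profile.name  # AttributeError when profile is None
    return "anonymous"

def display_name_robust(user):
    # Robust guard: every reference that is dereferenced is checked.
    if user is not None and user.profile is not None:
        return user.profile.name
    return "anonymous"
```

A reviewer grepping only for "is the variable checked for None/NULL?" would pass the weak version; tracing what is actually dereferenced catches it.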

Check out:
https://www.blackhat.com/presentations/win-usa-03/bh-win-03-schoenfeld.pdf
http://developers.sun.com/learning/javaoneonline/j1sessn.jsp?sessn=TS-2594&yr=2007&track=5

Google for "secure coding anti-patterns" to find more references.



c) Grep for anti-patterns or secure coding mistakes

Use your favorite editor here and grep for all the security
anti-patterns. I am a great fan of SciTE, which supports almost all
languages; get it here: http://scintilla.sourceforge.net/

This is an important phase during a code review, but definitely not
the ultimate phase for finding security holes. The important phase is
what comes next, i.e. manual data flow (DF) and control flow (CF)
analysis.
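This phase is also easy to script. A minimal Python sketch, with an illustrative (far from exhaustive) pattern list:

```python
import os
import re

# Illustrative anti-pattern cheat sheet; a real one is far larger and
# organised per platform (see step "b").
ANTI_PATTERNS = {
    "unbounded copy": re.compile(r"\bstrcpy\s*\("),
    "format string": re.compile(r"\bprintf\s*\(\s*[a-zA-Z_]"),
    "command execution": re.compile(r"\bsystem\s*\("),
}

def scan_tree(root, extensions=(".c", ".h")):
    """Walk a source tree and yield (path, line_no, issue, line) for
    every line that matches an anti-pattern."""
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            if not name.endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="replace") as handle:
                for line_no, line in enumerate(handle, 1):
                    for issue, pattern in ANTI_PATTERNS.items():
                        if pattern.search(line):
                            yield path, line_no, issue, line.strip()
```

The output is only a list of candidates to triage, which is precisely why the next phase matters.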


d) Manual Data Flow (DF) and Control Flow (CF) analysis

DF analysis - http://en.wikipedia.org/wiki/Data_flow_analysis

CF analysis - http://en.wikipedia.org/wiki/Control_flow_graph

Performing both DF and CF analysis manually takes a lot of time, but
it is definitely the most important part of a code review: it helps
identify real threats accurately from a security standpoint. This
phase requires a master code-security ninja's hand to ensure that
actual issues are captured.
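To give the flavour of the DF idea, here is a deliberately naive Python sketch that propagates a "tainted" tag through simple assignments until it reaches a sink. Real DF analysis works on a parsed representation rather than raw text, and the source/sink names here are hypothetical:

```python
import re

# Hypothetical source and sink names for this sketch.
SOURCES = ("request.getParameter",)
SINKS = ("response.write",)

def taint_trace(lines):
    """Naively propagate a taint tag through `var = expr` lines and
    return the line numbers where tainted data reaches a sink."""
    tainted = set()
    findings = []
    for no, line in enumerate(lines, 1):
        m = re.match(r"\s*(\w+)\s*=\s*(.+)", line)
        if m:
            var, expr = m.groups()
            # An assignment taints its target if the right-hand side
            # mentions a source or an already-tainted variable.
            if any(s in expr for s in SOURCES) or \
               any(re.search(r"\b%s\b" % t, expr) for t in tainted):
                tainted.add(var)
            else:
                tainted.discard(var)
        if any(s in line for s in SINKS) and \
           any(re.search(r"\b%s\b" % t, line) for t in tainted):
            findings.append(no)
    return findings
```

Even this toy version shows why the phase is valuable: a plain grep flags every sink, while taint tracking flags only the sinks actually reachable from attacker-controlled data.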

For example: not every occurrence of request.getParameter("") in Java
can be flagged as a potential XSS vector. In other words, it is
necessary to check:

- whether the data can be directly or indirectly tampered with by a
malicious user at any given point of time to cause a successful XSS
attack;

- whether the data is sanitised against malicious input before it is
written directly to the user's page;

- whether the data is encoded before being written back to the user's
page.

Similarly, there are many such factors based on which a code reviewer
can decide whether a finding is actually a threat or not.
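The last two checks can be illustrated with a short Python sketch (the function names are hypothetical; a Java servlet would use the equivalent output-encoding APIs):

```python
import html

def render_comment_unsafe(comment):
    # Sink writes untrusted data straight into the page: tracing the
    # data flow from the request parameter to here would flag this.
    return "<p>" + comment + "</p>"

def render_comment_safe(comment):
    # Output-encoding the data before it reaches the page neutralises
    # any injected markup.
    return "<p>" + html.escape(comment) + "</p>"
```

If every path from source to sink passes through an encoder like the second function, the getParameter hit is most likely a false positive; otherwise it is a real finding.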


A few code review optimization tricks -

- Know a language like Python or Ruby so that you can write scripts to
automate steps "c" and "d".

- Every time you find a unique anti-pattern, update your anti-pattern
cheat sheet.

- For ease of review, decompose a big application into modules.

- Prioritize the security anti-patterns based on your own skills, so
that you identify them faster and more accurately. For example, I am
faster at finding XSS anti-patterns than NP exceptions, so I push NP
anti-patterns towards the end.

- If at any point of time a particular issue is taking too long to
investigate, tag it for future review and move on to the next one.


That's all I have for now... hope it helps.

-d








On 6/22/07, Paul Sebastian Ziegler <psz () observed de> wrote:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Hi list,

due to personal interest I'd like to ask for your opinion regarding best
practices for static code analysis.
I guess most of us are accustomed to this method. After all, if you
want to find a vulnerability, that basically means that either luck,
fuzzing or static analysis will have something to do in the process.

Now static analysis of many languages can be quite fun. Take PHP and
Python for example. You can mostly read the code like a book and mark
down interesting passages to further analyze later on. Grep and a good
editor are about all we need.

However other languages often tend to become really nasty. Let's say we
want to analyze a 2MB C-source split up into several thousand files.
"cat * | grep strcpy" will most probably return about a hundred results.
I just did a lot of static analysis lately and sometimes it took me more
than half an hour to trace back _one_ of the strcpy() calls and check if
the copied bits could be controlled in some way.

Of course not every dangerous call takes this long to check (also I
might be a little slow), however I think that you all know what I'm
talking about here.

So after not having slept for about a week I started to search for tools
to ease working on my projects. (Yes, I did drop my plans of auditing
2MB C-sources using only vim and grep...)
Now this is where I'd like to open up an exchange on best practices and
tool-combinations.
What program(s) do you use in static code analysis? It doesn't matter if
you are a hardcore grep+editor researcher or if you use complex
frameworks: Tell me (and also the rest of the list) about it.

I took a quick look at flawfinder and RATS. However, they do nothing that
grep couldn't accomplish as well. For browsing the code and finding
references to functions or declarations of variables I am currently
using Red Hat's Source-Navigator.
It is by no means perfect and has been unmaintained for a while -
however it is still a great help.
That is just my two cents.

Any remarks/hints/ideas/concepts/nuts would be greatly appreciated by me
as well as a lot of other people interested in the matter. (At least I
hope so.)

So please share your knowledge.

Many Greetings
Paul
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.7 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFGe/BlaHrXRd80sY8RCgUSAJ9Y9+LCr4hZ1vs6gOrZHa6O9Wv91wCgypM9
1fxdotQfIdgcpXJg9RAP0xs=
=ni/j
-----END PGP SIGNATURE-----

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/



