mailing list archives
RE: Threat Modelling
From: "Mark Curphey" <mark () curphey com>
Date: Sat, 22 May 2004 16:55:33 -0400
To quote ...."The tools used for Risk Management in certification &
accreditation (NIACAP/DITSCAP) are very effective for threat modeling."
Maybe I am missing the point here, so please help me out.
How would these generic tools help me methodically expose the fact that an
application developer chose to send a password in the clear in an unprotected
SOAP message across an untrusted network?
How would these generic tools help me expose an application that used DNS to
authenticate a component's location?
How is a generic tool going to help me expose an application that is not
validating input from a 3rd party web services data feed?
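As an illustration of the first flaw (all names and values here are invented, not from the original mail), a sketch of the kind of design decision a threat model should surface and a generic RA wizard will not:

```python
# Hypothetical sketch: a password embedded as plain text inside a SOAP
# envelope destined for an unencrypted transport. Anyone on the network
# path can read it -- exactly the design-level flaw threat modeling targets.
def build_login_envelope(user: str, password: str) -> str:
    # The password travels in the clear in the message body.
    return (
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soap:Body><Login>"
        f"<User>{user}</User><Password>{password}</Password>"
        "</Login></soap:Body></soap:Envelope>"
    )

envelope = build_login_envelope("alice", "s3cret")
# The secret appears verbatim in the wire payload.
assert "s3cret" in envelope
```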
I think there may be confusion between what I think of as threat modeling and
risk assessment. Threat modeling to me is about helping design a better
technological solution. See Building Secure Software, Writing Secure Code or
Threats and Countermeasures for examples of my definition. Risk Assessment is
generally about a better management solution (where the definition of
management is the same as that from ISO 17799). CRAMM is of negligible use
in designing a secure application. Actually, any security tool
that stores security information in an Access database and is itself riddled
with security holes is of no use to me at all, but that's another story. It
may have a use in modeling the business operations of a web environment but
it will not help me design a secure system from a technological perspective
and that's what I use threat modeling for. Maybe that's one explanation: I
(or you) are confusing operational security with engineering a software solution.
And I am sorry, but modeling dollar amounts is .......well, even NIST 800-30
explicitly says don't bother; it's a pipe dream. Take an online brokerage.
At market open the HARTs (Hourly Average Revenue Trades) will be totally
different from those at 9pm. If I model the system from a monetary perspective,
should I change my security model from market open to the evening? Do I
insist on digital certs when my average customer balance hits 1 million
dollars per account....that sort of modeling would probably justify Bill
Gates using RACF to trade at discount broker X, but guess what....
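To make the point concrete, here is a toy annualized-loss-expectancy calculation (the classic ALE = single loss expectancy x annual rate of occurrence) with entirely invented numbers, showing how the dollar figure for the same system swings by orders of magnitude with time of day:

```python
# Toy ALE arithmetic with made-up figures, illustrating why a single
# monetary value for a trading system is misleading: the exposure at
# market open dwarfs the exposure in the evening.
def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    # Classic risk-assessment formula: ALE = SLE * ARO.
    return single_loss_expectancy * annual_rate_of_occurrence

trades_per_hour = {"market_open": 5000, "9pm": 40}  # hypothetical HARTs
revenue_per_trade = 12.50                           # hypothetical dollars

for period, trades in trades_per_hour.items():
    # SLE here: revenue lost if the system is down for one hour in this period.
    exposure = trades * revenue_per_trade
    print(period, ale(exposure, annual_rate_of_occurrence=2.0))
```

The same outage "costs" 125 times more at the open than at 9pm, so any model keyed to a single dollar figure is describing a system that never exists.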
When I first left college I used to have to do RAs using CRAMM (I am
originally from the UK, btw)... I would happily bet that I could write an
application that, in a controlled environment (i.e. it won't fail from not
having a policy, backups, etc.), would pass a CRAMM review with flying
colors and could be hacked totally in under a minute. The internet and
software technology are just too complex to try to model security
conceptually using simple wizards. I personally think that security threats
and countermeasures have moved at light speed compared to the technology
used to support RA. Just my personal opinion.
I think someone else had a good point in that RA tools are generally high
level. Building software is both a high-level process AND a low-level
procedure. The devil can be in the details, and RA tools generally can't find them.
From: brennan stewart [mailto:brennan () ideahamster org]
Sent: Saturday, May 22, 2004 1:31 PM
To: webappsec () securityfocus com
Subject: RE: Threat Modelling
The tools used for Risk Management in certification & accreditation
(NIACAP/DITSCAP) are very effective for threat modeling. Some of them are
high level, and others can be technical. The problem with them though, is
their extreme price tags, proprietary content, lack of component
re-usability, and perhaps some information wouldn't be at the technical
level security professionals would require. They also don't have the level
of integration that is really vital.
While I know the initial thread was discussing Threat Modeling, it appears
there is a huge gap in the comprehensive risk assessment/threat management
arena (even with commercial software).
It would appear that an open source solution would fit the bill for this. My
ideas would take it far past mere threat modeling though for a more
complete, quantitative picture of risk, mitigations, dollar amounts,
residual risk, etc.
Some sample requirements:
* Asset detailing, currency value assignment
* Complete threat listing, in DB
* Attacks/exposures/etc. matched to the OSVDB (maybe the legacy CVE/ICAT)
* Logic to understand system configurations (Linux/Unix/Windows/Cisco/etc.), preloaded with sample hardening and scoring mechanisms (NIST 800)
* Logic to understand policies + DB
* Logic to understand legal requirements + DB (swap requirements by ...)
* Then, some nice reporting functions to top it off
I know many of these data sources exist already individually.
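A minimal sketch, assuming an entirely hypothetical data model (none of these names come from any existing tool), of how those sample requirements might hang together:

```python
# Hypothetical data model for the proposed open-source risk tool:
# assets carry currency values, threats live in a DB-like structure
# with exposures matched to external IDs (e.g. OSVDB entries), and a
# simple quantitative picture of risk falls out.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    value_usd: float  # asset detailing, currency value assignment

@dataclass
class Threat:
    description: str
    osvdb_ids: list = field(default_factory=list)  # matched exposures

@dataclass
class RiskEntry:
    asset: Asset
    threat: Threat
    likelihood: float  # 0..1, toward a quantitative picture of risk

    def expected_loss(self) -> float:
        return self.asset.value_usd * self.likelihood

entry = RiskEntry(
    asset=Asset("customer DB", 250_000.0),
    threat=Threat("SQL injection", osvdb_ids=["OSVDB-12345"]),
    likelihood=0.1,
)
assert entry.expected_loss() == 25_000.0
```

Reporting, policy logic, and per-jurisdiction legal requirements would layer on top of the same records.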
On Fri, 2004-05-21 at 04:58, Brewis, Mark wrote:
From: Mark Curphey [mailto:mark () curphey com]
CRAMM is a general / generic Risk Assessment tool for information
For those who don't know, CRAMM is a high-level tool designed to model
risk at the physical, policy and procedural level, rather than the
technical. Early versions were difficult to use, and even harder to
interpret. The ISO 17799 aligned version is far more powerful, although it
needs someone skilled to drive it.
A more technical, network-level risk assessment/threat modelling tool back
in the late 1990's was the L3 Network Security Expert/Retriever, a (for the
time) sophisticated network mapping and risk analysis system. It was bought
by Symantec about 2000 and fairly promptly disappeared. If I remember
correctly, you were able to define any type of custom threats and
countermeasures, and model them with a reasonable level of granularity. I
only ever used it to model systems, rather than applications, but it was a
really interesting hybrid tool.
Both tools use/used some variation of the standard:
* Define Assets
* Define Vulnerabilities
* Define Threats
* Define Mitigation Strategies
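The four steps above could be sketched as a simple structure with a check that every threat maps to at least one mitigation (all entries invented for illustration):

```python
# Bare-bones sketch of the asset/vulnerability/threat/mitigation
# structure both tools share; the data is hypothetical.
model = {
    "assets": ["web server", "customer records"],
    "vulnerabilities": ["unpatched OS", "cleartext credentials"],
    "threats": ["remote compromise", "credential theft"],
    "mitigations": {
        "remote compromise": ["patching policy", "network segmentation"],
        "credential theft": ["TLS everywhere", "hashed credential storage"],
    },
}

# The point of the last step: no threat should be left without a strategy.
uncovered = [t for t in model["threats"] if not model["mitigations"].get(t)]
assert uncovered == []
```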
Neither of these addresses your requirements (particularly L3, since it
appears to have gone), although I think the L3 tool(s) came closest. There
isn't anything I know of that even comes close to doing some of this, never
mind everything. Most of the case and sequence diagrams I've seen have been
manually defined and Visio drawn (paradoxically, probably the main utility
that helped kill off L3 Expert/Retriever). Risk modelling has been
extrapolated from those, in a generally ad hoc fashion.
In many respects, I think you've answered your own question - there is a
gap in this area. If Symantec still have the L3 code base lying around (and
it didn't metamorphose into the Vulnerability Assessment product) it might
be worth dusting down.
UK Information Assurance Group
Tel: +44 (0)1908 28 4013
Mbl: +44 (0)7989 291 648
Fax: +44 (0)1908 28 4393
E@: mark.brewis () eds com