Educause Security Discussion
mailing list archives
Re: Measuring security
From: "Basgen, Brian" <bbasgen () PIMA EDU>
Date: Wed, 5 Nov 2008 15:56:27 -0700
Gary has a great answer for data security, and I think it is contingent on 'meaningful' numbers. Trying to encapsulate
the raw dump of a Nessus scan isn't going to be a successful endeavor. Vulnerability analysis, for example, should
occur after scans have been vetted and some remediation efforts have occurred. Thus, among your data points are X
vulnerabilities remediated out of Y detected at time Z.
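That "X remediated of Y detected per period" idea can be tracked with almost no machinery. A minimal sketch, assuming vetted per-period counts — the class, field names, and figures below are illustrative, not output from Nessus or any real scanner:

```python
# Hypothetical sketch: trend a remediation rate over reporting periods.
# All names and numbers are made up for illustration.
from dataclasses import dataclass

@dataclass
class PeriodStats:
    period: str      # reporting period label, e.g. "2008-Q3"
    detected: int    # vetted vulnerabilities detected (Y)
    remediated: int  # vulnerabilities remediated (X)

    @property
    def remediation_rate(self) -> float:
        # Fraction of detected vulnerabilities fixed in the period.
        return self.remediated / self.detected if self.detected else 1.0

def trend(stats):
    """Return (period, remediation_rate) pairs for a report."""
    return [(s.period, round(s.remediation_rate, 2)) for s in stats]

quarters = [
    PeriodStats("2008-Q2", detected=120, remediated=84),
    PeriodStats("2008-Q3", detected=95, remediated=80),
]
print(trend(quarters))  # [('2008-Q2', 0.7), ('2008-Q3', 0.84)]
```

The point of the vetting step is that `detected` counts verified findings, not the raw scanner dump, so the ratio stays meaningful period over period.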
FWIW, on the information security side of the house, the "metrics" discussion falls into the realm of risk, which can
be quantified, measured, trended over time, etc.
Pima Community College
From: The EDUCAUSE Security Constituent Group Listserv [mailto:SECURITY () LISTSERV EDUCAUSE EDU] On Behalf Of Gary
Sent: Wednesday, November 05, 2008 3:35 PM
To: SECURITY () LISTSERV EDUCAUSE EDU
Subject: Re: [SECURITY] Measuring security
I recommend measuring vulnerabilities on campus systems, as measured by a network vulnerability analyzer (or a skilled
assessment team for non-network pen-testing), along with measuring campus audience awareness. Why?
1) They are direct predictors of a security "event" happening
2) We can directly influence them (subject to money, support, time, etc.)
3) The tools which measure them are relatively stable, yet evolve as threats change, and are generally used in more
places than just here, so we're not an island of data
4) An increase in the vuln count, when it happens, is due either to increased threats, or decreased vigilance.
Both of these, and what can (or cannot) be done about them are readily understood by a non-technical audience
5) A decrease in the count of vulns, and/or increase in awareness, means risk is going downward
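Points 4 and 5 above amount to a simple direction check on the two trend lines. A hedged sketch, with made-up numbers, assuming vuln counts and awareness scores are collected once per reporting period:

```python
# Hypothetical sketch of points 4-5: falling vuln counts and/or rising
# awareness suggest risk is trending downward. Data are illustrative.
def risk_direction(vuln_counts, awareness_scores):
    """Compare the latest two periods; return 'down', 'up', or 'mixed'."""
    vulns_down = vuln_counts[-1] < vuln_counts[-2]
    awareness_up = awareness_scores[-1] > awareness_scores[-2]
    if vulns_down and awareness_up:
        return "down"   # both indicators improving
    if not vulns_down and not awareness_up:
        return "up"     # both worsening: more threats or less vigilance
    return "mixed"      # indicators disagree; needs a closer look

print(risk_direction([210, 160], [0.55, 0.70]))  # down
```

The "mixed" case is the one worth a sentence in an executive summary, since (per point 4) a rising vuln count can mean either increased threats or decreased vigilance.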
Counting attacks, conversely, can be very misleading or easily misunderstood. It indicates any one of several things,
many of which are outside your control (e.g. threats evolving, smarter attackers, etc).
Counting the presence of fortifications (e.g., systems patched, antivirus deployed), likewise, tells you what you've
built, but is only an indirect indicator of the likelihood of an incursion. (e.g. "look how tall our wall is!" sounds
nice - except if the back gate is hanging open)
I'm not saying these latter counts aren't also useful, just that as executive metrics (where brevity is very much the
soul of wit) they are too hard to explain meaningfully, and their trend can be confusing. But as indicators read by a
security professional or a CIO they are very meaningful in their own right.
From: The EDUCAUSE Security Constituent Group Listserv [mailto:SECURITY () LISTSERV EDUCAUSE EDU] On Behalf Of Heather
Sent: Wednesday, November 05, 2008 5:06 PM
To: SECURITY () LISTSERV EDUCAUSE EDU
Subject: [SECURITY] Measuring security
Hi all -
I've been asked to create some measurable target goals for data security. This is proving to be a tricky set of
metrics to define! What I've realized so far is:
1 - trying to go by how many holes or warnings are found by Nessus won't work; way too many false positives
2 - trying to go by what a third-party penetration test might find won't work; what they are measuring varies too much
and there have so far been way too many false positives or things we considered completely acceptable (yes, a domain
controller is going to act as a time server to anyone who checks)
3 - trying to go by "well, doesn't look like we've been hacked recently"... not quite the business metric I'm looking for.
Is anyone out there finding any particular set of metrics working for you and your campus leadership?
Director, System Administration
heatherf () stanford edu