Dailydave mailing list archives

Re: Fwd: [ISN] Security experts hit out at "unethical" bugfinder


From: Jan Muenther <jan.muenther () nruns com>
Date: Mon, 14 Mar 2005 18:42:05 +0100

G'day,

There needs to be some way to create economic incentive for software vendors to fix bugs before the product is delivered and installed, and there are really just two choices:
   1. Hold software vendors liable for damages incurred from intrusions
   2. Create a market for vulnerabilities

In the first case, vendors will have to purchase insurance for their software, which will be a lot more expensive for widely used software than for niche products. The insurers will get richer. The closed source vendors would also have a good argument for why customers should choose closed over open source: for OSS, there would be nobody to sue.

Well, strictly speaking, that's not necessarily the case, since liability always attaches to the originator / author of any given good - and of course, open source software has authors, too. Currently, the vendors get away with their "Get out of jail free" cards, yet if the courts should decide to change that at some point, I'd think for OSS projects and possibly some free giveaways it might still work out this way. By the way: software isn't automatically completely liability-free, at least not in Germany (the only legal system I'm rather familiar with). These "signoffs" do not hold up when you sell custom-made software - there, the usual laws kick in. I've seen liability lawsuits ruin small software companies, although the bugs in question were not security-related (not that it would matter, really).

The second case is what the VSC does (to an extent, and perhaps not in the most optimal way): MS would have an incentive to spend the membership fee to learn about bugs early, and the money MS pays could be used to make it more attractive to hand the exploits over to the vendor. If the vendor decides he doesn't care (which would be a viable position to take if he is very certain of his code quality), then he doesn't need to sign up.

Problem is... when you give the vendor a heads-up about vulnerabilities in their products and then tell them to join your VSC to get the details, they might feel they're being blackmailed and just play stubborn. Those who believe in full disclosure (I don't) would argue that the restricted circle of people who get access to the exploit code and vulnerability details keeps the pressure on the vendors to fix their issues low. The question that keeps popping up in my head when these things are discussed is: cui bono - who benefits from which way of handling bugs? Interestingly, a good share of the recent Win32 worms were based on vulnerabilities that eEye came up with. As for the allegation that "criminals" and possibly terrorists buy their way into the VSCs... they're already buying people, hiring them for single hit jobs, and for them it's not the 0day that counts, but rather the result. Believe it or not, in the plethora of pen tests I've conducted, I more often than not succeeded in penetrating without even whipping out a single piece of exploit code. 0day-hunting only starts when it's needed, and memory corruption vulnerabilities are only one part of the game anyway.

Personally, I'd say software vendors are liable for their products, period. However, I see that liability restricted to, uhm, the limits of due diligence. When some wunderkind comes up with a new bug class, well, how could you have prevented that? But well-known dangerous constructions or an obvious lack of security precautions in the design must hurt, financially, too.

Another thing that just boggles my mind is how few (legally binding) security requirements exist for software used in integral parts of civil infrastructure. The only example that springs to mind is the health industry, and there, the use of "validated systems" makes them a heck of a lot more vulnerable rather than more secure, since patching is actually prohibited, as is switching off services, because the system was approved with those services running. Argh. As for energy, banks, transportation etc... there's just nothing in place; everything that is done happens on a voluntary basis, or because private companies fear for their existence if a security hazard strikes them. I recently stumbled across a bug in a piece of software used in, uhm, certain (rather rare) parts of civil infrastructure, and it was rather trivial to find and rather cheap to exploit. I'm 120% sure this thing would not exist if the vendor had been legally obliged to get it checked before deployment. Frankly speaking, I was personally pissed off by that.

Cheers, j.

_______________________________________________
Dailydave mailing list
Dailydave () lists immunitysec com
https://lists.immunitysec.com/mailman/listinfo/dailydave
