Re: [SystemSafety] OpenSSL Bug

From: Peter Bernard Ladkin < >
Date: Thu, 17 Apr 2014 07:59:18 +0200

> On 17 Apr 2014, at 00:52, Todd Carpenter <todd.carpenter_at_xxxxxx > wrote:
> If we fix the person, then wouldn't part of the problem stop reproducing itself? :)
>

>> On 4/16/2014 5:32 PM, Steve Tockey wrote:
>> Instead of blaming the person, how about we blame the process? And then take active steps to fix
>> the process?

Todd, Steve and Bertrand have elegantly recapitulated the various organisational reactions to a serious accident in one sentence each.

The interchange highlights the trope that, in complex systems, cause and mitigation aren't necessarily well correlated. In a complex critical system with many distributed, as well as variously replicated, control functions, the computer where the fix is installed is not necessarily the computer in which the functional problem was exhibited. To internetworking specialists this is easy to explain - the spam filter sits on your email client, but the cause of the problem lies with and behind spam-propagation machines. But it seems to be harder to explain to people involved in control-system forensics.

The RISKS Forum Digest edition of 16 April 2014 has of course some comments from long-time contributors.

Indeed, Martyn's contribution illustrates the above trope - a SW fix for spider infiltration: http://catless.ncl.ac.uk/Risks/27.84.html#subj1

Henry Baker is just as disgusted as I am about memory-insecure programming practice. He also cites someone who found another vulnerability: http://catless.ncl.ac.uk/Risks/27.84.html#subj3

Jonathan Shapiro discusses memory "safety" in some depth (I wish people wouldn't call it "safety" - either "reliability" or "security" is the appropriate word, but getting people to use technical terms precisely in informatics appears to be just as easy as getting them to use memory-secure technology). He says that the "many eyes" theory of inspection concerning open-source SW has been profoundly discredited, but one might wish for citations to the literature. He also mentions an unreferenced Columbia PhD thesis which showed that independent programming teams working from the same specification made correlated errors - which exhibits yet another kind of memory fragmentation, namely that apparently one can nowadays obtain a PhD by recapitulating well-known fundamental work: http://catless.ncl.ac.uk/Risks/27.84.html#subj5
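For those who haven't looked at the bug itself: the memory-insecure practice at issue amounted to trusting a peer-supplied length field when echoing a payload back. A minimal sketch of the safe pattern (illustrative names only, not OpenSSL's actual code - the real heartbeat handler is more involved):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical handler sketch. The Heartbleed-class error was to
 * memcpy `claimed_len` bytes out of `payload` without checking the
 * claim against the number of bytes actually received - so the copy
 * ran past the record into adjacent process memory. The fix is the
 * one bounds check below. */
static int echo_payload(const unsigned char *payload, size_t actual_len,
                        size_t claimed_len,
                        unsigned char *out, size_t out_cap)
{
    /* Reject a request whose claimed length exceeds the bytes we
     * really have (or our output buffer) - the check the bug omitted. */
    if (claimed_len > actual_len || claimed_len > out_cap)
        return -1;
    memcpy(out, payload, claimed_len);
    return 0;
}
```

A memory-secure language would have turned the missing check into a runtime bounds error rather than a silent disclosure of adjacent memory, which is the point Henry Baker and Shapiro are circling.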

Finally, the Cloudflare Challenge is worth knowing about, especially if you are one of those security "specialists" saying "we don't know that it's been exploited yet": http://catless.ncl.ac.uk/Risks/27.84.html#subj4

PBL
Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited



The System Safety Mailing List
systemsafety_at_xxxxxx
Received on Thu Apr 17 2014 - 07:59:28 CEST

This archive was generated by hypermail 2.3.0 : Sat Feb 23 2019 - 09:17:07 CET