Re: [SystemSafety] Another unbelievable failure (file system overflow)

From: Matthew Squair < >
Date: Fri, 5 Jun 2015 15:09:51 +1000


Les,

The problem is the stopping rule, and the very different approaches that the "Model WHS Act 2011" and AS/IEC 61508 take to that question. The discussion below reflects the direct, non-transferable responsibilities of the 'Designer' under the Act.

I think we'd agree that there's an infinite number of 'things we could do' in terms of assuring safety; the real issue is how we decide that we've done enough.

In the 61508 world view you establish the risk, derive from that a safety requirement in the form of a SIL, and then perform the activities required by that SIL. That is deemed to be enough. Note that we're using the 61508 standard in a risk acceptance framework, and implicit in this is that we stop when we achieve an acceptable risk.
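To make the contrast concrete, here's a minimal sketch of that stopping rule in Python. The risk thresholds and per-SIL activity packages are hypothetical placeholders of my own, not values taken from the standard:

    # A minimal sketch of the 61508-style stopping rule: assess the risk,
    # derive a SIL, perform that SIL's activities, then stop. Thresholds
    # and activity packages are hypothetical, not from the standard.

    ACTIVITIES = {
        1: ["code review"],
        2: ["code review", "unit test coverage"],
        3: ["code review", "unit test coverage", "static analysis"],
        4: ["code review", "unit test coverage", "static analysis",
            "formal methods"],
    }

    def derive_sil(probability_of_harm):
        """Map an assessed risk to a SIL (hypothetical thresholds)."""
        if probability_of_harm > 1e-3:
            return 4
        if probability_of_harm > 1e-4:
            return 3
        if probability_of_harm > 1e-5:
            return 2
        return 1

    def assure(probability_of_harm):
        # The stopping rule: perform the derived SIL's activities, then
        # stop. Residual risk is deemed acceptable once they're complete.
        return ACTIVITIES[derive_sil(probability_of_harm)]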

The WHS Act legislates what's called the So Far As Is Reasonably Practicable (SFAIRP) principle for deciding when to stop, and no, this is not the same as the ALARP principle of the HSE. You start by characterising the hazard and identifying possible controls. You then proceed through each control in turn and apply it as long as it's reasonable and practical to do so (there's guidance on both terms in the codes of practice). The rule for not implementing a specific control is Lord Asquith's time-honoured one of 'gross disproportion', i.e. a cost-benefit analysis. All this is laid out in the law, regulations and associated codes of practice.

The key point is that if you reject one control you still consider the others until you've worked your way through the set. The Act also requires that you sequence this in the classic hierarchy-of-control fashion. Thus the Act requires you to consider each control in turn, and the stopping rule is having no more controls that are reasonable and practical to implement, NOT a risk level.
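By way of contrast with the earlier sketch, here's a minimal sketch of the SFAIRP stopping rule. The hierarchy labels, control data and the disproportion factor are all hypothetical; the Act and codes of practice prescribe the process, not these numbers:

    # A sketch of the SFAIRP stopping rule under the Model WHS Act.
    # Control data and the disproportion factor are hypothetical.

    HIERARCHY = ["eliminate", "substitute", "isolate",
                 "engineering", "administrative", "ppe"]

    def sfairp(candidate_controls):
        """Consider every candidate control in hierarchy-of-control order.

        A control is rejected only where its cost is grossly
        disproportionate to its safety benefit (Lord Asquith's test).
        We stop when the candidate set is exhausted, NOT when some
        risk target is reached.
        """
        GROSS = 10.0  # hypothetical factor; the law fixes no number
        adopted = []
        ordered = sorted(candidate_controls,
                         key=lambda c: HIERARCHY.index(c["kind"]))
        for control in ordered:
            if control["cost"] <= GROSS * control["benefit"]:
                adopted.append(control)  # reasonably practicable: apply
            # else: rejected, but the remaining candidates are still
            # considered in turn
        return adopted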

You can see that if we consider a hazard that results from a software design fault and, having done our risk assessment, we settle on doing, say, SIL 2 on the basis of risk, then there are clearly practical things we could still do in the form of SIL 3 and SIL 4 activities, else they wouldn't be in the standard. So under the Act, how do we argue that these are unreasonable? Purely on the basis of risk? Nope, that won't work, because the stopping rule is one of gross disproportion. So how do we show that the next SIL up would be grossly disproportionate in terms of cost?
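To see what that argument would have to establish, here's a hypothetical worked example; every figure below is invented purely for illustration:

    # A hypothetical gross disproportion test for stepping up from
    # SIL 2 to SIL 3. All figures are invented for illustration only.

    sil2_cost = 400_000    # assumed cost of the SIL 2 activity package
    sil3_cost = 600_000    # assumed cost of the SIL 3 activity package
    incremental_cost = sil3_cost - sil2_cost          # 200,000

    harm_cost = 5_000_000        # assumed cost of the accident averted
    extra_risk_reduction = 0.1   # assumed further reduction, SIL 2 -> 3
    incremental_benefit = harm_cost * extra_risk_reduction  # 500,000

    GROSS = 10.0  # hypothetical disproportion factor; the law fixes none
    # Gross disproportion needs the cost to grossly exceed the benefit.
    # Here the incremental cost is *less* than the benefit, so the claim
    # that SIL 3 is not reasonably practicable fails on these numbers.
    print(incremental_cost > GROSS * incremental_benefit)  # False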

If we treat each SIL's set of activities as a package, it is, I think, a reasonable supposition that the standards organisation wouldn't have defined it unless they thought it was practically 'doable'. While that doing might be expensive, there's no evidence to indicate that the cost differs grossly across SILs, noting that your risk return per level is defined by the standard.

If you ignore the above questions and fall back to "well, that's obviously the intent of the standards working group", you're resting your position on an appeal-to-authority style of argument, and that's tenuous at best.

If we tease apart each SIL level and look at the set of activities in each, then the situation gets worse. Take for example formal methods, which are drawn in at the higher SIL levels (from memory). Now there's a body of professional opinion that these techniques are at worst cost neutral and at best can save you time and money; that's one of the planks the 'correctness by construction' manifesto rests upon, after all. So how do you justify NOT carrying out an activity that could save time and money AND reduce safety risk? If you don't like formal methods, then take another SIL-associated activity that can be argued to be cost/time positive; again, it doesn't have to be the unanimous opinion of the industry, just a significant body of opinion.

Then, just for the moment, think not about writing up a plan for the project manager in which you articulate how you'll apply 61508 to your project based on the functional hazard analysis you've conducted, but rather put yourself in the witness box, trying to defend your use of the standard in the wake of an accident that has caused serious injury, against a prosecuting counsel who has the benefit of hindsight and experts who'll be happy to articulate what you could and should have done. This may sound dramatic, but I use it to illustrate that the (English) law, with its adversarial approach, has an entirely different focus than standards such as 61508 or risk management standards such as ISO 31000. It should also give you an idea of the degree of probative value you should be looking for from evidence to support your position.

So, in Australia at least, the law has mandated a quite specific methodology, including the requisite stopping rule, that you must follow in order to claim that you were duly diligent and therefore not negligent. Applying a risk acceptance process and thinking that it complies with the law is unfortunately not defensible, and means that 61508 cannot be used in the way it was intended (within a risk acceptance framework). And of course one cannot set aside the law.

As a BTW, the risk acceptance and SFAIRP disconnect also affects such varied risk-based things as state government siting codes of practice for major facilities and dams, fire safety standards, high voltage safety standards and so on. Similarly there is an impact upon legislated regulators and their risk management strategies, such as ALARP.

As I've said elsewhere I don't think that any of the above was the intent of those who framed the legislation, but there it is.

On Fri, Jun 5, 2015 at 9:42 AM, Les Chambers <les_at_xxxxxx

> Matthew
>
> Would you care to elaborate? Sounds like an interesting moral dilemma.
>
> Les
>
>
>
> *From:* Matthew Squair [mailto:mattsquair_at_xxxxxx
> *Sent:* Thursday, June 4, 2015 2:07 PM
> *To:* Les Chambers
> *Cc:* Steve Tockey; martyn_at_xxxxxx
> systemsafety_at_xxxxxx
> *Subject:* Re: [SystemSafety] Another unbelievable failure (file system
> overflow)
>
>
>
> It's a pity that you can't use 61508 in Australia and still comply with
> the WHS act then. :)
>
>
>
> On Wed, Jun 3, 2015 at 11:10 PM, Les Chambers <les_at_xxxxxx >
> Martyn
>
> In my experience the presence of IEC 61508 has had a positive effect when
> attached to a contract as a compliance constraint. It forces organisations
> to clean up their act. I've seen the same thing happen with EN 50128 in
> rail projects.
>
> I think we get too tied up in the details sometimes and forget about the
> overall positive impact of these standards.
>
> I do believe that the pricing of IEC 61508 is an immoral act of greed and
> a clear violation of clause 3 of a well known standard, common to many
> faiths in many civilisations over millennia.
>
> Refer: http://en.wikipedia.org/wiki/Ten_Commandments
>
> "Thou shalt not make unto thee any graven image."
>
> Just as this standard will never stop greed, or murder for that matter,
> the existence of a functional safety standard will not make any system
> totally safe. It all lies with the people working within the FS framework.
> How committed are they? It's exactly the same as being committed to a
> faith. Faith fills a need in most of us. We like to believe (without proof)
> that we are part of a master plan for which we do not make the rules. Some
> of us like to reinforce it by attending a church/mosque/synagogue once a
> week and reflecting on it for an hour or two. In the Middle East I worked
> with people who reflected five times a day. Many Westerners would view this
> as an unproductive waste of time but I remember thinking at the time that
> it wouldn't hurt us all to reflect with that kind of frequency, on
> something positive. The more reflection, the stronger the faith and the
> higher the probability of righteous action when our faith is tested. This
> is why I keep pushing this barrow of constant reflection on the safety
> discipline for those whose actions could cause harm to others.
>
>
>
> We should all cheer up. "The faith" had a good day today. Sepp Blatter
> resigned and the US Congress wound back the Patriot Act. Things are looking
> up for global moral standards.
>
>
>
> Cheers
>
> Les
>
>
>
> *From:* systemsafety-bounces_at_xxxxxx [mailto:
> systemsafety-bounces_at_xxxxxx] *On Behalf Of *Steve Tockey
> *Sent:* Wednesday, June 3, 2015 3:36 AM
> *To:* martyn_at_xxxxxx
> systemsafety_at_xxxxxx
>
> *Subject:* Re: [SystemSafety] Another unbelievable failure (file system
> overflow)
>
>
>
>
>
> Martyn,
>
> I can't speak for IEC 61508, but I do agree that in general the weaknesses
> you point out are at least borderline ethical issues.
>
>
>
>
>
> -- steve
>
>
>
>
>
>
>
>
>
> *From: *Martyn Thomas <martyn_at_xxxxxx
> *Reply-To: *"martyn_at_xxxxxx
> *Date: *Monday, June 1, 2015 1:34 AM
> *To: *"systemsafety_at_xxxxxx > systemsafety_at_xxxxxx > *Subject: *Re: [SystemSafety] Another unbelievable failure (file system
> overflow)
>
>
>
> Les/Steve
>
> Thanks for this. There's little discussion of professional ethics in any
> forum that I read.
>
> Do you think there's any hope that we might be able to make a small
> advance in a focused area, such as IEC 61508? The standard isn't fit for
> purpose, in that it largely ignores cybersecurity issues and does not
> provide a sound basis for assessing whether safety-critical systems are
> safe enough for their proposed application. It's also too long,
> inconsistent, too expensive, and can't be copied/republished for use in
> teaching, research or professional debate. I see these weaknesses, in the
> central international standard for the safety of computer-based systems, as
> an ethical issue. Do you agree?
>
> Regards
>
> Martyn
>
> On 31/05/2015 05:14, Les Chambers wrote:
>
> Steve
>
> Thanks for referencing the code of ethics. It should be brought up more
> often. Unfortunately, for me, it makes depressing reading. Especially when
> you come upon paragraphs such as:
>
>
>
> 3.12. Work to develop software and related documents that respect the
> privacy of those who will be affected by that software.
>
>
>
> Although he has probably never read it, there is a man who will probably
> never see his homeland again because he took these sentiments to heart and
> attempted his own corrective action. And what of the thousands of
> scientists, engineers and technologists who contributed to the construction
> of the software whose existence he exposed to the world?
>
>
>
> My point is that non-compliance with this code of ethics is massive and
> almost universal. In fact, any engineer maintaining strict compliance with
> every paragraph of this code would be unemployable in our modern world.
>
>
>
> Reading these paragraphs through the lens of experience, I am blown away
> by their flippancy. From personal experience I can tell you that screwing
> up the courage to implement even one of these items can be a massive,
> life-changing event. That point would be lost on a graduate. They're all
> perfectly reasonable statements of how one should behave, much like, "Thou
> shalt not kill, thou shalt not commit adultery ...". The issue lies in the
> moral courage to implement them.
>
>
>
> There is no quick fix to this problem, as we are a decentralised,
> unorganised and generally fragmented lot. We don't have the luxury of the
> medical profession, which deals with a single organism. We can't simply
> state and righteously comply with the notion of "Do no harm." In fact, for
> us, the opposite is true: many of us work in industries where the primary
> purpose is to kill other human beings, and with high efficiency (fewer
> soldiers kill more of the enemy).
>
>
>
> One thing we can do is deal with the problem at its root:
>
>
>
> We are graduating incomplete human beings from science and engineering
> courses. There is insufficient focus on the moral issues surrounding the
> impact of our machines on humanity. For example, a study of applied
> philosophy, including ethics, should be a non-negotiable component of all
> engineering courses. Not just a final year subject, but a subject for every
> year with a weekly reflection on the content. Much like the weekly safety
> meetings I was forced to attend in the chemical processing industry.
>
>
>
> I'm sure there will be howls of laughter at this, but let me tell you
> it's the only thing that caused me to back a senior manager about five
> levels above my pay grade into a corner - he could physically not escape me
> short of punching me out and stepping over my body - and berate him until
> he promised to properly train his operators in the emergency procedures for
> a safety critical system.
>
>
>
> Popping a few paragraphs up on the web would never have done the trick.
>
>
>
> That experience was trivial compared to where we are headed. The massive
> computing power now available means that our software is beginning to take
> higher-level decisions away from human beings. Some of these decisions are
> moral ones (refer to my previous post on lethal autonomous weapons
> systems). "Shall I kill all humans associated with this structure, or
> not?"
>
>
>
> At a recent engineering alumni meeting I asked the head of my old
> engineering Department how much philosophy is taught to undergraduate
> engineers. He chuckled. "It is available as an elective but less than one
> percent participate," he said.
>
>
>
> I plan to speak to him again soon.
>
>
>
> Cheers
>
> Les
>
>
>
>
>
>
> _______________________________________________
> The System Safety Mailing List
> systemsafety_at_xxxxxx >
>
>
>
>
> --
>
> *Matthew Squair*
>
> MIEAust CPEng
>
>
>
> Mob: +61 488770655
>
> Email: MattSquair_at_xxxxxx >
> Website: www.criticaluncertainties.com <http://criticaluncertainties.com/>
>
>
>

-- 
*Matthew Squair*
MIEAust CPEng

Mob: +61 488770655
Email: MattSquair_at_xxxxxx
Website: www.criticaluncertainties.com <http://criticaluncertainties.com/>



_______________________________________________
The System Safety Mailing List
systemsafety_at_xxxxxx
Received on Fri Jun 05 2015 - 07:10:05 CEST
