Re: [SystemSafety] Static Analysis

From: Michael Jackson < >
Date: Mon, 03 Mar 2014 13:34:06 +0000


Yes. I suppose that the throwaway missions you mention are non-critical, in the sense that human life is not endangered and the cost in money and resources is affordable in the context of the whole program of multiple missions.

Flight testing a substantially new aircraft design must involve a risk to human life (the test pilot's). I suppose everything possible is done to minimise this risk, but it is real, isn't it? The first test flight is the earliest instance of 'putting the system into operation', with the maximum circumspection and the strongest ingredient of further testing.

At 10:57 03/03/2014, Matthew Squair wrote:
>I'm thinking that the empirical approach is a necessary part of any endeavor with a modicum of innovation in it. Sometimes you really do need to suck it and see, to paraphrase Clarke's second law.
>As an example, that's what precursor missions are for: try the technology with a cheap 'throwaway' mission. Sojourner for Discovery, Pioneer for Voyager. And when you don't, as in the case of Hubble, the resultant cost and schedule overruns speak volumes.
>Though I'd be reluctant to apply that routinely to the travelling public en masse of course, to pick up on Peter's point.
>Matthew Squair
>MIEAust, CPEng
>Mob: +61 488770655
>Web: <>
>On 3 Mar 2014, at 8:51 pm, Michael Jackson wrote:
>>I think Patrick Graydon's point is that in any system involving the physical world (including human behaviour) there are inescapable concerns that lie beyond the reach of mathematical and logical reasoning and demand tests and experiments for their investigation. For these concerns testing can show the presence of error but not its absence: infinite testing is not an option. Accepting this point we must at some stage decide that no more testing is practicable, and that the system is now to be put into operation.
>>It is uncomfortable to characterise this decision as 'try-it-and-see' but it is correct in principle. Of course, for a critical system we are obliged to analyse the design and implementation very thoroughly, and to test very long and very hard. Then the system is put into operation with a circumspect realisation that there may be, and indeed probably are, some residual safety risks that have been detected neither by analysis nor by testing. 'Putting the system into operation' therefore becomes itself a careful and gradual process embodying a strong ingredient of further testing.
>>The phrase 'try-it-and-see' sounds like a sneer; but perhaps it is a valuable reminder that mathematical certainty of safety is simply not achievable.
>>-- Michael
>>At 07:46 03/03/2014, you wrote:
>>>On 3 Mar 2014, at 08:02, Patrick Graydon wrote:
>>> > Hmm. While my (possibly ill-informed) opinion is that the non-safety world over-uses a try-it-and-see approach, I wonder if we can categorically say that try-it-and-see is /never/ appropriate in safety.
>>>Most obviously, you are constrained by the regulatory environment. If it is for rail in Germany, then the kit must be approved for use by the regulator. It is replacing some kit or other, usually, so it must be demonstrated and documented to be at least as safe as that which it is replacing. It's the law. You don't get to "try it and see".
>>>Similarly, development according to IEC 61508 and "derivatives" (which often aren't really) requires that you demonstrate that the requirements of the standard have been met. In some jurisdictions (not all European countries, but some), you can be criminally liable if your kit breaks and you hurt someone, and you didn't develop according to IEC 61508 provisions. Indeed, there is a European Directive from 2008 about products which might cause harm. It requires that a risk assessment be performed to determine whether the risk is acceptable or unacceptable. The directive issues from 2008, but it usually takes a year or two for it to make it into national laws (Germany was 2011). There, you don't get to 'try it and see' either.
>>>Now, exactly how far people conform to all this is, as usual, a matter for social negotiation. But if you want to 'try it and see' for safety-critical kit of almost any description, then that had better be tinkering inside an already-acceptable risk situation or you risk prosecution if something goes wrong, modulo the enforcement situation. In Britain, you also have ALARP to worry about.
>>>Broadly speaking, Les's observation that no, you can't do that with safety-critical kit is thus ensconced in European practice and law. How far that situation actually governs what people do is another matter. Like the treaty (then law) which says you can only run an annual budget deficit of 3%, broken within three years by France, then Germany...
>>>Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited

The System Safety Mailing List
systemsafety_at_xxxxxx Received on Mon Mar 03 2014 - 14:34:11 CET
