Re: [SystemSafety] systemsafety Digest, Vol 21, Issue 13

From: Rod Chapman < >
Date: Tue, 15 Apr 2014 15:14:20 +0100


 >Chris Hills wrote:
 >the Open Source members of the group refused to use static analysis and
 >MISRA-C because the static analysis tools were not FOSS or free* and
 >MISRA-C was not Free.

FWIW, the SPARK language and toolset have been "free" since mid-2009: the sources are FOSS and permissively licensed, and a zero-cost (but unsupported) toolset is available.

This hasn't led to an explosion of its use in the FOSS community, but has supported a number of interesting research projects, such as John Knight's work, the Muen microkernel, the Tokeneer system release, and the SPARKSkein reference implementation.

Secondly, I often hear (this week in particular, owing to the Heartbleed bug) the assertion that "this code is low-level and has to be fast, so we have to write it in C..." I would cite SPARKSkein as a counter-example: in some cases it outperforms the C reference implementation, yet it remains readable, portable, and subject to a complete proof of type safety.

On 15 April 2014 11:00, <systemsafety-request_at_xxxxxx > wrote:

> Send systemsafety mailing list submissions to
> systemsafety_at_xxxxxx >
> To subscribe or unsubscribe via the World Wide Web, visit
>
> https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety
> or, via email, send a message with subject or body 'help' to
> systemsafety-request_at_xxxxxx >
> You can reach the person managing the list at
> systemsafety-owner_at_xxxxxx >
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of systemsafety digest..."
>
>
> Today's Topics:
>
> 1. Re: OpenSSL Bug (Derek M Jones)
> 2. Re: OpenSSL Bug (Chris Hills)
> 3. Re: OpenSSL Bug (Patrick Graydon)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 14 Apr 2014 22:59:25 +0100
> From: Derek M Jones <derek_at_xxxxxx >
> To: systemsafety_at_xxxxxx
> Subject: Re: [SystemSafety] OpenSSL Bug
> Message-ID: <534C5A3D.70908_at_xxxxxx >
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Peter,
>
> > I find a discussion about "empirical evidence" beside the point.
> >
> > Suppose it is known that people make lots of mistakes of type X.
>
> That is empirical evidence,
>
> > Suppose technical methods T are known to avoid, definitively, mistakes
> of type X,
>
> more empirical evidence,
>
> > and T are practical.
>
> and yet more empirical evidence,
>
> > Suppose there is an area of engineering, SCS, in which mistakes of type
> X have potentially very serious consequences
>
> I would tot up the various costs and benefits, using the empirical
> evidence and make a decision based on the weight of evidence.
>
> Others might prefer to ignore the evidence based approach
> and read tea leaves, use astrology or some other method much
> loved by superstitious folk.
>
> >
> > Then we say: in SCS, using T is essential/best practice/the way to avoid
> lawsuits/whatever.
> >
> > What would be the relevance of any "empirical evidence" that some subset
> of T is "effective" in avoiding E?
> >
> > Since programming is a human endeavor, any "empirical evidence" that
> some set A of programmers in some artificial environment E using some
> subset of T produced programs P with instances of error E less than, or
> marginally less than, some other subset B of programmers in E who didn't
> use T is subject to question on a number of fronts. What training/culture
> did A and B have in common? How does one determine that all relevant
> characteristics of E were taken into account? If it is possible to avoid E
> without using T, how do we know that most people in A weren't all cognisant
> of how to avoid E and few people in B were so cognisant, quite independent
> of using T? Did people in A+B know they were being assessed in avoiding E?
> If not explicitly, were they able to infer it covertly? And the people in A
> more capable of so inferring than those in B? And how do we determine that
> people in A and B didn't covertly find out what the point of the test was
> and determine to justify it by, respectively, paying more attention and
> paying less attention to what they were doing? You can go on for ever.
> >
> > It is much easier with statistical methods on human populations to show
> that something you presumed didn't or shouldn't matter actually does
> matter. As with much experimentation, discovering a negative is
> straightforward and proving a positive almost impossible.
> >
> > PBL
> >
> > Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited
> > _______________________________________________
> > The System Safety Mailing List
> > systemsafety_at_xxxxxx > >
>
> --
> Derek M. Jones tel: +44 (0) 1252 520 667
> Knowledge Software Ltd blog:shape-of-code.coding-guidelines.com
> Software analysis http://www.knosof.co.uk
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 15 Apr 2014 09:09:31 +0100
> From: "Chris Hills" <safetyyork_at_xxxxxx >
> To: <systemsafety_at_xxxxxx >
> Subject: Re: [SystemSafety] OpenSSL Bug
> Message-ID: <00ea01cf5882$08259ff0$1870dfd0$_at_xxxxxx >
> Content-Type: text/plain; charset="utf-8"
>
>
>
>
>
> On Behalf Of Peter Bernard Ladkin
>
> On 14 Apr 2014, at 22:43, "Martin Pugh" <martin.pugh_at_xxxxxx > wrote:
>
> I take my hat off to the open source community for their efforts.
>
>
>
> Me too in general. But it's a problem that we can't seem to persuade them
> to use established high-reliability programming methods for code for which
> high reliability is essential.
>
> PBL
>
>
>
> [CAH] I have to agree with PBL. In another place discussing MISRA-C and
> static analysis the Open Source members of the group refused to use static
> analysis and MISRA-C because the static analysis tools were not FOSS or
> free* and MISRA-C was not Free.
>
>
>
> *I have since discovered there are several FOSS static analysis tools.
>
>
>
> I repeatedly get told by FOSS people that if MISRA-C was a serious tool
> "we" (?) would give it to "them" (?) for free. Also, they can't do a FOSS
> MISRA-C checker because they want to quote all the MISRA-C rules in their
> checker without paying for a license to do so. I was discussing this
> earlier this week at a conference, and the main thrust of the discussion was
> how to avoid the 15 GBP cost of a copy of MISRA-C and how to use all the
> rules for free.
>
>
>
> When I pointed out that all they had to do was list the rule numbers, and
> then users could refer to their own copy of MISRA-C, this was seen as
> unacceptable as the users should not have to spend 15 GBP on the MISRA-C
> standard...
>
> Apparently using anything you pay for is not permitted for FOSS on
> religious grounds.
>
>
>
> NOTE: To make any sense of MISRA-C you really need the whole document not
> just the headline rules.
>
>
>
> I have found this sort of thinking re Open Source far more prevalent than
> any form of Good Practice. Let alone Best Practice.
>
>
>
> OTOH I understand the use of static analysis in commercial software may have
> as much as 25% penetration! So the commercial world is not much better, and
> they don't have the excuse of their religion. :-)
>
>
>
> Regards
>
> Chris Hills
>
>
>
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <
> https://lists.techfak.uni-bielefeld.de/mailman/private/systemsafety/attachments/20140415/6737d5a5/attachment-0001.html
> >
>
> ------------------------------
>
> Message: 3
> Date: Tue, 15 Apr 2014 11:35:22 +0200
> From: Patrick Graydon <patrick.graydon_at_xxxxxx >
> To: systemsafety_at_xxxxxx
> Subject: Re: [SystemSafety] OpenSSL Bug
> Message-ID: <C09B8139-983E-4142-9815-64C8110E7792_at_xxxxxx >
> Content-Type: text/plain; charset=windows-1252
>
> Leaving the religions of libre and gratis aside, does anyone know of any
> evidence showing that adhering to MISRA-C specifically would improve the
> quality of FOSS*? Les Hatton's work has been critical of many of the rules
> in the standard [hatton2004saferlanguagesubsets, hatton2007language]. But
> the most direct work I know of on the value of MISRA-C in
> non-safety-critical software is a study that attempted to correlate the
> locations of defects in video playback software with MISRA-C rule
> violations, and found an overall *slightly negative* correlation (i.e. the
> rules were worse than useless) [boogerd2008assessing]. Is there any
> specific evidence that would outweigh this**?
>
> -- Patrick
>
> * There are good reasons to adhere to a coding standard that have
> nothing to do with code quality. For example, developers using a tool that
> is incompatible with a language construct must strictly avoid use of that
> construct.
>
> ** Precluding certain coding constructs because someone finds them
> suspect and no one has shown them beneficial might actually be harmful.
> For example, developers changing code to fix a rule violation might
> actually introduce a critical defect. Before we tell developers to *never*
> tolerate the use of a given construct (as opposed to avoiding its use in new
> code where practicable), we should have evidence that the construct's use
> brings dangers that are worse than the probable consequences of modifying
> code to eliminate it.
>
>
> @article{hatton2007language,
> Author = {Hatton, Les},
> Journal = {Information and Software Technology},
> Pages = {475--482},
> Title = {Language subsetting in an industrial context: {A}
> comparison of {MISRA C 1998} and {MISRA C 2004}},
> Volume = {49},
> Year = {2007}}
>
> @article{hatton2004saferlanguagesubsets,
> Author = {Hatton, Les},
> Journal = {Information and Software Technology},
> Number = {7},
> Pages = {465--472},
> Title = {Safer language subsets: an overview and a case history,
> {MISRA C}},
> Volume = {46},
> Year = {2004}}
>
> @inproceedings{boogerd2008assessing,
> Author = {Boogerd, Cathal and Moonen, Leon},
> Booktitle = {Proceedings of the IEEE International Conference on
> Software Maintenance (ICSM)},
> Month = {October},
> Pages = {277--286},
> Title = {Assessing the value of coding standards: An empirical
> study},
> Year = {2008}}
>
>
>
> ------------------------------
>
> _______________________________________________
> systemsafety mailing list
> systemsafety_at_xxxxxx
> https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety
>
>
> End of systemsafety Digest, Vol 21, Issue 13
> ********************************************
>



The System Safety Mailing List
systemsafety_at_xxxxxx
Received on Tue Apr 15 2014 - 16:14:35 CEST

This archive was generated by hypermail 2.3.0 : Thu Apr 25 2019 - 23:17:07 CEST