One needs to be careful in interpreting the evidence from the C-130J static
code analysis. I seem to remember there were about 7 categories of defect,
ranging from functional defects that affected safety of flight at one
extreme to spelling mistakes in supporting documentation at the other
extreme. Also, in many cases, the static code analysis was carried out
before the software testing had been completed. The software that Andy
German reported to have a defect rate of 4 anomalies per thousand lines of
code was the mission computer software, which was specified using Parnas
Tables and implemented in SPARK by Lockheed with help from Praxis. I believe
they did program proof of the SPARK code against the Parnas Tables. I
therefore believe that the 4 anomalies per KLOC reported by Andy German are
consistent with the < 1 post-delivery defect per KLOC reported by Praxis on
other SPARK projects, given that the anomalies reported on the C-130J would
have included spelling mistakes and other non-functional defects, and that
the static code analysis was done in parallel with the software development.
Incidentally, Lockheed's positive experience of developing the mission
computer software in SPARK was documented in Jim Sutton's book, Lean
Software Strategies
(http://www.amazon.co.uk/Lean-Software-Strategies-Techniques-Developers/dp/1563273055).
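To give a flavour of what "program proof of the SPARK code" means in practice, here is a minimal sketch in modern SPARK 2014 notation. It is purely illustrative: the C-130J mission computer used the earlier SPARK annotation language and Parnas-table specifications, and the package and subprogram names below (Limits, Clamp) are invented for the example. The point is simply that the subprogram carries a contract, and the proof tools discharge verification conditions showing that the body satisfies that contract for all inputs, not just those exercised by testing.

   package Limits with SPARK_Mode is
      --  The postcondition plays the role of the specification: the
      --  result always lies in [Lo, Hi], and is X itself whenever X
      --  is already within that range.
      function Clamp (X, Lo, Hi : Integer) return Integer
        with Pre  => Lo <= Hi,
             Post => Clamp'Result in Lo .. Hi
                     and then (if X in Lo .. Hi then Clamp'Result = X);
   end Limits;

   package body Limits with SPARK_Mode is
      --  The proof tools show that this body meets the contract above.
      function Clamp (X, Lo, Hi : Integer) return Integer is
        (if X < Lo then Lo elsif X > Hi then Hi else X);
   end Limits;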
When comparing defect rates, we need to be careful to define whether we mean only those functional defects that affect safety, all functional defects, or all defects including non-functional ones, and whether we count defects found before entry into service or only those found afterwards.
I think what was clear from the C-130J experience was that the (only) SPARK program exhibited a much lower defect rate than the average Ada program, which exhibited a much lower defect rate than the average C program. Having said that, there was considerable variability between suppliers (the best C program had a lower defect rate than the average Ada program). As it happens, I helped conduct the static code analysis on the worst program (500 anomalies per thousand lines of code according to Andy German's article). The code didn't match the design, and the design didn't match the requirements! I understand that Lockheed cancelled the contract with that particular supplier and sourced an alternative part from another supplier.
Yours,
Dewi Daniels | Managing Director | Verocel Limited
Direct Dial +44 1225 718912 | Mobile +44 7968 837742 |
Email ddaniels_at_xxxxxx
Verocel Limited is a company registered in England and Wales. Company
number: 7407595. Registered office: Grangeside Business Support Centre, 129
Devizes Road, Hilperton, Trowbridge, United Kingdom BA14 7SZ
-----Original Message-----
From: systemsafety-bounces_at_xxxxxx
[mailto:systemsafety-bounces_at_xxxxxx] On Behalf Of Peter Bernard Ladkin
Sent: 23 November 2014 05:19
To: systemsafety_at_xxxxxx
Subject: Re: [SystemSafety] Autonomous Vehicles and "Hacking" Threats
Dewi Daniels, who did much of the code inspection on this project I understand, is on this list.
For an unquantified list of what was discovered see slides 16-20 of http://www.rvs.uni-bielefeld.de/publications/Talks/LadkinIETSysSafe2012.pdf
PBL

On 2014-11-22 23:30, Martyn Thomas wrote:
> I think that his numbers are just the discovered anomalies.
>
> On 22 Nov 2014, at 22:03, Brent Kimberley <brent_kimberley_at_xxxxxx
> <mailto:brent_kimberley_at_xxxxxx>>
Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld,
33594 Bielefeld, Germany
Tel+msg +49 (0)521 880 7319 www.rvs.uni-bielefeld.de