Re: [SystemSafety] Autonomous Vehicles and "Hacking" Threats

From: Dewi Daniels < >
Date: Tue, 25 Nov 2014 11:41:21 -0000


I helped carry out static code analysis of the airborne software for the Lockheed C-130J in 1995 while I was working at Lloyd's Register. We used MALPAS to verify software written in Ada, C and assembler. This is the project described in Andy German's CrossTalk article (http://www.crosstalkonline.org/storage/issue-archives/2003/200311/200311-0-Issue.pdf). I later spent 6 months at the Lockheed factory in Marietta, GA, helping to write the software safety case for the C-130J. Before working for Lloyd's Register, I'd worked for Program Validation Limited, where I helped to develop the SPARK Examiner. After Lloyd's Register, I went to work for Praxis, which had purchased Program Validation Limited and the rights to the SPARK toolset.

One needs to be careful in interpreting the evidence from the C-130J static code analysis. I seem to remember there were about 7 categories of defect, ranging from functional defects that affected safety of flight at one extreme to spelling mistakes in supporting documentation at the other. Also, in many cases, the static code analysis was carried out before the software testing had been completed. The software that Andy German reported to have a defect rate of 4 anomalies per kLOC was the mission computer software, which was specified using Parnas Tables and implemented in SPARK by Lockheed with help from Praxis. I believe they did program proof of the SPARK code against the Parnas Tables. I therefore believe the 4 anomalies per kLOC reported by Andy German is consistent with the < 1 post-delivery defect per kLOC reported by Praxis on other SPARK projects, given that the anomalies reported on the C-130J would have included spelling mistakes and other non-functional defects, and that the static code analysis was done in parallel with the software development. Incidentally, Lockheed's positive experience of developing the mission computer software in SPARK was documented in Jim Sutton's book, Lean Software Strategies (http://www.amazon.co.uk/Lean-Software-Strategies-Techniques-Developers/dp/1563273055).
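
For those who haven't seen it, proving SPARK code against a Parnas Table is quite concrete in practice. Below is a minimal sketch in modern SPARK 2014 notation (the C-130J work used the earlier annotation-based SPARK, and all the names and thresholds here are invented for illustration): each row of a condition table becomes one case in the postcondition, and the proof tools show the body satisfies it for all inputs.

    -- Minimal sketch only: hypothetical names and thresholds, in SPARK 2014
    -- notation rather than the 1990s annotation-based SPARK used on C-130J.
    package Altitude_Alerts with SPARK_Mode is

       subtype Altitude_Feet is Integer range 0 .. 60_000;

       -- The postcondition transcribes a three-row Parnas condition table:
       -- exactly one condition on the inputs determines the result.
       function Alert_Level (Current, Cleared : Altitude_Feet) return Natural
         with Post =>
           (if Current <= Cleared          then Alert_Level'Result = 0
            elsif Current <= Cleared + 300 then Alert_Level'Result = 1
            else                                Alert_Level'Result = 2);

    end Altitude_Alerts;

    package body Altitude_Alerts with SPARK_Mode is

       function Alert_Level (Current, Cleared : Altitude_Feet) return Natural is
       begin
          if Current <= Cleared then
             return 0;
          elsif Current <= Cleared + 300 then
             return 1;
          else
             return 2;
          end if;
       end Alert_Level;

    end Altitude_Alerts;

Against a specification like this, today's GNATprove generates and discharges the verification conditions automatically; on the toolset of the era discussed, the SPARK Examiner and its proof tools played the corresponding role. This is what makes defects of the "code doesn't match the specification" kind so rare in proved SPARK code.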

When comparing defect rates, we need to be careful to define whether we mean only those functional defects that affect safety, all functional defects, or all defects including non-functional ones, and also whether we count defects found before entry into service or only those found afterwards.
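
To make the point concrete with purely invented numbers: suppose a 100 kLOC program had 400 recorded anomalies in total, of which 60 were functional defects, 5 affected safety of flight, and 50 were found after entry into service. The same program could then honestly be reported at any of the following rates:

    All anomalies, whole lifecycle:   400 / 100 kLOC = 4.0  per kLOC
    Functional defects only:           60 / 100 kLOC = 0.6  per kLOC
    Safety-related defects only:        5 / 100 kLOC = 0.05 per kLOC
    Post-delivery anomalies only:      50 / 100 kLOC = 0.5  per kLOC

A factor of 80 separates the largest and smallest figures, yet all four describe the same software.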

I think what was clear from the C-130J experience was that the (only) SPARK program exhibited a much lower defect rate than the average Ada program, which in turn exhibited a much lower defect rate than the average C program. Having said that, there was considerable variability between suppliers (the best C program had a lower defect rate than the average Ada program). As it happens, I helped conduct the static code analysis on the worst program (500 anomalies per kLOC, according to Andy German's article). The code didn't match the design, and the design didn't match the requirements! I understand that Lockheed cancelled the contract with that particular supplier and sourced an alternative part from another supplier.

Yours,
 
Dewi Daniels | Managing Director | Verocel Limited | Direct Dial +44 1225 718912 | Mobile +44 7968 837742 | Email ddaniels_at_xxxxxx
 
Verocel Limited is a company registered in England and Wales. Company number: 7407595. Registered office: Grangeside Business Support Centre, 129 Devizes Road, Hilperton, Trowbridge, United Kingdom BA14 7SZ

-----Original Message-----
From: systemsafety-bounces_at_xxxxxx [mailto:systemsafety-bounces_at_xxxxxx] On Behalf Of Peter Bernard Ladkin
Sent: 23 November 2014 05:19
To: systemsafety_at_xxxxxx
Subject: Re: [SystemSafety] Autonomous Vehicles and "Hacking" Threats

Dewi Daniels, who I understand did much of the code inspection on this project, is on this list.

For an unquantified list of what was discovered see slides 16-20 of http://www.rvs.uni-bielefeld.de/publications/Talks/LadkinIETSysSafe2012.pdf

PBL

On 2014-11-22 23:30, Martyn Thomas wrote:

> I think that his numbers are just the discovered anomalies. 
> 
> On 22 Nov 2014, at 22:03, Brent Kimberley <brent_kimberley_at_xxxxxx> wrote:
> 

>> How does Andy et al estimate the volume of undiscovered anomalies?
>>
>> On Saturday, November 22, 2014 10:47 AM, Martyn Thomas wrote:
>>
>> On 21/11/2014 16:38, Stefan Winter wrote:
>>> I had hoped for some better estimate of defect densities for the
>>> latter. The best approximation I had come up with so far is the
>>> product of "lines of code in a modern car" (100 million for a
>>> premium car in 2009) and "defect count per line of code in really
>>> critical software" (10^-4). I had taken these numbers from an IEEE
>>> Spectrum publication and a short paper from Gerard Holzmann, hoping
>>> that critical NASA software contains on average fewer bugs than
>>> common automotive code and that the calculation hence gives me a
>>> conservative estimate. If anyone has a better idea or wants to share
>>> more accurate numbers, please let me know. :-)
>>
>> Andy German's Crosstalk article
>> (http://www.crosstalkonline.org/storage/issue-archives/2003/200311/200311-0-Issue.pdf)
>> described the analysis of a range of software in a military aircraft.
>> He reported a range of "anomaly" densities, ranging from 4/KLoC to
>> 250/KLoC.

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Tel+msg +49 (0)521 880 7319 www.rvs.uni-bielefeld.de



The System Safety Mailing List
systemsafety_at_xxxxxx

Received on Tue Nov 25 2014 - 12:41:42 CET
