Re: [SystemSafety] How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?

From: Les Chambers < >
Date: Tue, 19 Apr 2016 09:58:19 +1000


... And furthermore

I'd like to point out an immutable fact: if an item of equipment depends on software to deliver safety functions, changing one line of code negates any previous claim of safety integrity proven in use. The Tesla Model S can receive upgrades in your garage overnight. You wake up in the morning and, hey presto, a new cool feature is available. From the Tesla site:
"Model S regularly receives over-the-air software updates that add new features and functionality. When an update is available, you'll be notified on the center display with an option to install immediately, or schedule the installation for a later time. The average software update takes 45 minutes to complete. Connect your Model S to your home's Wi-Fi network for the fastest possible download time."

How cool is that? The latest cool was delivered in version 7.1. It's worth quoting from Tesla's website:
"Last Fall, Tesla Version 7.0 software introduced a range of new Autopilot active safety and convenience features to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable. The release of Tesla Version 7.1 software continues our improvements to self-driving technology. This release expands Autopilot functionality and introduces the first iteration of Summon ..."

In Tesla's case, from a safety perspective, every time you receive an upgrade you have a new vehicle that has not been driven an inch and whose safety integrity cannot be claimed through a proven-in-use argument.
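To put a number on what that reset costs: under the usual constant-failure-rate model, demonstrating a dangerous-failure rate no worse than lambda per mile at confidence C requires about -ln(1-C)/lambda failure-free miles, and all of that exposure must be re-earned after every change to the code. A minimal sketch in Python (the function name and the one-failure-per-10^8-miles target are mine, purely illustrative, not anything Tesla claims):

    import math

    # Failure-free miles needed to claim a dangerous-failure rate no worse
    # than target_rate (per mile) at the given confidence, assuming a
    # constant failure rate and zero observed failures (exponential model).
    def failure_free_miles_required(target_rate, confidence):
        return -math.log(1.0 - confidence) / target_rate

    # Illustrative target: one dangerous failure per 10^8 miles, shown at
    # 95% confidence. Roughly 3 x 10^8 failure-free miles are needed.
    print(failure_free_miles_required(1e-8, 0.95))  # ~2.996e+08

    # Every over-the-air update makes the software new: the accumulated
    # evidence goes back to zero and the exposure must be earned again.
    miles_of_evidence_after_update = 0.0

On those numbers, no fleet could rebuild the evidence between updates arriving overnight.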

Any proven-in-use claim is therefore transparently fraudulent. Any claim that your safety has been enhanced is highly questionable, because it has not been validated in use. I'm not generally in favour of the Nanny State, but I do believe we should actively oppose and legislate against such grossness (the claim, I mean, not the vehicle).

In general I'm stoic about this development. It reminds me of my time working on strike jets. The best story I can offer is actually from the book Bogeys and Bandits, the story of a US Navy training squadron teaching new recruits to fly the F/A-18, ultimately from an aircraft carrier. The guy at the top of the class took off one day and noticed anomalous behaviour during the takeoff. He radioed the tower, and the technical people advised him to take it up to 20,000 feet and hit the reset button, which he did. The anomaly went away. No amount of testing or investigation could reproduce the behaviour or uncover its cause. They put it down to experience and moved on.
There was another guy on the team whose wife was beside herself with fear that her husband would be killed. After all, this is the most dangerous job in the world. Near the end of the training the squadron had a family day, where families were invited to come to the airbase to watch their husbands and fathers do their stuff. This lady parked her car and was walking across the car park as the previously anomalous jet was coming in to land. The anomaly re-manifested. The aircraft inverted and crashed into the ground literally in front of her, killing the pilot. She survived.

Then a strange thing happened. She was transformed by this experience and ceased to fear her husband's demise. Psychologists could knock themselves out analysing this; my simple theory is that she had a very strong dose of "what is". That is, a realisation, an insight into the true nature of the situation that lies before you. It is often associated not with facts but with conflicting truths, something we engineers find difficult to deal with. Yes, flying aircraft off a carrier is a dangerous job, but she also loved this man and could not leave him. There was no solution but to endure; now at least she understood the problem and could live with it.
Elon Musk hails from the "cool" school of Silicon Valley. If cool is the dark side of his nature and safety integrity the light, one can only wonder at the struggle that must go on. This gives rise to two conflicting truths:
1. Tesla will probably continue to provide world-beating functionality in the driving experience.
2. The public, knowingly or unknowingly, will accept a higher risk of dangerous vehicle failure in order to experience it.

I should point out, though, that the wonderfully brave fast movers driving the strike jets have signed up to risk their lives for their country. You have to ask yourself: in buying a Tesla, am I signing up to risk my life for a dump from one of Musk's black boxes? (Alas, poor PBL ... we knew him.) Personally I'm okay with this; at least PBL would have died choking on his own "cool" blood.

This is the "what is" of the new motor vehicle industry. So suck it up.

Cheers
Les

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces_at_xxxxxx] On Behalf Of Peter Bernard Ladkin
Sent: Tuesday, April 19, 2016 4:43 AM
To: systemsafety_at_xxxxxx
Subject: Re: [SystemSafety] How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?

On 2016-04-18 17:34, Derek M Jones wrote:
> This philosophical stuff is all very interesting. But...

Philosophy, my hat!

Consider: you are an ambulance-chasing lawyer, and Joe Blow has been killed by an autonomous road vehicle which implemented the above algorithm according to the regulations (which are law). Everyone's going after everyone else in court. You're so good that they've all asked you to represent them: the Government (the regulator), Joe Blow's family, the car manufacturer, the insurance companies representing all of those, and probably more.

Who do you pick as client and why?

PBL

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de


