Re: [SystemSafety] The VW Saga - 30 managers...

From: Peter Bernard Ladkin < >
Date: Fri, 16 Oct 2015 08:37:20 +0200



On 2015-10-16 02:38, Matthew Squair wrote:
> Well, 'mistakes were made but not by me' does seem to be the standard refrain in such
> circumstances. :)
>
> People do seem to have an amazing ability for self deception and self justification when
> dealing with situations of cognitive dissonance. That seems to be just how we roll as human
> beings.

All the usual human and organisational issues are raised by this event.

Diane Vaughan coined the term "normalised deviance" for the circumstance in which problems and issues are framed in such a way that decisions and actions seem reasonable which would not seem reasonable when framed in what an outsider would take to be the pertinent environment. I find it hard to phrase that quite right. The notion is relative, and "relative" doesn't quite work: maybe the outsiders are all wrong, and the "right" view is the one this characterisation would label the normalised deviance. The gladiator who refuses to slay his disarmed opponent, for example. Maybe someone else would like to have a go (at the definition, I mean, not the combat...)?

Consider a (paraphrased) version of the Challenger launch discussion. Decision-maker D: "Do you have concrete data which shows how this substance behaves at these temperatures and enables you to conclude that it will fail?" Engineer E: "No, we don't. [Our concerns are based on general engineering knowledge: that this compound is brittle at these temperatures; that it needs to be flexible, not brittle, to seal; that we have had problems with sealing at higher temperatures than this; and that if a seal fails it could be catastrophic.]"
D: "OK, you don't have that concrete data. The criterion is that a launch abort decision must be backed up by concrete data. The data is not there. The criterion is not fulfilled. Hence we launch."

Suppose E says, "this is nuts". And management M says, "we say it's not. We say you have a job to do and we define what that job is. It is not your role in this organisation to say it's nuts."

What does E do? After a certain amount of effort working inside the organisation to (from his/her point of view) remedy matters, E decides to go "outside" and tells people outside the organisation. E is likely in breach of contract and likely gets fired. (I just read that well over half the "whistleblowers" in GB in recent years have lost their jobs. They find it hard to get a new one, I imagine because no one wants to hire someone with a history of having breached a confidentiality agreement.)

There are two cases here.

Case 1. Suppose E is right. There is an obvious argument that in the Challenger case, E was right. But what if Challenger had not blown up, E had still said "this is nuts", and had still gone outside to get the decision-making discussed? In other words, everything the same, just no accident. If we buy Vaughan's exhaustive analysis, we could say he would have been justified. Justified is justified, whether Challenger blows up or not. But NASA would have said, "look, we have tens to hundreds of these concerns every launch. If we make decisions according to *your* criteria, no shuttle ever gets off the ground!" Is that wrong? I think it is a good question whether we might consider Vaughan's analysis so sympathetically had Challenger not blown up.

Case 2. Suppose the honest whistleblower is just wrong? I was peripherally involved in such a case a few years ago, when I was contacted by someone who had left a small, indeed fledgling, aerospace company with advanced technology because of his concerns over the safety assessment of a piece of kit that the company had been contracted to supply for a new commercial airliner. I couldn't understand his concerns at all. The piece of kit was not classified as safety-related, and in fact it was not safety-related (the system of which it was part had other mechanisms which ensured that malfunctions of the piece of kit were benign; I guess he didn't know enough about the general system to see that, but he also seemed resistant to being so informed). His going public caused a huge amount of worry for the small company. If that had been my company, I would have been worried about my existence: the end client (a couple of times removed) could have cancelled the contracts because of the unwanted adverse publicity, and I would thereby have lost my market.

There are books' worth of organisational issues arising from the VW case. Bruce Schneier thinks that the current SW business, and the rise of what is called the "Internet of Things", are going to make it easy to "cheat", and that the cheating will be hard to detect. He wonders what will become of us, and advocates open inspection, which involves making all SW open source, not just (in some sense) critical SW. https://www.schneier.com/crypto-gram/archives/2015/1015.html#1
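
To see how little it takes, here is a toy sketch in Python (entirely hypothetical; it is not VW's actual logic, and the sensor heuristic is my invention) of a "defeat device": a few lines that guess from sensor values whether the car is on a test stand and switch the exhaust treatment accordingly:

    # Hypothetical illustration of a "defeat device"; not VW's actual code.

    def looks_like_test_stand(speed_kmh: float, steering_angle_deg: float) -> bool:
        # Assumed heuristic: on a chassis dynamometer the driven wheels turn
        # while the steering wheel barely moves.
        return speed_kmh > 0 and abs(steering_angle_deg) < 1.0

    def exhaust_treatment(speed_kmh: float, steering_angle_deg: float) -> str:
        if looks_like_test_stand(speed_kmh, steering_angle_deg):
            return "full"      # clean mode: passes the emissions test
        return "reduced"       # road mode: better performance, higher NOx

    # Dyno run: full treatment. Ordinary driving: reduced.
    assert exhaust_treatment(50.0, 0.0) == "full"
    assert exhaust_treatment(50.0, 15.0) == "reduced"

A branch like this, buried among millions of lines of engine-control code and exercised only under test conditions, is exactly the kind of thing Schneier argues only open inspection has a chance of finding.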

PBL

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de




