Re: [SystemSafety] NYTimes: The Next Accident Awaits

From: Les Chambers < >
Date: Mon, 3 Feb 2014 11:03:12 +1000


Nancy

Could we drill down here? I think it would be instructive for many of us.

Could you please describe the attributes of a goal-oriented regulatory regime that distinguish it from a prescriptive regulatory regime?

The source of my confusion: on my case-study project, the hazard analysis identified safety requirements, and compliance with those safety requirements became the goal. This was prescribed by the standard, which was in turn prescribed by the rail authority. Some of the hazards we worked on were actually allocated to us by the rail authority (our system was part of a larger system: rail transportation). The level of risk tolerance was also well documented; I still have a mouse pad with the risk matrix printed on it. So you could say our hazard analysis and goal setting were well integrated with the customer's. I'm struggling to find fault with this approach.
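To make the discussion concrete, here is a minimal sketch (Python) of the kind of risk matrix I mean. The likelihood and severity bands and the tolerability classes below are invented for illustration; they are not the actual values from the project's matrix.

    # Illustrative risk matrix: maps (likelihood, severity) to a
    # tolerability class agreed with the customer. All values here are
    # assumptions for illustration only.
    LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]
    SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]

    # Tolerability class per cell:
    # R = intolerable, U = undesirable, T = tolerable, N = negligible.
    MATRIX = [
        # negligible  marginal  critical  catastrophic
        ["N",         "N",      "T",      "T"],   # improbable
        ["N",         "T",      "T",      "U"],   # remote
        ["T",         "T",      "U",      "R"],   # occasional
        ["T",         "U",      "R",      "R"],   # probable
        ["U",         "R",      "R",      "R"],   # frequent
    ]

    def tolerability(likelihood: str, severity: str) -> str:
        """Look up the tolerability class for a hazard's assessed risk."""
        return MATRIX[LIKELIHOOD.index(likelihood)][SEVERITY.index(severity)]

    print(tolerability("occasional", "catastrophic"))  # "R": risk must be reduced

The point being: once the matrix is agreed with the customer, tolerability decisions become mechanical and auditable.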

Further, I repeat my assertion that any regulatory regime is vulnerable to confirmation bias, in that it involves a developer making pronouncements and providing evidence to a regulator. Hopefully the evidence is based on fact, but facts can be selective. So could you please respond: how does a prescriptive regulatory regime reduce the probability of confirmation bias compared with a goal-oriented regulatory regime? Could you provide a real-world case study?

It's interesting to note that the computer security community is developing approaches very similar to those of functional safety (witness the emerging "bake security in" paradigm). They are a bit behind us, though. For example, Windows Vista was the first Microsoft operating system fully "touched" by their Security Development Lifecycle (SDL). One thing they have that we don't is a more insightful perspective on human weakness: they expect to deal with dishonesty as the normal case, whereas we trust but verify (and too often hope for the best). I was intrigued by one of their rules of thumb, the 10-80-10 rule: 10% of people will never do the crime, 10% of people view crime as their occupation, and the rest of us are opportunists.

Les  

From: Nancy Leveson [mailto:leveson.nancy8_at_xxxxxx]
Sent: Monday, February 3, 2014 9:01 AM
To: Les Chambers
Cc: Tracy White; systemsafety_at_xxxxxx
Subject: Re: [SystemSafety] NYTimes: The Next Accident Awaits

Writing a final report about what was done under a prescriptive regulatory regime is not what I or others are talking about when we talk about safety cases in a goal-oriented regulatory regime, where the regulatee determines how they will achieve the goal.

Let's not confuse the issue here. Writing a final report is done by everyone; it does not need a special name, nor does it require any special argumentation.

Nancy  

On Sun, Feb 2, 2014 at 5:57 PM, Les Chambers <les_at_xxxxxx wrote:

Hi Nancy

Like Tracy, I am having a problem understanding your opposition to safety cases. It may have to do with the way they are used in different countries and application domains. My experience is in rail: I wrote the safety case for an environmental control system in an underground Asian rail network. The project's compliance requirement was CENELEC EN 50128 (still in draft form at the time), and I used the safety case format specified in EN 50129. In the rail environment the safety case was only one component of the overall safety program. The CENELEC standards are highly prescriptive of engineering process, the level of "ceremony" required being a function of the SIL. I am assuming this is what you mean by "prescriptive regulation". So on this project we had both a safety case and prescriptive regulation; in fact, the safety case was part of the prescriptive regulation: it was prescribed that we create one, and the job fell to me (for my sins).

The thrust of the safety case was as follows:

  1. Our initial safety plan indicated that we would do these things ...
  2. This is what we actually did ...
  3. Here is the evidence ...

If you like, the safety case was a wrap-up of the safety program. Most of the detail wasn't in the safety case document itself; it referred to the hazard log, which was the core safety artefact on the project. Hazards were identified at commencement and throughout the project; corrective action was taken to reduce risk; evidence that the corrective action had been taken was recorded; and at the end of the project someone (me) went through the hazard log and verified that every hazard had been closed out (that is, that evidence existed that corrective action had been taken). The project had 50 people and ran for three years.
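To make that concrete, here is a minimal sketch (Python) of the close-out check I describe above. The field names and structure are invented for illustration; they are not the actual schema of the project's hazard log.

    # Illustrative hazard-log close-out check: a hazard only counts as
    # closed out if corrective action was recorded, evidence that the
    # action was taken exists, and the entry was formally closed.
    from dataclasses import dataclass, field

    @dataclass
    class Hazard:
        hazard_id: str
        description: str
        corrective_action: str = ""  # what was done to reduce the risk
        evidence_refs: list[str] = field(default_factory=list)  # e.g. test reports, review minutes
        closed: bool = False

    def open_hazards(log: list[Hazard]) -> list[Hazard]:
        """Hazards that cannot be counted as closed out."""
        return [h for h in log
                if not (h.closed and h.corrective_action and h.evidence_refs)]

    log = [
        Hazard("HAZ-001", "Loss of tunnel ventilation", "Redundant fan groups",
               ["TEST-117", "FMEA-09"], closed=True),
        Hazard("HAZ-002", "Smoke damper fails shut", "Fail-open actuator"),
    ]
    for h in open_hazards(log):
        print(f"{h.hazard_id} not closed out: {h.description}")

Any hazard the check flags either lacks a recorded corrective action, lacks evidence that the action was taken, or was never formally closed.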

The safety case was a very clean way of communicating with the rail authority; they were pleased, and astounded, that we went to the lengths we did. It was also a neat way of wrapping up the overall safety program and making sure we had all the bases covered. Over three years, with people coming and going, it is easy to lose track of where you're up to with a safety program.

Re: your comments on confirmation bias, I couldn't agree more. It's a natural human failing, and for that reason it's prevalent in all regulatory frameworks, not just safety case creation. My concept of prescriptive regulation is that you require developers to follow various processes and to produce evidence that the work has been done. This is a very coarse-grained way of gaining visibility into the real workings of a project: the fact that a review was run, or a test performed, and beautiful paperwork created to record the event often has no bearing on the effectiveness of the review or on the visibility the test gave into the real quality of the work product. All regulation does is make sure that the developer is at least going through the motions. For this reason, if there happens to be a devil in the details, it is highly unlikely that a regulator will find it, no matter what they do.

My personal view is that the job of a regulator is to find gross departures from recognised best practice and put them right. If regulators concentrated on just that, the world would be a better place. Examples of gross behaviour falling through the regulatory net are many:

Fukushima: books were written on the risk of the backup generators being flooded by a tsunami before the tsunami hit.

Air France: don't train your pilots to fly with no airspeed indication.

Road tunnels: open the tunnel before you've finished testing the safety systems (an Australian favourite I've personally experienced).

Deepwater Horizon: why didn't you have an easily deployable recovery plan for a combined riser-pipe and valve failure on the seabed? (That would be a simple question for a regulator to ask, wouldn't it?)

Taiwan high-speed rail: ignore conditions of contract relating to safety compliance and do what you like (another company, not mine).

The problem of visibility gets worse as projects become larger and more complex. I worked on the Taiwan high-speed rail project, where we were five companies down the food chain from the rail authority: a subcontractor of a subcontractor of a subcontractor of a subcontractor of a prime contractor. The only way to handle situations like these is to get contractors to demonstrate as much commitment to safety as possible, and that can only be achieved through the act of communicating in writing, through safety cases and other project artefacts. Beyond that, I am in furious agreement with the comment on SUBSAFE: it comes down to the culture of the company and the people they employ. If they are not producing paperwork, they probably don't have the culture. If they are producing the paperwork and it looks bad, they don't have the people. That should be when the alarm goes off and the regulator moves in. The fact that this doesn't happen often (especially in banking) means we've got a long way to go with the very basics (forget about the minute details of whether or not a design is safe).

I had an extreme experience recently that refocused me on this people issue. I joined ten guys and one lady on a 51-foot yacht for a short sail across the Atlantic (2,970 NM). The highly effective teamwork that materialised almost instantly among a bunch of people who did not know each other was beautiful to behold (my blog post entitled "Atlantic 13: Professional yachtsman and meddling bastards" is in the works; the skipper gave me the title, and that's how he reviews engineers). The motivation for teamwork had something to do with the obvious consequences that would flow from bad actions or inaction (shades of SUBSAFE).

My conclusion: somehow we've got to find better ways to reconnect the non-safety community with nature, to help them draw the obvious line from a bad helmsman, a big sea, a 50-knot gust, a crash jibe, a ripple of energy up the mast, a crack at the spreaders, and tons of aluminium and flogging sails hitting the deck, to a bunch of very sad (and potentially injured) people clinging to a drifting boat, dead in the water, in the middle of a very large ocean.

Until then - maintain the rage.  

Cheers

Les  

From: systemsafety-bounces_at_xxxxxx [mailto:systemsafety-bounces_at_xxxxxx] On Behalf Of Nancy Leveson
Sent: Monday, February 3, 2014 5:52 AM
To: Tracy White
Cc: systemsafety_at_xxxxxx

Subject: Re: [SystemSafety] NYTimes: The Next Accident Awaits  

Tracy White wrote:

   "I am slightly confused and a little perturbed by an argument that a 'safety case' in someway replaces any regulatory control (or government interference)."  

I haven't seen anyone on this list say that.  

Nancy  

On Sat, Feb 1, 2014 at 7:39 PM, Tracy White <tracy.white_at_xxxxxx wrote:

I am slightly confused and a little perturbed by an argument that a 'safety case' in some way replaces any regulatory control (or government interference), and even more by the suggestion that a safety case would not include a subclaim to have conducted a 'rigorous hazard analysis' program, or to have applied appropriate 'procedures and standards'.

Anybody who thinks that 'safety cases' in any way replace some form of regulation is ignorant of their purpose. I work in a regulatory environment, and the 'safety case' is the primary communications medium with that regulator; elements of it will speak to hazard identification and to compliance with the standards and codes considered representative of engineering 'good practice'. I would agree that there are good and bad safety cases, and I think that industries that do not have 'a good historical culture in terms of safety' are as ignorant of the purpose of safety cases as they are of the need for safety in general.

Regards, Tracy  

On 01/02/2014, at 12:48 AM, Nancy Leveson wrote:  

It is very difficult to characterize the U.S. In general, the country is so physically large that there are extreme differences in culture and politics (generally, but not always, physically bounded). Much of the central government in the US, and in Europe, seems to be moving toward libertarianism, though I am probably mischaracterizing Europe based on biased news reports. The individual U.S. states show extreme differences: at the extremes, Texas and California may as well be in different worlds, let alone different countries, when it comes to safety regulations (and lots of other things irrelevant to this list). There are also such different cultures in different industries that it is difficult to make general statements; mining and civil aviation are examples of such extremes.

But I will make one general statement that reflects only my personal experience. Because of my paper arguing against safety cases, I am getting many calls from government employees and company lawyers, as well as individual engineers. Some of the companies pushing the "safety case" in the U.S. are those who don't want any government interference and who see the safety case as a way to get around the rigorous procedural standards that now exist here in many industries. They seem to feel that they will be able to get rid of the procedures and standards that exist now, write anything they want in a safety case, and thereby save the money and time of the rigorous hazard analysis now widely required, while using any design features they want. These are primarily industries that do not have a good historical safety culture.

Nancy

On Fri, Jan 31, 2014 at 4:08 AM, RICQUE Bertrand (SAGEM DEFENSE SECURITE) <bertrand.ricque_at_xxxxxx wrote:

Hi Nancy,  

Concerning France you are right, and in that case I think the cultural aspect dominates. There is no safety culture in the population as there is in the UK, as was acknowledged after the AZF accident. The risk stops at the fence of the plant, and you can safely build your house on the other side … The regulations have changed since, but not the culture. The safety engineers concerned by the new regulations live a nightmare, as the choices are, more or less, dismantle the plant versus dismantle the town … I think the safety culture has more impact on the final result than the competence of the safety community.

Bertrand Ricque

Program Manager

Optronics and Defence Division

Sights Program

Mob : +33 6 87 47 84 64

Tel : +33 1 59 11 96 82

Bertrand.ricque_at_xxxxxx      

From: systemsafety-bounces_at_xxxxxx [mailto:systemsafety-bounces_at_xxxxxx] On Behalf Of Nancy Leveson
Sent: Thursday, January 30, 2014 8:59 PM
To: systemsafety_at_xxxxxx
Subject: Re: [SystemSafety] NYTimes: The Next Accident Awaits

It would be nice to actually introduce some data into the discussions on this list. First, although it is very true that the U.K. has excellent comparative occupational safety statistics, this exceptional performance predated safety cases by at least 100 years and is as much a cultural artifact of the U.K. as of any current practices. While the rest of the world was suffering the results of steam engine explosions in the late 1800s, for example, Great Britain was the first to implement measures to reduce them. (I wrote a paper on this once, if anyone is interested.) Although the British citizens on this list know more about the history of the UK HSE, I believe the UK was the first country to require companies to have safety policies, etc., after the Flixborough explosion. Safety cases, I believe, came into being only after the more recent Piper Alpha explosion.

Trying to tie accident rates in different countries to particular ways of regulating safety is dicey at best. First, there are significant differences between countries' engineering, agricultural, industrial, and service accident rates, often related to technical differences: some have high agricultural accident rates but low service accident rates, and accident rates will be very different in a country with high-tech agricultural techniques compared to one still plowing fields with a pair of oxen. Politics plays an even more important role. Western countries often put very dangerous processes and plants in third-world countries, or governments in those countries do not have laws requiring manufacturers to use even minimal safety practices, and they will not as long as they need the revenue and jobs. The safety culture in these countries will not change magically by adopting one type of regulatory regime.

Note also that there are vast differences between industries. Those with the very safest records, such as the U.S. SUBSAFE program, do not use safety cases. (And they have managed an incredible safety record despite being in the U.S. :-)) If we want to compare the effectiveness of different regulatory regimes, then we need scientific evaluations, not misused statistics (which may reflect factors that have nothing to do with the actual regulatory regime).

Also, as Michael Holloway noted, cultural differences will make different types of regulation more or less effective in different countries and industries.

Finally, I would like to point out to those who are making national comparisons and putting down the U.S. in comparison with France, for example, that the fatal occupational accident rate in the U.S. is lower than that of France. Perhaps we can avoid mixing politics and chauvinism with science on this list.

Nancy  

On Thu, Jan 30, 2014 at 8:50 AM, Martyn Thomas <martyn_at_xxxxxx wrote:

I'm a non-exec Director at the UK's Health and Safety Laboratory (www.hsl.gov.uk). We carry out the basic research that underpins the UK's regulation of occupational health and safety, ranging from reducing accidents on construction sites and improving the tethering of loads on lorries, through to reproducing and analysing major explosions (such as Buncefield: http://www.buncefieldinvestigation.gov.uk/) and destruction-testing the physical integrity of tankers and rolling stock.

We also undertake commercial work that uses our unusual experimental and analysis capabilities and very strong science base.

The UK is unusual in having a goal-based, safety-case regulatory regime and a regulator (HSE) with its own expert research establishment (HSL). We are receiving an increasing number of approaches from governments in the Far East and Middle East who have seen the UK's good performance in occupational health and safety and want to investigate setting up similar goal-based regulation.

Maybe there is something in the HSE/HSL approach that the US chemical industry could benefit from.

Regards

Martyn
Martyn Thomas CBE FREng

On 29/01/2014 22:05, Peter Bernard Ladkin wrote:

A worthy opinion piece from the Chair of the US Chemical Safety Board. Note his suggestion that identifying hazards and mitigations is just well-established best practice. I can say from experience that it is not yet so in all European industries with safety aspects, even though he holds Europe up as having a factor of three fewer chemical accidents than the US.



-- 
Prof. Nancy Leveson
Aeronautics and Astronautics and Engineering Systems
MIT, Room 33-334
77 Massachusetts Ave.
Cambridge, MA 02142

Telephone: 617-258-0505
Email: leveson_at_xxxxxx
URL: http://sunnyday.mit.edu


_______________________________________________
The System Safety Mailing List
systemsafety_at_xxxxxx
Received on Mon Feb 03 2014 - 02:03:37 CET

This archive was generated by hypermail 2.3.0 : Tue Jun 04 2019 - 21:17:06 CEST