Re: [SystemSafety] Logic

From: Philip Koopman < >
Date: Sun, 16 Feb 2014 12:18:56 -0500

I'm all for software that actually works going into products via whatever approaches are effective (mathematics, peer reviews, etc. -- they can all play an important role). While I haven't spent a lot of time trying to get formal methods adopted, I have spent a lot of time trying to get organizations to do peer reviews/inspections as a baby step toward crawling out of the muck of chaotic software development practices.

What I've found is that they often just can't bring themselves to put emphasis on an activity that does not directly contribute to the specify=>implement=>test path. (Sometimes they even skip "specify" so they can get right down to the business of writing buggy code, but that's another story.) My speculation is that higher-level managers assign value to writing code and to testing activities, and essentially none to defect prevention. These higher-level managers seldom have training in software engineering (or even computer science/engineering).

From what I've seen, our students act the same way. It is all about getting the code written, "tested," and slipped past the grading gatekeeper, however messy that process is. Essentially no thought or value is placed on avoiding defects in the first place. This approach appears to have been trained into them in intro programming courses.

I run a senior/MS course that pushes students through a project that is difficult to survive unless you practice bug prevention. Most of them get the message and are running effective peer reviews by the end of the course. I fancy that by the time they've completed the course and learned the lessons, they'd be ready to adopt more formal practices (but not before -- they are skeptical even of peer reviews for several weeks). I touch briefly on formal methods, but the math is more than I can squeeze into my course on top of everything else I need to cover. Perhaps if this sort of experience happened early in their education instead of at the end it would help motivate them to learn and practice the right mathematical skills, and they'd be eager to take a course on that topic.

But to effect change, IMHO we first have to convince our non-software-engineering/non-safety-critical colleagues that this is something worth doing. I've never had much success doing that. Part of it is probably that as researchers we mostly specialize in throw-away, non-critical code. It's tough to convince someone that teaching a topic is important if they've never found it important themselves.

(BTW, the argument that thesis code quality isn't a big deal is less compelling than some think. Maybe the student won't die if the code has a bug. But the student's experimental results might well be wrong due to bugs! I've seldom seen anyone take that issue seriously.)

On 2/16/2014 11:58 AM, John Knight wrote:
> Peter,
>
>> obviously I agree with much of what you say. But I am discussing with people who believe that we
>> constitute an exception to much of it.
>
> I think we are talking about different things. Research projects need
> software rapid prototypes to support investigation in areas such as AI
> and robotics. These are "throw-away" prototypes that should never
> make it into production and usually don't.
>
> I am talking about software products that are part of engineered
> computer systems which will subject others (possibly the general
> public) to risk. Higher education has a responsibility to prepare
> professional engineers to perform that engineering. That education
> needs to make it clear that:
>
> * Engineers are responsible for what they do.
> * Engineering is a profession, not some amateur activity.
> * Mathematics is an essential component of professional computer
> engineering.
>
> In response to the comment from Les Chambers:
>
> "We must find a way to bring formal methods out of the lab and into
> general use."
>
> I generally agree. But I note that we already have industrial-strength
> systems such as SPARK Ada, industrial-scale uses of such systems such
> as the NATS iFACTS system, and substantial evidence from Peter Amey
> and his colleagues that applying such technology is cheaper and better
> than the informal alternatives.
>
> -- John
>
>
>
> _______________________________________________
> The System Safety Mailing List
> systemsafety_at_xxxxxx

-- 
Phil Koopman -- koopman_at_xxxxxx

Received on Sun Feb 16 2014 - 19:53:42 CET
