
Compliance is often the most basic place to start working on safety. Rules are needed to set some basic standards of what society or the organisation sees as acceptable. Rules are also needed to weed out the really bad organisations, by fining them or even shutting down their business.

There is a lot to say about compliance, as you can see from this introduction (if you want more, it is discussed among others in Myth 41 of my book), but right here, right now I’d like to focus on just one thing: many people seem to live with the clear misunderstanding that compliance is the same as safety. It is not.

What few people seem to realise is that rules are almost always compromises between different agendas. Keep in mind, too, that no rule is applicable in each and every situation. Rules typically deal with common situations, specifying one best way to handle them. The real world is much messier than that, alas, and will throw specific situations right in your face where the rule does not fit so well.

History has taught us that even if you comply with the rules, an accident may still occur. Sometimes the combination of several individually acceptable factors (especially if they border the threshold of acceptance) can cause an accident. Erik Hollnagel calls this functional resonance: all factors are within acceptable limits, yet their unfortunate combination leads to an accident.

The other day, my former colleague John Awater told me a fabulous way to illustrate the issue. The example below was inspired by this:

Say your system consists of four different elements that work together. Each of them is to uphold high standards. The norm says that each is to operate at a minimum of 75%. The system as a whole will be clearly unsafe when it drops below 50%.

Say that the individual components are functioning really well, at 80%, 80%, 85% and 85%, so not even close to the threshold. The total effectiveness of the system, however, is no better than 0.8 × 0.8 × 0.85 × 0.85. That comes to about 0.46, well below our safety threshold of 0.5!
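For the playfully inclined, the toy model above can be sketched in a few lines of code. This is only a sketch of the multiplicative assumption in the example, not a real safety calculation; the component values and thresholds are the ones from the example.

```python
# Toy model: four components, each comfortably above the 75% compliance norm.
components = [0.80, 0.80, 0.85, 0.85]
COMPLIANCE_NORM = 0.75   # each part must score at least this
SAFETY_THRESHOLD = 0.50  # the whole system is clearly unsafe below this

# Each part individually passes the compliance check...
assert all(c >= COMPLIANCE_NORM for c in components)

# ...but if system effectiveness is (roughly) the product of its parts,
# the whole drops below the safety threshold.
system = 1.0
for c in components:
    system *= c

print(round(system, 4))           # 0.4624
print(system < SAFETY_THRESHOLD)  # True: compliant parts, unsafe whole
```

The point of the sketch is the same as the prose: a check applied to each part in isolation says nothing about what the parts do in combination.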

This is of course not how things REALLY work (if only because it is hard to put simple numbers on something so complex), and just a gross approximation, but it is a nice little model to show why compliance can fail.

It is also a good illustration of why reductionism is often not helpful. In general, it is not the separate parts that create safety, but the parts in interaction. So (coming back to last week’s post) why so much focus on behaviour?


Also published on Linkedin.