In the December issue of NVVK Info, a review of James Reason's latest (last?) book was published:
English translation below:
There is no need to introduce professor James Reason to safety professionals. At least, I hope I don’t need to, and I certainly hope that everyone has read at least something by him (preferably either his seminal “Human Error” or “Managing The Risks Of Organizational Accidents”). Here is his latest, thinnest (only 128 pages) and probably last book on human error. It more or less sums up his work since the 1970s and traces its development to the present day, mostly with new examples.
The book is more a personal retrospective on the things that have shaped Reason's thinking and guided his work through the years than a technical or scientific text - for that, one should turn to his other books or papers. Some chapters even include a ‘further reading’ section to help the reader on the way.
The book thus starts in the early 1970s with a funny incident involving cat food and a teapot, which prompted the young James Reason to move into the field of human error and gave him the insight that much of our behavior is under the control of the immediate environment, leading to the (unintended) activation of action sequences that may or may not be appropriate to the circumstances. Much of this happens unconsciously.
Reason then goes on to discuss plans, actions and consequences, noting that errors (often seen as deviations from plans) aren’t necessarily bad, but that their unwanted outcomes often earn them the label ‘undesirable’.
An important element of Reason’s work is of course the connection to Rasmussen’s three performance levels (skill-based, rule-based and knowledge-based) and the way he distinguished three different error types corresponding to those performance levels.
Chapters 4 to 7 deal with absentminded slips and lapses, which are typically errors made in situations that we master (this may be counter-intuitive, but think about it). One issue with these absentminded errors is that humans simply cannot do everything consciously all of the time - we would get nowhere and the results would probably be disastrous (try descending a staircase consciously, if you dare, for proof). Quite funny is the case in chapter 6 where Reason was asked to act as an expert witness in a case of alleged shoplifting in which the accused claimed to have absentmindedly overlooked paying for some items. Chapter 7 then deals with a special case of absentminded slips: Freudian slips. Another amusing side-track, with a more down-to-earth conclusion than the one Sigmund reached.
After this, Reason returns to planning errors, starting with a discussion of the major components of the planning process and the various sources of bias that lead to planning failures. Interesting, but alas all too brief. The same goes for the failures related to collective planning, which include phenomena like satisficing, self-serving biases and groupthink.
No book on human error without a discussion of violations, here found in chapter 9. Reason stresses that, in general, ‘violators’ don’t intend the bad outcomes that may result from deviating from a rule. To this end he spends some space on the ‘economics of violations’ - regrettably a little-understood mechanism that is often dealt with in the wrong way (i.e. by more control and harsher penalties instead of, for example, increasing the benefits of compliance).
The next chapter deals with organizational accidents, a term coined by Reason. It’s the briefest of synopses of his seminal 1997 book (only skimming Swiss Cheese here), and I recommend reading that one anyway. This leads into a chapter on safety culture, a subject also discussed in the 1997 book. What’s interesting here is the focus on factors that resist change. A truly important discussion, a good addition to the 1997 book, and probably the chapter to read if you have only a little time (and in that case, make sure to read the brief postscript too!).
Chapters 12 and 13 deal with medical error, the field in which Reason has been most active recently. An important factor here is the legal aspect. The entire 13th chapter deals with the pros and cons of disclosing error. This is focused on medical error, of course, but holds lessons for other domains (although these are hardly discussed).
The final chapter sums up the most important points from the book and adds two lessons for those wanting to explore error: 1) learn as much as possible about how people work in their domains, and 2) never be judgemental.
Those who have read Reason’s previous books will find little that is really new here, but it may serve as a fine refresher. Personally, I somewhat miss the lessons from his previous book (“The Human Contribution”), with the human more in the role of the hero who saves the day rather than the source of error. But who knows… maybe there is another short book from professor Reason in the pipeline after all?
N.B. For those wondering whether the editorial errors are intentional (and if so, why they are not explained or commented upon): according to Ashgate they aren’t. The wrong references to chapter numbers (on most, or all, occasions), and the rather embarrassing mistake that slipped into the David and Goliath story, will presumably be fixed in a new edition...