
This is a summary of one of the holy grails of safety literature. Copies of the original book are almost impossible to find, or at least it is near-impossible to find one that you would actually want to pay the price for.

Enjoy. It’s a fascinating book that was clearly ahead of its time in many ways!

1978, Wykeham Publications, ISBN 0 85109 750 2 (first edition)

Chapter 1: Need to understand

This book intends to offer a way of formulating general rules and principles about the emergence of disasters, derived from examination of evidence available from past disasters and large accidents. (p.3)

Turner is looking for general principles which may help to understand the creation of disasters, not purely technical causes. It is better to think of the problem of understanding disasters as a socio-technical problem. Disasters arise from an absence of some kind of knowledge at some point. Understanding some aspects of disaster can be gained by discovering how knowledge and information related to the events that provoke disasters are distributed. (p.3)

The scale and likelihood of disasters in modern times are affected by energy accumulation and environmental disturbances, and there are administrative influences as well. Page 4 discusses the inter-relationship of power and officially approved knowledge. “Disasters always represent failures of intention” and are outcomes of misdirected energy.

In our rational Western society, the occurrence of a disaster indicates that there has been a failure of the rational mode of thought and action that is relied upon to control the world. (p.5) Therefore we must examine the rational model. Influence of organisational decision making has grown over the past centuries. There is a centralisation towards the ‘large players’, but these are to a degree restrained by their environment (which also changes).

Consequences of false assumptions (p.6): high quality ‘intelligence’ is necessary for correct action.

Chapter 2: Attempts to understand disaster (1)

Starts with a general discussion of attempts to study, list and classify disasters, including the comment that looking at the number of fatalities is of little use, because the number of fatalities in disasters is very much a function of population density. The area to study is the period before the disaster breaks: preconditions, which may be technical, social, administrative or psychological. (p.17)

One possible approach to understanding is found in the investigation of possible technical failures by engineers. Failures in this sense have to be measured against the goals and functions of the technical system. A disaster (failure) means that things don’t continue as normal. This notion can also be used for non-technical failures. Some systems have a built-in degree of forgiveness and compensate for failures, so that the whole still functions as intended.

Engineers often work with limited knowledge of conditions, properties and the like. Many failures arise from designs that extend beyond the knowledge or experience of the designer. There is a lot of trial and error and learning from experience.

Page 24: “Many areas of design, especially the assessment of risks, cannot avoid the reliance upon human judgement. The natural sciences may radiate an impression of distilled rationality, but even if science were like this, in all of the practical and concrete applications of such science, it would have to become a part of the engineering design process, and thus become subject to all the restrictions and limitations outlined above”.

The next page then tells about tacit knowledge.

"Engineering, applied science and design generally are very much human activities in which aspirations towards an ideal of rationality have to be tempered by the limitations both of human abilities and of the ‘state of the art’.”

Paragraph 2.4 is about accident studies and the management of (industrial/occupational) safety. Much attention is paid to the pre-accident phase, but it is mainly focused on smaller accidents and near misses, with psychological causal models and human error.

Chapter 3: Attempts to understand disaster (2)

This chapter looks at previous studies of disasters, most of which have been mainly concerned with the event itself, or its aftermath, including recovery. Carr (1932) made the interesting distinction between ‘instantaneous’ and ‘progressive’ disasters and whether disasters were ‘focalised’ or ‘diffused’ with regard to the area of impact. 

An interesting comment is that one study speculates on the possibility that society may need disaster to “vent internal pressures and transform anxious dread into a mass target-phobia”.

In post-War studies on disaster, much attention has been spent on warnings and their efficacy. Chapter 3.2 deals with discussion of threats and warnings. It is here that Turner starts working towards the importance of information (e.g. a threat can be treated as a piece of information that is reacted upon sufficiently, or not, because of information overload, distortion, etc.). One important factor can be how authorities choose to issue warnings (p.44-46). Warnings should be accurate, unambiguous and reliable. This may be different, however, when the predicted danger has not been experienced for a long time and the signals of danger cannot be perceived directly.

Chapter 4: Three Disasters Analyzed

To explore what previous studies have not (the pre-disaster phase, common patterns in the build-up towards disaster), Turner looks at three disasters with good government documentation. First, he discusses ‘variable disjunction of information’: a (complex) situation in which a number of parties handling a problem are unable to obtain precisely the same information about the problem, so that many differing interpretations of the situation exist (p.50). This concept is further discussed in chapter 4.2 before 4.3 takes on the three disasters: the 1966 Aberfan rubbish tip slide, the 1968 Hixon level crossing accident and the 1973 Douglas, Isle of Man Summerland fire. (A minor point of criticism: Turner might have spent some more time explaining the accidents, because I found his discussion of Aberfan confusing - especially if you have no previous knowledge about the case, which I luckily had).

While the accidents are very different, there are similarities between the three cases:

  1. Rigidities in perception and beliefs in organisational settings: the accurate perception of the possibility of disaster is inhibited by cultural and institutional factors. Perceptions are not merely individual failures, but may be created, structured and reinforced by the set of institutional, cultural or sub-cultural beliefs and their associated practices. 
    “Part of the effectiveness of organisations lies in the way in which they are able to bring together large numbers of people and imbue them for a sufficient time with a sufficient similarity of approach, outlook, and priorities to enable them to achieve collective, sustained responses which would be impossible if a group of unorganised individuals were to face the same problem. However, this very property also brings with it the dangers of a collective blindness to important issues, the danger that some vital factors may be left outside the bounds of organisational perception” (p.58) (there are indeed some parallels to Vaughan’s points about structural secrecy and culture of production!)
  2. The Decoy Problem: attention is paid to some well-defined problem or source of danger. This is dealt with, but it distracts attention from still dangerous, but ill-structured problems in the background. It may be an “institutionalized response” or that when “a new problem becomes clearly defined, the possibility arises that it may be obscuring some further problem. A way of seeing is always also a way of not seeing”.
  3. Organisational exclusivity: disregard of non-members. An attitude that those within the organisation know better about the hazards than outsiders. Often they do, but they can also have blind spots…
  4. Information difficulties are often associated with ill-structured problems. Information difficulties happen all the time and often they don’t lead to disaster. There is a great variety of such problems, and their relation to the possible emergence of disaster will be explored throughout the book. Problems are handled by many actors, each with his own ‘theory’ about the nature of the situation; since the problem is ill-structured, responsibility for handling it is often distributed in a vague and unclear manner. The actors all have their own views of what the information is and how it is likely to affect the total situation, and they generate their own rules of action from these views. There are interesting comments on page 63 about idealistic views of top management that are unlikely to be followed up, except in case of disaster. Information can be lost in transmission or not be reacted upon for a number of reasons.
  5. The involvement of strangers, especially on complex ‘sites’. One problem is that information only circulates within some of the groups involved, or is distributed incompletely. Strangers are difficult to define as a group. Besides, many of this amorphous group will never need the information or will not know that they need it at the time. Brilliant quote on page 69: “Human ingenuity is endlessly resourceful in finding ways of manipulating the objects in a concrete situation in a manner unforeseen by the designers of one abstract aspect of that situation.” Likewise, designs will almost always have unintended and unpredicted sides; safety features, for example, may have unwanted side-effects, such as an emergency exit that also gives access to intruders.
  6. Failure to comply with regulations. Including use of waivers, compensatory measures to restore standards of safety and controls of controls (or their failure).
  7. Minimizing emergent danger - failing to see or fully appreciate the magnitude of the danger. Among other things through underestimating possible hazards, undervaluing emerging danger, changed awareness of dangers and failure to call for help.
  8. Nature of the post-disaster recommendations: they are concerned to deal with the problem which caused the disaster as it is now revealed, and not to deal with the problem as it presented itself to those involved in it before the disaster. (p.74) From the Summerland report: “Not every failure which is obvious now, would be obvious before the disaster”.

Chapter 4.5 presents a list of factors that may combine to produce disaster, and it sums up the first lessons from the analysis so far.

Chapter 5: The Incubation of Disasters

An expanded examination of ten more disasters confirmed the findings presented above.

5.1 discusses definitions of disaster. Turner chooses a definition of a limited kind of disaster that identifies events that are relevant to the sociological study of causes and preconditions:

An event, concentrated in time and space, which threatens a society or a relatively self-sufficient subdivision of a society with major unwanted consequences as a result of the collapse of precautions that had hitherto been culturally accepted as adequate.

5.2 shows a developmental sequence of six stages (p.85):

  1. Notionally, reasonably normal state: Initially culturally accepted beliefs about the world and hazards. Precautionary norms in laws, codes of practice or traditional folkways.
  2. Incubation period: accumulation of an unnoticed set of events at odds with accepted beliefs about hazards and the norms to control them. Although there may be dissident beliefs…
  3. Precipitating event. Shit hits fan. Transforms general perception. Surprise. Cultural disruption.
  4. Onset. Consequences become apparent. Collapse of cultural precautions.
  5. Rescue and salvage.
  6. Full cultural re-adjustment. Investigation. Beliefs and precautionary norms are adjusted to fit newly gained understanding of the world (“this must never happen again”).

This is then explained with help of one of the studied cases. The chapter concludes with a suggestion to learn from weak signals: “It may be fruitful to look at the circumstances of ‘near-miss’ disasters as a source of comparative data.”

Chapter 6: Errors and Communication Difficulties

The incubation period is the most important phase in this study. Sometimes disasters have their origin in the phase before, when existing norms are violated. But norms can also be violated because they are perceived as inadequate or out-of-date.

There are four main types of error that develop during the incubation period:

  1. Events go unnoticed or misunderstood because of erroneous assumptions. Including a comment on page 101 (one of many in the book) that is truly ahead of its time: “Each organisational unit or sub-unit will have developed its own distinctive sub-culture and its own version of rationality”.
  2. Events go unnoticed or are misunderstood because of difficulties in handling information in complex situations - especially when dealing with ‘strangers’, or when tasks are vague, hasty, large-scale, etc.
  3. Effective violations of precautions passing unnoticed because of cultural lag in existing precautions.
  4. Events go unnoticed or are misunderstood because of a reluctance to fear the worst possible outcome.

Additionally, there can be issues with boundaries and communication networks. It is important to look not only at the total available information, but also at its distribution, at the structures and the communication networks within which it is located, and at the boundaries that impede its flow (again, a parallel to Vaughan’s structural secrecy). Seven cases illustrate this.

Chapter 7: Order and Disruptions

Accidents and disasters are measured in terms of the disruption of an order which was intended or at least anticipated in the future. Living creatures act purposefully, whether with an explicit goal or not. It is said that living matter strives towards ‘negentropy’ - the opposite of entropy.

It’s not enough to have the intention and act purposefully. People never act in a vacuum: we must achieve our goals in the environment we find ourselves in, and success and failure depend to a great extent upon the degree to which our environment is congenial to our goals (p.128).

We don’t need to have a perfect or complete knowledge of the environment to be able to act in it, and to move towards our desired goals. All we need is knowledge that is accurate enough, although we can never say with absolute certainty that we know enough until after the event. Instead of perfect knowledge, humans operate with a variety of maps and models that organise available knowledge about the environment and direct the collection of new knowledge. The problem of determining just what level of accuracy of knowledge is required for goal attainment (or survival) is one which can never be finally solved. All maps and models and other forms of organising information have assumptions built into them about the relevant portions of the environment which must be attended to, and about the portions which must be ignored. (Major) shifts in the environment may render existing maps useless, and it is the resulting discrepancies between the way the world is believed to be and the way it really is that contain the seeds of disaster.

(These passages from pages 128-129 are some of the essential stuff from this book, I think)

Turner then goes on to discuss rationality and the pursuit of goals. Rational planning is perceived as necessary to reach goals. It helps to make things more predictable by modelling things and spending resources wisely. But merely organizing resources and knowledge in a rational manner is no guarantee of success, and the appearance of rationality can be deceptive.

The sociologist Mannheim distinguishes between functional and substantial rationality. Functional rationality is often found in organisations where rationality is distributed over many players. This division of labour and responsibility may mean that only a few people in top positions are expected to think intelligently and independently. The inability or unwillingness of individuals concerned with routine jobs to question the logic of the system may lead to disasters (p.130).

Rational behaviour has its limitations. A large organisation with an impressive functional rationality is no guarantee of substantial rationality. It is therefore wise to have a critical stance towards the model that rational organisations use to guide their actions. This criticism may help to transform functional rationality into substantial rationality and help to prevent disaster. Ideally, the ‘radius of foresight’ should exceed the ‘radius of action’, but it rarely does because we lack knowledge about many aspects of the environment and unintended consequences spring from our rational plans.

7.3 discusses Herbert Simon’s bounded rationality, which has some parallels to what was discussed above. Humans have only limited knowledge of the problems and the environment. Because of these limitations, we don’t go for ‘best’ or ‘maximum’ solutions but rather satisfice (go for acceptable, good enough).
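To make the contrast concrete, here is a minimal illustrative sketch in Python - my own, not from the book - of the two decision rules: a maximiser scores every available option before picking the best, while a satisficer accepts the first option that clears an aspiration level. The `suppliers` data and the `aspiration` threshold are purely hypothetical.

```python
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")


def maximise(options: Iterable[T], score: Callable[[T], float]) -> Optional[T]:
    """Ideal of full rationality: examine every option and return the best one."""
    best: Optional[T] = None
    best_score = float("-inf")
    for option in options:
        s = score(option)
        if s > best_score:
            best, best_score = option, s
    return best


def satisfice(options: Iterable[T], score: Callable[[T], float],
              aspiration: float) -> Optional[T]:
    """Simon-style bounded rationality: stop at the first 'good enough' option."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing met the aspiration level


# Hypothetical example: choosing a supplier by a made-up quality score.
suppliers = {"A": 0.6, "B": 0.8, "C": 0.95}
print(maximise(suppliers, suppliers.get))         # 'C': best, but required a full search
print(satisfice(suppliers, suppliers.get, 0.75))  # 'B': good enough, found sooner
```

The only point of the sketch is that the satisficer trades optimality for a much smaller search, which is the ‘good enough’ behaviour Simon describes.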

Chapter 8: Information, surprise and disaster

This chapter is very well characterized by its title. There is a rather theoretical discussion of what information is (according to information theory) and how information reduces uncertainty, of the difference between communication and observation, and of how surprise is created by new information (and expectation).
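As a rough aide-mémoire for the information-theoretic notions Turner leans on here (the formulas below are the standard Shannon definitions, not quotations from the book): surprise grows as the prior probability of an event shrinks, and a message is informative to the extent that it reduces the uncertainty that remains afterwards.

```latex
% Surprisal of an observed event x with prior probability p(x):
% the less expected the event, the greater the surprise.
I(x) = -\log_2 p(x) \quad \text{(bits)}

% Entropy: the average uncertainty before anything is observed.
H(X) = -\sum_{x} p(x)\,\log_2 p(x)

% A message m is informative to the extent that it lowers
% the uncertainty remaining after it is received.
\Delta H = H(X) - H(X \mid m)
```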

Then Turner discusses informational discontinuities, which also affect surprise. Unexpected events (surprises) often constitute the kind of event that leads to a change in the interpretation of the environment and to a change in the kind of ‘messages’ which the environment is thought to ‘send’ us.

There is ‘normal’ surprise and surprise of a ‘higher order’: anomalies, serendipity and catastrophes. All have their origin in a difference between the view of the world and how it really is. None of these reduces uncertainty, at least not at first. Anomalies are things that may raise a flag, but are usually not seen as requiring immediate action. The other two are basically of the same kind, but one is fortunate and the other clearly not - the difference often attributable to randomness.

8.3: Surprise and the differential awareness of hazard: knowledge is not evenly spread, which affects how different groups perceive risks and their acceptability - and the surprise when disaster ‘strikes’. There may be differences between the ‘official’ definition of the risk and others.

The chapter concludes with a discussion of ‘catastrophe theory’, drawing on Thom’s theory of mathematical catastrophe (abrupt changes in behaviour), Bateson’s jumps in learning processes (p.157) and Platt’s process of ‘hierarchical restructuring’ (p.158), in which order emerges in jumps and new information is brought in. Platt characterizes these jumps by an overall cognitive dissonance or increased state of uncertainty through awareness of anomalies, followed by a sudden restructuring, resulting in a new structure that is conceptually simpler and a more general way of organizing the available information than the old structure.
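For readers who have not met Thom’s framework, the canonical illustration is the cusp catastrophe; the sketch below uses the standard textbook form (my addition, not Turner’s notation) to show how smooth changes in circumstances can still produce an abrupt jump in behaviour.

```latex
% Cusp catastrophe: state x, slowly varying control parameters a and b.
V(x; a, b) = \tfrac{1}{4}x^4 + \tfrac{1}{2}a x^2 + b x

% The system rests at a local minimum of V, i.e. where
x^3 + a x + b = 0

% When (a, b) drifts smoothly across the bifurcation set
4a^3 + 27b^2 = 0
% the occupied minimum disappears and the state jumps
% discontinuously to another minimum: gradual causes, sudden effect.
```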

Chapter 9: Disasters and rationality in organizations

This chapter combines much of the (abstract) information in the preceding chapters. 

Only rarely can the emergence of disaster be fully attributed to the blunders, errors or misunderstandings of a single individual. It is important to take into account the contributions of others and how the behaviour of individuals is shaped by the institutions and organisations within which they act.

Attaining goals happens in open or closed systems. With limited problems we can approach full rationality. In open systems, however, we are not able to gain full knowledge, and there we must make the best possible decisions given the available information and resources. In many complex real-life situations, decision makers still try to approximate fully rational behaviour. The systematization of such disciplined ordering of issues may often be helpful, but “where the number of imponderables is great, all that may result is the cloaking of ignorance with a layer of false precision” (p.162).

We need to assess any given error in two ways at two different levels:

  1. Was the decision ‘reasonable/rational’ given the information available at the time? It is necessary to be careful about retrospectively judging something as ‘incompetent’ or ‘poor judgement’, because knowledge of the outcome markedly affects the assessment of likely outcomes - without people being aware of this.
  2. Would other competent persons likely make the same decision? If so, then the behaviour is not individual, but likely to be guided by organisational procedure and habits.

Organisations often struggle with integration. An organisation needs to act as a single entity, but the integration of its members is hindered by 1) the diversity of the individuals with their individual beliefs, fears, rationales etc., and 2) the necessary diversity and specialization that the organisation needs to cope with the complexities of the world. Organisations balance these things through a hierarchy of authority and decision-making, the distribution of formal power, and through beliefs and culture.

Turner posits that the organisation itself ‘binds’ the rationality of the people in it - both individuals and groups. Groups can form a bounded decision zone: a zone with shared bounded rationality defined by a common set of perceptions and decision premises, which define a field of likely, and thus to some degree anticipated, events (p.169). These bounded decision zones can themselves be placed in a hierarchy (p.170).

Chapter 9 then illustrates the theory with some examples. These are used to discuss details, among them that the consequences of errors will differ depending on the hierarchical level at which they are made: “unintended consequences produced within organisational settings make non-random use of the rules of organisations in their propagation” (p.180).

The final part of the chapter links together the concepts of information and energy (usually contained within, or channeled by, an organisation), or rather the transformation of energy.

Chapter 10: The Origins of Disaster

If you are short on time, read chapter 10, as it effectively sums up the main parts of the book (well, provided you manage to track down an affordable copy of the book, of course). We start with the general principle: Disaster equals energy plus misinformation.

It is necessary to study both social and non-social sources of both elements that, combined, can produce disaster. (Interestingly, a lack of energy can of course also produce a disaster - a point that Turner does not discuss in his book!).

To prevent disasters, we would need to be aware of the points in the developing incubation networks. Alas, we do not have complete information and perfect foresight. Turner’s conclusion is that the action called for is less physiological or psychological (‘operator error’) than social. Especially when incubation periods extend over months or years, there is less need for split-second action than for good advance information. Often relevant information is not available at the time in a form people can use. Problems with information are:

  1. Completely unknown information. (These are rare.)
  2. Prior information is noted, but not fully appreciated.
  3. Prior information is not correctly assembled, e.g. because it drowns in other information, or is distributed among several organisations, or is withheld.
  4. Prior information does not fit existing models or categories.

Disasters will almost always involve organisations because they have a near-monopoly on the energy necessary to cause disasters. While organisations are rational, they can never be fully rational. They have bounded decision zones that create the possibility that information is ignored (etc.) and disasters occur.

“The relationship between institutions and the world may be regarded as a continual cycle of assumptions; exposure of the limits of assumptions; the subsequent revision of the assumptions; and the replacement by new ones.”

The final two paragraphs of the book:

“The problem of understanding the origins of disaster is the problem of understanding and accounting for harmful discharges of energy which occur in ways unanticipated by those pursuing orderly goals. The problems of explaining the events preceding and surrounding disasters may then be seen as accounting for biases and inadequacies in the habitual ways of handling information relating to impending energy discharges. We can, and we do, pursue such problems with the intention of using rational means of solving them and subduing disasters. Some of the problems may arise because of our attempts to manipulate the world by guided processes, but we cannot abandon the rational linking of knowledge to action. Our only option, particularly in a world dominated by large-scale organisations and dependent upon large-scale technology, is to come to a better understanding of what we can and what we cannot achieve with our knowledge; so that on one hand we do not come to rely upon it in the blind hope that it will perform miracles for us.

However comforting the promise of an infinite tidiness offered to man by the older rationalist notion of the possibility of arranging our affairs always on the basis of the anticipation which our conscious knowledge offers us, we must recognize that we are in a contingent universe, in which ultimately there are limits on our ability to reduce uncertainty, to master all of the open-ended and perverse qualities of our environment, and upon our ability to prevent disaster. If we start by recognizing that instability lies at the heart of the world, then we may come to realize that the optimism and the assertion of certainty which enables life to create and spread order cannot completely overcome this instability. We may come to realize that, even when our strategies are successful, they are still dependent upon the munificence of the environment and upon the mutability of fortune.”