A diver had an oxygen toxicity seizure because a dive centre filled a cylinder with the wrong gas. A baby died because the wrong dose of medication was injected. Who is to blame for these errors, and how do we try to make sure that such incidents are not repeated?
Contributed by Gareth Lock
Some readers may remember an article I wrote on this subject a couple of years ago, but this one will go into much more depth and give examples of the issues faced both in the scuba diving community and in other environments that have more established safety management system programmes and cultures.
As a quick recap, a safety culture is made up of five component parts: a just culture, a learning culture, a reporting culture, an informed culture and, finally, a flexible culture. Each one contributes to the wider improvement in safety, and to a certain extent, a safety culture will struggle to develop and survive unless each piece of the jigsaw is in place.
Developing a safety culture is a proactive process and needs to be led from the top down, although pressure from below may influence the speed at which it is adopted and develops.
So what is a just culture?
The term sounds like some woolly description meaning that people can get away with anything, i.e. a ‘no-blame’ culture in which errors and poor behaviours are accepted as the norm without recourse. This is not the case. The Royal Air Force defines a just culture as “an atmosphere of trust where people are encouraged, and even rewarded, for providing safety related information and where it is clear to everyone what is acceptable and unacceptable behaviour” (www.maa.mod.uk/linkedfiles/regulation/manualofairsafety.pdf).
This document contains details of the safety management system in place within military aviation, including a slightly modified and more detailed version of the flow diagram (on the following page), which describes how errors, mistakes and violations are dealt with in terms of culpability and responsibility.
You may argue that an operational organisation, with millions of pounds’ worth of equipment and personnel to manage and a very formal organisational structure within which to operate, has very little relevance to recreational diving.
I would argue there is considerable relevance, if only because having regulations and a structure in place makes it easier to ‘draw the line in the sand’ as to what is right or wrong. However, as will be shown, where right and wrong are less clear, it is certainly harder to determine how to deal with errors, mistakes and violations.
A just culture is a difficult concept for most people to grasp, because our society is developing into one in which we are always looking for someone to blame and in which personal responsibility is diminishing. The following examples will hopefully put just culture into context and maybe adjust your perspective on ‘right and wrong’.
Exhibit A.
A nurse gave an eight-month-old baby, who had been diagnosed with severe heart problems, 1.4 grams of calcium chloride instead of the correct dose of 140 milligrams: ten times the prescribed dose. It was the only serious medical mistake she had ever made in her 24-year career. Overnight, she realised the mistake and reported it. Unfortunately, the baby died five days later.
There were a number of contributory factors: poor handwriting in the medical notes by the doctor; tired staff; a change of shift, which led to poor communication between staff; and the general poor health of the baby. After the baby died, the nurse was escorted off the hospital site and then fired a few weeks later.
After a number of harrowing court cases in which she tried to defend her innocence, the nurse committed suicide. Yet a nurse is only one part of a much wider system encompassing doctors, other nurses, shift-pattern schedulers, and equipment designers and manufacturers.
Unfortunately, where to draw the line for accountability and responsibility is not clear, especially when a fatality is concerned.
Exhibit B.
Now consider this incident. A dive centre was running two courses from the same boat: an OC advanced nitrox and decompression procedures course, which was using 80% deco gas; and a CCR Mod 1 course, which had air in the diluent and bailout cylinders. At the end of the day, the OC divers went to one end of the kitting-up area in the dive centre, and the CCR divers went to the other. Everyone dekitted and left their cylinders in situ for filling, ready for the next day’s diving. The lead instructor told the dive centre staff member who was going to fill the cylinders that all of the Ali7s (aluminium 7-litre cylinders) were to be filled with 80%.
The following day, the dives were undertaken, with the CCR divers conducting bailout drills at around 35m. One of the divers didn’t feel quite right after bailing out, so he went back onto the loop. At this point, his loop pO2 went really high, so he bailed out again. Again he felt unwell and went back onto the loop, and again the pO2 in the loop was high. He bailed out once more and then had an oxygen toxicity seizure. Fortunately, his instructor lifted him to the surface and he survived.
Once back on shore, they analysed the situation. It transpired that the staff member had filled all of the Ali7s with 80%, including the one attached to the rig belonging to the CCR diver who had the seizure. He had turned the cylinder off, depressurised the regulator, bled the cylinder down, filled it with 80%, refitted the regulator and repressurised it, and put the cylinder back where he had found it, without marking it or letting anyone know that this had been done.
None of the CCR divers, including the instructor, had analysed their bailout gases before diving, and therefore, the issue was not picked up before they got in the water.
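To put the numbers in context (these figures are mine, not from the original report): at 35m, the ambient pressure is approximately 4.5 bar, so breathing an 80% oxygen mix gives a partial pressure of oxygen of roughly 0.80 × 4.5 bar ≈ 3.6 bar. That is more than double the 1.6 bar maximum commonly quoted for decompression gas, a level at which a convulsion is a very real risk.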
Now, how would you treat the staff member, who had done what he had been asked to do but didn’t necessarily understand the consequences? What about the instructor, who potentially did not follow standard procedures* by not analysing gas before each dive? (*I don’t know which agency was involved in this case, so it might not be in its standard procedures, but the majority of agencies state that gases must be analysed whenever nitrox or trimix is present in any of the gases being breathed.)
So, even in diving, and in non-fatal incidents, there isn’t a clear-cut answer as to what is right or wrong, or who should be blamed. Trying to understand the reasons why the incident occurred is the first step in reducing the emphasis on ‘blame’ and working out how to make things safer the next time around.
Determining culpability
Professor James Reason of the University of Manchester recognised this problem and proposed a decision tree for determining the culpability of an unsafe act, the aim being to determine whether an action was an honest mistake, or whether there was likely to be some responsibility for the outcome. The diagram on this page shows the original version of this decision tree; a more recent version is shown at the aforementioned RAF link.
Bear in mind that for such a decision tree, or substitution test, to work properly, the analyst must not know what the outcome was (hard, I know). This is because knowledge of the outcome invites hindsight and confirmation biases.
Note: when you come to the box entitled “Pass substitution test?” use the question “Would three other individuals with similar experience and in a similar situation and environment act in the same manner as the person being evaluated?”
- If the answer is “Yes”, the problem is not the individual, but more likely the environment that would lead most individuals to that action. (Proceed to the question, “History of unsafe acts?”)
- If the answer is “No”, that is, if similarly experienced individuals would not have acted in a similar manner, it is more likely that the individual being evaluated is culpable or accountable and in need of action, whether that is counselling, removal or something else. (Proceed to the question, “Deficiencies in training and selection or inexperience?”) A simplified sketch of this branching logic follows below.
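To make the branching concrete, here is a minimal sketch of that decision logic in Python. It is illustrative only: the question wordings and outcome labels are paraphrased, and Reason’s full tree contains further questions (for example, on unauthorised substances and whether the procedures themselves were workable) that are omitted here.

```python
def ask(question: str) -> bool:
    """Ask a yes/no question at the console; return True for a 'y' answer."""
    return input(f"{question} [y/n]: ").strip().lower().startswith("y")


def assess_unsafe_act() -> str:
    """Walk a simplified version of Reason's culpability decision tree."""
    if ask("Were the actions as intended?"):
        if ask("Were the consequences as intended?"):
            return "Possible sabotage or malevolent act"
        return "Possible violation: assess whether procedures were knowingly broken"
    # Unintended action: apply the substitution test described above.
    if ask("Would three similarly experienced individuals, in the same "
           "situation and environment, have acted in the same manner?"):
        # Test passed: the environment, not the individual, is the likely problem.
        if ask("History of unsafe acts?"):
            return "Blameless error, but a pattern worth corrective attention"
        return "Blameless error (system-induced)"
    # Test failed: individual factors are more likely.
    if ask("Deficiencies in training and selection, or inexperience?"):
        return "System-induced error: address training and selection"
    return "Possibly negligent error: individual accountability more likely"


if __name__ == "__main__":
    print(assess_unsafe_act())
```

Even this toy version shows why the analyst should answer the questions without knowing the outcome: each answer shifts the result between a system explanation and an individual one.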
This picture makes it all appear so easy when looking at culpability, but Dekker, in a number of pieces of work, describes the fact that “the legal characterisation of behaviour as negligent is extremely complex, subject to many judgment calls, and in reality an after-the-event social construction. Those evaluating the behaviour are subject to bias, particularly outcome bias and hindsight bias.” (Dekker SWA, Just Culture: Balancing Safety and Accountability. Ashgate, Aldershot, 2007.)
So, in actuality, it is only after the event that you can determine whether an error or a violation has taken place, and when it comes to motivation, it is a subjective exercise.
Now, back in the real world, where there are significant shades of uncertainty and we are dealing with real people, some of whom may have been injured: how do we improve matters and create an environment in which divers can talk about their mistakes, either anonymously or in public?
Barriers
We need to understand the barriers that prevent a just culture from developing.
Given the emotional roller-coaster we ride in the event of a serious incident or fatality, it is easy to see why it is difficult to discuss fatalities in an immature safety environment. Those involved are grieving for those who have been lost. There is a need, or a desire, to protect the dignity or reputation of those involved (even if they did make a silly mistake that cost them their lives). And finally, there is often a lack of detailed data with which to understand what happened and why.
Lack of data creates uncertainty, which invariably leads to speculation. This is not useful when trying to determine lessons learned. Furthermore, the only person or people who really understood the decision-making process are no longer with us.
So, what prevents us from discussing non-fatal incidents when there are survivors and there isn’t the same level of raw emotion as there is in a fatality?
I believe the following are all high up on the ladder of reasons: emotion, fear, pride, the litigious nature of society, and the lack of structure or process to allow the other complementary cultures to develop. Personal pride is built up over time, and the fear of its loss comes from the following linked factors:
- the majority of people don’t like to discuss their personal failures,
- the majority of scuba training is delivered through positive reinforcement such that people are always told that they are great (even if they aren’t), and
- there is a significant personal investment in terms of both time and money, and people don’t want to feel that that investment was wasted.
Standards?
Interestingly, the majority of research and published literature looking at just culture considers a formal disciplinary or accountability approach to dealing with the individual(s) or group(s) that have made the mistake, error or violation. However, the majority of diving takes place outside any formal organisational structure. Indeed, there are very few actual rules; the basis for ‘safe diving practices’ is mostly defined as guidelines or best practice.
Whilst diver training organisations do have their own standards which instructors have to adhere to, and national legislative bodies like the Health and Safety Executive (HSE) in the United Kingdom have their legal regulations, these don’t impact the majority of divers.
Indeed, in the UK, you could walk into a dive shop, buy a complete set of scuba equipment, fill the cylinders with air, and then go and dive to whatever depths you like without any training or certification.
Even though there is a national governing body (the British Sub-Aqua Club, or BSAC), it has no governance or authority over any of the other diver training organisations operating in the UK, or over any diver diving outside a BSAC club environment. Consequently, the judgement of what is right or wrong is very difficult to define, even harder than in the case of the nurse above.
Negative criticism
Unfortunately, what sometimes occurs when incidents are published online in a public (non-anonymous) manner is that they are dissected and criticised, in terms of equipment configuration, training route or favoured training organisation, decompression profile, etc., in a negatively critical fashion, rather than used to understand why the diver made the errors or decisions they did and to address the lessons that could be learned as a consequence.
This negative criticism appears to be more vociferous if the ‘incident’ diver in question doesn’t conform to the respondent’s own ‘norm’, which, ironically, could be a long way from best practice; but things have ‘always been done this way’ and therefore must be right!
Fortunately, over the last few years this attitude has started to be tempered, but it is still prevalent in some quarters, which reduces the opportunity to learn from others’ mistakes.
In non-diving environments, punishment has legal or professional connotations; in a recreational activity, it can take the form of damage to personal or professional reputation and/or pride. This public criticism of detailed incidents is the “punishment” which needs to be managed with a just culture in sport diving. The matter is further complicated when instructors publicly talk about their incidents, as these accounts could be used against them in potential future cases where a dive did end with fatal consequences.
Making diving safer
So how do we improve things to make diving safer? The first step is the normalisation of incident reporting. An incident must not be seen as a failure, but rather as an opportunity to learn. The stigmatisation of reporters must be recognised and reduced, to give others the confidence that they can report their incidents without fear of ridicule or negative criticism. People do not get up in the morning and decide to make a string of mistakes that could (nearly) cost them their lives!
The reporting of mistakes and errors should be promoted throughout training, across the full range of diving from recreational through to advanced technical diving. This reporting shouldn’t just take the form of report forms designed to guard against litigation; it should allow everyone to learn, and anonymous reporting systems outside the organisation should be used where stigmatisation is an issue.
Reporting should be considered the norm, not the exception, and investment should be made to support such reporting systems as a consequence.
Secondly, the community needs to recognise that each diver’s level of acceptable risk and specific configuration is (broadly) unique to them, and to give feedback in that context. I have my own views of what is acceptable or not, but when I provide feedback on an incident and its causality, I couch it in terms of that diver’s likely knowledge, skill set, configuration and culture, rather than my own views.
We are always learning, irrespective of our experience, skill set and knowledge. However, the ability to learn from others’ mistakes can only happen when those mistakes (and their mitigations or strategies) are exposed in a manner which promotes honesty and prevents negative criticism; that is what a just culture is about. ■
Gareth Lock is an accomplished technical diver based in the United Kingdom. Currently serving in the Royal Air Force, Lock is undertaking a part-time PhD examining the role of human factors in scuba diving incidents. For more information, visit the Cognitas Incident Research & Management website at:
Cognitasresearch.wordpress.com