How you can improve your performance and safety by understanding why we make good (and bad) decisions.
Gareth Lock is an accomplished technical diver based in the United Kingdom.
Currently serving in the Royal Air Force, Lock is undertaking a part-time PhD examining the role of human factors in scuba diving incidents.
For more information, visit the Cognitas Incident Research and Management website at:
Cognitasresearch.wordpress.com
There is a significant body of evidence showing that divers involved in diving incidents often make poor decisions—sounds obvious, doesn’t it? So if it is so obvious, why do we continue to make them? Simple decisions such as continuing a dive when we should have ended it, choosing the ‘wrong’ gas for narcosis, density or decompression reasons, wearing the wrong thermal protection for the conditions, diving with the ‘wrong’ buddies; the list goes on. This article will cover how we make decisions and, more importantly, why we make poor decisions and the pitfalls we encounter when doing so. We won’t always be able to stop ourselves making poor decisions, but if we can recognise when we are likely to make them, we can at least try to put controls in place to check ourselves.
So, how do we make decisions? In its basic form, a decision is made by referring to a set of ‘rules’ or ‘models’ we have in our conscious and unconscious memories. Those models may have been developed through direct experience or by proxy, i.e. learning from someone else’s account of an event or having someone explain it to us (teaching/training/coaching).
This description of referring to ‘models’ in our memory is very simplistic and doesn’t necessarily highlight the problems we face, so the following sections break it down further and show how we can improve things.
The first type of decision making is known as knowledge-based decision making and is grounded in your own knowledge or understanding of a scenario. Again, simple, you would think. But your knowledge of a scenario isn’t necessarily complete, and there are shortcuts your brain takes to speed up your decision making (even when speed isn’t required).
Look at Diagram 1, what do you see? Now look at Diagram 2 and see what was really there.
Your brain made a number of automatic decisions comparing the scene to what was in your long-term memory about what those shapes looked like.
Now think about a diving scenario in which you used a quick, generally accepted rule of thumb to make a decision. It might have been how much gas to plan for a dive, whether to continue a dive in poor visibility because it was okay last time, or whether to undertake a ‘trust me’ dive assuming that the person you are diving with will look out for you.
Those decisions are based on previous experiences, but what you face now might not be the same, and you might have missed something. Our situational awareness might not be complete.
As an example, follow this link: www.youtube.com. Think about whether you saw what was asked of you in the clip. As it says, about 50 percent of people watching the clip miss the obvious. (There are a number of other clips by the same author that are worth watching to see how we miss the obvious.)
Given that you were sitting in a calm environment watching the video, now think about how easy it would be to miss something whilst diving, or while getting prepared for a dive, because you are task-fixated or distracted. That might be because you are still mastering the skills needed to dive and still focussing on buoyancy and/or trim, or simply the wonder of being underwater; it might be because you are actively doing something like photography or videography, or chatting to colleagues whilst kitting up.
Once your ‘awareness’ channels are full, you try to fill the gaps; you are assuming knowledge. This is known as complacency, where your ‘model’ of the world is not the same as what is actually going on around you: valves not opened properly, a buddy check not completed thoroughly, gas not analysed, a CCR checklist not completed.
The list of shortcuts is long, but think back to the shortcuts you take, think about why you take them and whether you should continue to do so. Just because something didn’t go wrong last time doesn’t mean it won’t this time! Consider the model below (Illustration 3).
This shows that our knowledge increases with feedback from previous experiences. However, our knowledge might be incorrect simply because nothing went wrong, which means the model you use to make future decisions will potentially be incorrect (Illustration 4).
Finally, there is plenty of evidence to show that our knowledge of what we think we know is lacking. The best known example is the Dunning-Kruger effect, where the under-skilled are over-confident and the skilled are slightly under-confident in their abilities.
The graph (Fig 1) shows the results of an experiment to ascertain students’ knowledge of how well they did in an exam. The adage, “You don’t know what you don’t know”, comes from this effect.
There are a number of ways to resolve poor knowledge-based decision making:
Continually question what is going on around you: noticing (not just seeing), thinking about what that means, anticipating how it might impact your dive, your buddy’s dive or your safety, and then taking corrective action. It’s no use saying after the event, “I knew that was going to happen”, when you had an opportunity to resolve it!
If you are taking shortcuts, consider using a checklist and get your buddy to follow it with you, ensuring you (and he or she) do the items on the check. Having a checklist and completing the items without actively checking is worse than not having a checklist because you have a false sense of security—your model is incorrect.
Have a debrief after each dive and talk through what went well, why it went well and what you could do to improve. My personal experience in other domains (aviation, and oil and gas) has shown that getting people to start having debriefs is hard, but once they realise the benefits, debriefs become part of the routine, which then improves their knowledge for future situations.
The second type of decision-making is known as rule-based decision making where we have a formal set of rules to follow.
In driving, this could be compared to approaching a red light: you know to stop until the green light shows. In diving, where there is no formal supervisor, there are very few ‘formal’ rules to follow; you can do what you like, as there are no dive police out there.
There is best practice, or ‘safe diving practices’, but no real ‘rules’, because there is no form of punishment if you break them. However, when you dive in a supervisory role, you are very likely to have formal rules that need to be followed. These might be training agency rules, health and safety rules or local/national legislation; and if you break these, there are serious consequences if an incident occurs and it was down to the rules being broken.
Breaking rules, or committing a violation, has been shown in some fields to increase the likelihood of a fatality occurring (compared to a non-fatality), and this is why the rules are there.
The graph below from a study examining errors in General Aviation (GA) in the United States shows that whilst there are more fatal accidents due to skill-based errors, the ratio of fatal to non-fatal accidents is much greater when violations take place. Simply put, breaking rules means you are more likely to die than be injured.
American pioneer cave diver Sheck Exley came up with the six rules for cave diving after examining hundreds of cave diving fatalities; these were contained in his book, Blueprint for Survival, where case studies were used to show how the rules were broken and the consequences of doing so. (See the green side box.)
Technical divers Michael Menduno and Billy Deans did something similar for technical diving, although it wasn’t as simple as Exley’s, containing more detail. The Blueprint for Survival 2.0: Technical Diving can be found here: www.anaspides.net
So how do we improve poor decision-making when rule-based decisions have to be made? Or, how do we stop violations taking place?
Fundamentally, we must understand why the rule is in place to start with. That might be the need for medical cover or equipment, minimum gas requirements, or maximum depth limits. Each of the rules has been put in place for a reason, primarily to protect you (or others) from human fallibility and the incidents that follow from it.
Whilst this might appear to limit your own activity or enjoyment, consider and understand the consequences of not following the rules, and make an active decision about whether the impact of the risk materialising is worth the benefit gained by breaking the rule. Sometimes that impact might be your death, or someone else’s!
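As a concrete illustration of why a minimum gas rule exists, the sketch below estimates a ‘rock bottom’ reserve for two divers sharing one gas supply while they solve a problem and ascend. It is a minimal sketch, not an agency standard: the consumption rate, problem-solving time and ascent speed used here are assumptions you would adjust for your own diving.

```python
# Illustrative minimum-gas ("rock bottom") sketch.
# All the default figures below are assumptions, not agency standards.

def minimum_gas_bar(depth_m, cylinder_litres, sac_lpm=20.0,
                    solve_minutes=1.0, ascent_mpm=9.0):
    """Gas (bar) needed for two stressed divers to fix a problem at
    depth and then ascend to the surface sharing one supply."""
    ata_bottom = depth_m / 10.0 + 1.0      # absolute pressure at depth
    ata_avg = (ata_bottom + 1.0) / 2.0     # mean pressure over the ascent
    ascent_minutes = depth_m / ascent_mpm  # time to reach the surface
    consumption = sac_lpm * 2              # two divers on one supply
    litres = (consumption * solve_minutes * ata_bottom
              + consumption * ascent_minutes * ata_avg)
    return litres / cylinder_litres        # surface litres -> cylinder bar

# Two divers at 30 m breathing from a single 12-litre cylinder:
print(round(minimum_gas_bar(30, 12)))  # about 41 bar
```

Even with these modest assumptions, the reserve is substantial, which is exactly why ‘half tank means turn’ style shortcuts can leave too little margin on deeper dives.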
Rules can be broken. Consider driving to hospital with an injured relative or friend in the car. You arrive at a red light for temporary road-works on a clear stretch of road where you can see past it to the lights on the other side. Your light stays red for two minutes, then five minutes. Do you jump the red light because you can see ahead? What about 10 minutes? At some point you will make the conscious decision to break the rules and jump the light as long as the road is clear.
The same goes for diving. It might be your buddy is injured and you need to ascend to the surface missing decompression stops, or you need to rescue them from below the Maximum Operating Depth (MOD) of the gas you are breathing. These are conscious decisions where you have a choice and you decide what to do.
What shouldn’t happen is breaking ‘rules’ out of laziness: not analysing your gas, skipping the buddy check or the checklist, or not practising skills regularly.
Therefore, the introduction of more rules in ‘non-supervisory’ diving situations is not the answer to improving safety. Getting divers to recognise the risks they are taking by improving their knowledge is key, and feedback is essential if divers are to improve.
We make decisions every day of our lives, from choosing what clothes to wear, what route to take when driving to work, to what dive to go on, who our buddies will be, what the run time will be, when to end the dive based on gas remaining or decompression obligation or something going wrong… the list is almost endless.
Some of those decisions have minor consequences if a poor choice is made; some have a very major impact—you (or someone else) can die. That is not melodrama but an attempt to bring the reality of the situation to the fore.
Divers don’t get up in the morning thinking this is a good day to run out of gas or make a rapid ascent to the surface; a number of factors and poor decisions come together and lead to the incident or accident.
By understanding the way in which we make decisions, and the fallibility of the human brain, we might be able to reduce the occurrence of poor decisions and make better ones, which we can then learn from.
Experts make good decisions more often because they have a much bigger library of experiences to refer to; they are also normally keen to learn and increase that library. Talk about your dives and your incidents; learn from each other. Fortunately, not everyone has an incident, but you can learn from others who have. ■