Thursday, 24 August 2017

Sense Making in Safety


Jeff, the seafarer and common sense

Jeff had been into boating all his life (26 years); from a young lad he was on the sea and never really had any accidents. Jeff was very respectful of the risks of boating and had seen first hand, as an Air Sea Rescue Leader, what those risks could do.


Jeff considered himself a very careful marine vessel operator. He had no specialised training in risk management other than what he learnt in his fitting and turning trade in the early 80s and from owning his own business in the marine industry. He always lived by the maxim, 'it's about having your wits about what you are doing and having boating common sense'. It was on one ordinary day that his life was taken and the lives of those who loved him changed forever.

Jeff had been staying at his holiday house on an island east of Gladstone, QLD, when he had a call from a boating customer asking him to take the customer out to sea in one of the boats Jeff sold in his business. So Jeff decided to leave the island late that afternoon to go back to the mainland and prepare the boat for the morning test run. Jeff had a brand-new four-plus-metre aluminium tiller-steer dinghy; the motor had a new safety lanyard attached to it, something that wasn't on the boats he grew up with. Jeff never wore it, as it would shut the motor down if he forgot to take it off when he moved towards the front of the boat; it was an extra thing to stuff around with.

Jeff was on his way back to the mainland in his dinghy and was not wearing the lanyard, as he had never needed one before; not putting it on saved a little time. On this day, after many years of not wearing the lanyard, Jeff hit what was thought to be a dugong (sea cow) and was thrown out of the boat, but the motor kept going at speed. Jeff attempted to climb back into the dinghy a couple of times; on the third attempt he slipped down, and the motor hit him in the head and killed him. It wasn't until the next day that he was found.

What was not unique about Jeff's story is his reliance on 'common sense' and 'having your wits about you'. 'Being witted' or 'being careful' are expressions for what are known as 'micro-rules'. Jeff, like most of us, used micro-rules to manage risk and safety. Many of us think these micro-rules should be intuitively known, or objectively self-evident. Micro-rules are developed by a human process called heuristics. A heuristic is any approach to problem solving, learning or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals. Where finding an optimal solution is impossible or impractical, heuristic methods can speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision; examples include a rule of thumb, an educated guess, an intuitive judgement, a guesstimate, profiling, or common sense. Some of our mental shortcuts are in reality myths: they have no meaning, or little connection to reality, yet they provide us with incredible comfort and confidence, sometimes over-confidence (hubris).

To many who heard my brother Jeff's story at the time, it did not make sense that someone would not wear a safety lanyard just to save a few seconds, yet to Jeff and old-school boat users the practice made sense. Jeff had been working on 'common sense' for 26 years of his life, and so the shortcut was successful and became the rule that he and other seafarers used to save a few seconds; it was a trade-off of time for risk. Jeff, like all of us, did not want to associate losing his life with not wearing the lanyard; he just wanted to get from A to B quickly and without fuss.

We can see from Jeff's story that placing confidence in micro-rules and shortcuts eventually brings things undone. Jeff's rules of thumb did not help him the day he died. Why? Many micro-rules are not calculated; they can be misconceptions, or half right, yet people rely on them unquestioningly to explain and make sense of the world around them. In reality they provide comfort and confidence but lack any real meaning.

So, what is 'common sense'?

Wikipedia defines common sense as 'a basic ability to perceive, understand, and judge things that is shared by ("common to") nearly all people and can reasonably be expected of nearly all people without need for debate'.

The question is, 'is sense-making common?'
I created an experiment using an image and/or the word 'chocolate'. I asked participants to tell me what the image or word meant to them, and got answers like: yum, sweet, dessert, love, Valentine's Day; not many simply said 'chocolate' or 'block of chocolate'. The answers were all different; one word was interpreted and understood in many ways. If sense-making on just one word is not common, how can there be common sense when managing hazards, risk, safety and documents?
When it comes to risk, there is no substitute for learning, experience, training and coaching.

We learn from a young age about dangers like power points, hot water and roads because someone taught us, or because we observed others. We have also learned through trial and error: we witnessed something, were told something, or were trained in what to do. Unfortunately, there is little to no room for trial and error when it comes to some risks in life.

It is not common sense for a kid whose ball goes onto the road not to chase it. It is not common sense for a kid who sees a fire not to want to play with it, or not to want to poke something into a power point, unless they have learned about the risks associated with what they are doing. Some forms of knowledge are acquired, but not by all, and they certainly can't be assumed for everyone.

Look at the story of my brother Jeff: he was not lacking intelligence when he was killed, nor did he want his life to end. He simply made sense of the situation differently from others, because his perspective and perception were influenced by a trade-off of time for risk. Every trade-off in risk makes sense to the person taking the risk. They have good reason for the trade-off and feel they can manage the risk in front of them. It is only in hindsight, or when things go wrong, that we decide the rule of thumb (shortcut) didn't make sense. We all have our own rules of thumb and shortcuts that we use every day to save time, money or effort.

It is critical to understand the frustration we feel over another person's apparent lack of 'common sense'. This frustration often becomes a device for dismissiveness, for not understanding how risk makes sense to others. So, when organisations tell workers they must wear gloves at all times, or must not use Stanley knives because they are too risky, what happens to most workers' perception? Do they become dismissive of the organisation's sense-making?
If we respect and understand that everyone makes their own sense of risk, then we are on the road to better understanding people's judgement and decision making, and we might start to realise that 'common sense' is not common. People who create shortcuts in risk think of the gain from the rule of thumb in the face of uncertainty, not the loss. They see any trade-off that creates some sort of gain as logical common sense.

Common understanding must be created and assessed as something all hold in common; it cannot be assumed. So, how can we create common understanding in order to create common sense? We know there must be some formula to this learning, as organisations need common understanding of their processes and operations.

The key to learning a common understanding of risk is an infinite learning and feedback coil. We call this the 'Lens of Common Understanding learning coil'.



Wednesday, 24 May 2017

Planning for Success!

Unpredictable and unexpected events often test a business's resilience. These events reveal how far an organisation can stretch without breaking. They test how well the business can bounce back, if it bounces back at all. How these events affect us relies heavily on business culture, which is the heart of any organisation. Consider some examples. It is only when cranes tip over, space shuttles or oil rigs explode, or tyres fall off vehicles that we can see, with hindsight, the clear signals and patterns we rejected that contributed to the incident. Most organisations treat near misses/hits as a measure of safety, rather than as system and planning failures. Take the Space Shuttle Challenger explosion: many times the engineers had noticed black burn marks on the O-rings. They put this down to the O-rings safely managing and protecting the shuttle from explosion, and so were blinded to the burn marks increasing in size.

So often we see and hear organisations strategically planning, be it a business start-up, a new project or a re-booted business vision, yet this is done mostly without culture in mind. Most businesses start by trying to understand, through early risk assessment, what hazards and potentials may impact the business before they kick off the plan. They develop contingency plans to protect themselves from worst-case scenarios, intending to prevent small unexpected interruptions from slowing down their business processes. But foresight is very limited when exercised by a small group, and in most cases we rely on those we believe are the 'experts', normally perceived to sit at the top of the organisational chart.

Organisations with a healthy culture know that foresight and anticipation are limited, and that paper-based precautions fail. The unexpected, errors and surprises are difficult to foresee or predict in most circumstances, but when we become comfortable with our contingency plans we also become overconfident and blinded to the fallibility of such plans. Businesses committed to building resilience, and willing to engage key experts wherever they sit in the business, not just at the top of the org chart, are the ones that acknowledge that culture is critical, the heart of the organisation's success. They are the businesses prepared to put money into educating and developing true business culture, into 'collective mindfulness'. When a crisis hits, they are the businesses that bounce back quickly yet remain mindful of what is going on. They know relying on systems and procedures alone will not save them when things go wrong.

Planning for containment differs from planning with anticipation in that it aims to prevent unwanted outcomes after an unexpected event has occurred, rather than to prevent the unexpected event itself. In most organisations, unexpected events unfold without being noticed, which means a business's reliability depends critically on how well prepared its culture is to react mindfully to things about to go wrong.

Does your organisation have a culture that knows what to look and listen for?
Are there 'tick, tick, clunk, clunks' going unnoticed?
Do your team members raise even the most insignificant concerns? And do you pay attention to those concerns, or push them away as 'trivial issues'?

Organisations create plans to prepare for the inevitable, pre-empt the unfavourable and control the controllable. Rational as this seems, planning has its shortcomings. Planners plan in steady, predictable settings and are unconsciously moved into thinking the world will unfold in a predictable manner; this misconception is predetermination. When people are engrossed in predetermination, there is no place for unexpected occurrences that fall outside the realms of the plan. Planning without cultural understanding can do the exact opposite of what was intended, creating mindlessness instead of mindful anticipation of the unexpected.

Plans are built from assumptions and beliefs about how we see the world, and this is what sways our expectations and biases. When our expectations are strong, they influence what we actually see; we become blinded to many things right in front of us, and we choose what to approve of and what to ignore. When unexpected issues start developing, it takes longer to discover what is growing. We place our expectations on vague stimuli and, with good intent, fill in the gaps, reading between the lines to complete the proposed picture as best we can; this is far from calculated or robust. We paint the picture the way we expect it to play out, and the slight issues are soon pushed under the rug rather than explored for the ways they could unfold. As with the cranes, shuttles, oil rigs and tyres above, it is only after the event that we can see the clear signals and patterns we first rejected, and most organisations still treat near misses/hits as a measure of safety rather than as system and planning failures. How does your organisation treat near misses? Does anyone go back to the process or plan to see what failed?

By design, plans influence people's perception and reduce the number of things people notice; this occurs because people predetermine the world largely into the classifications galvanised by the plan. Trivial issues gain minimal attention and get brushed under the carpet because we feel they are irrelevant to the plan. Yet these issues are the very seeds that develop into the unexpected issues, errors and/or events that make an organisation's functioning unreliable.

Research has shown that perception of risk varies according to life experience, cognitive bias, heuristics, memory, visual and spatial literacy, expertise, attribution, framing, priming and anchoring. In other words, risk is a humanly constructed sense of meaning associated with uncertainty, probability and context. For example, what one person sees as too risky, another sees as their opportunity to grow. When planning, one could be forgiven for thinking that compliance would be much easier if it didn't involve people. But when we think systems, procedures and plans control business risks, we largely miss the heart of what truly manages business success: the 'risk makers and risk takers', who are you and your people.
Social arrangements give us meaning, purpose and fulfilment; they can also determine the way we make decisions and judgements. Risk is not a planning, management or engineering problem but a business culture problem. Planning, management and engineering approaches to risk tend to focus their training and thinking on objects. Whilst it is vitally important to observe what is constructed, it is not the core focus of those disciplines to understand human organising, collective mindfulness and the collective unconscious in response to objects.


When we consider culture with planning it helps us understand the following questions:
·      Why do people not obey procedures?
·      Why are people non-compliant?
·      How are our perceptions limited?
·      Why do people make poor judgments about risk?
·      How is risk recognised?
·      Why are people not motivated to better understand risk?
·      How is business perception strengthened by collective mindfulness?

Without an understanding of how social arrangements affect culture, it becomes easy to view people who take risks simply as fallible, lacking common sense or disobedient. Once we have dismissed people in this way, we no longer feel compelled to understand the problem or its drivers – the label has taken away any need for further understanding. Without a better understanding of human judgement and decision-making in a social and cultural context, leadership tends to support greater attentiveness to 'more of the same'.

Peter Drucker's phrase 'culture eats strategy for breakfast' sums up perfectly what is wrong in so many businesses today. No matter how well thought out the strategy is, if you don't consider the culture in your organisation to support that strategy, it will not come to fruition. Many business leaders have underestimated the power of culture and seen their new strategies fail because of it. It is really about the two working hand in hand: planning with collective cultural influence is critical to successful business outcomes. Systems are the foundation of what a business wants to achieve, and they need to consider the beliefs and actions of people. People, however, are the drivers and action-takers of systems; if they are not included in the development of plans and procedures, they can grossly misinterpret them, creating an unintentional roadblock before the plan gains any momentum. How is your organisation planning for success?