Tacit knowledge can be described as the kind of wisdom that experts accrue over their careers. It becomes so deeply embedded in their mindset that it shapes the mental models (or, to give them their proper name, ‘heuristics’) that allow them to find optimal solutions to unexpected and emergency situations.
So, what’s the problem?
Well, tacit knowledge is so deeply ingrained in these individuals that they are often unaware they are using it at all. Extracting, communicating and sharing the tacit knowledge of experts is therefore a major challenge in performance psychology.
Whilst we draw on our own experience of emergency protection and contingency planning, this article seeks to stimulate discussion around innovative solutions based on the principles of good leadership and behavioural science. We hope that, in doing so, you may become aware of some of your own tacit knowledge and be able both to appreciate its value and to share it with your colleagues.
We live in an increasingly uncertain world; the likelihood of major incidents, emergency situations and disasters has grown in recent years. Climate change, globalisation, economic and political instability and the rise of automation and Artificial Intelligence have revolutionised the ways we do business. Whilst these situations are becoming more difficult to predict, effective management of such incidents is crucial to ensure that members of the public and our workforce are able and willing to take appropriate protective action to care for themselves and others. Those responsible for preparing for and managing such unpredictable events are finding the task far more challenging in a rapidly changing world.
“preparing for the next emergency based on what has happened in the past is just futile”

They know that “no plan will survive the first contact”, and there is even evidence to suggest that preparing for the next emergency based on what has happened in the past is simply futile. It is often said that “we plan for the last event, not the next one.” There is a tendency to base assumptions about the size and characteristics of future events on the historical evidence of similar situations in the recent past. But what if the next event is entirely out of character, quite unlike anything that has happened before?
Recently we’ve been starting our courses and training events in a slightly different way. We have always highlighted our “Ground Rules” for our time together: we create an environment of psychological safety in which everyone can share their ideas, cultivating a feeling of curiosity and excitement and a sense of fun in our learning experience. Then there is the inevitable moment when we need to identify and discuss any emergency procedures or alarms on site. Very often, people have travelled from various locations around the world to be together, and they might not be familiar with the risks on site or the systems that need to be followed.
Now, when we get to the emergency procedures, everyone is usually aware that there will be a fire alarm or alert system on site. Of course, we say, “should the fire alarm go off today, we will need to evacuate the premises quickly but calmly”. We then ask if there are any other alarms or warning systems we should be aware of, which is usually met with either a blank look or a confused stare. People tend to be much less aware of any alternative systems that might be in place to cover the unlikely event of a chemical or gas leak, or even a terrorist attack. Many of the locations we find ourselves in should certainly have such warning systems and immediate-action procedures. Surprisingly, some organisations don’t have them at all, and even those that do often have poorly designed emergency procedures that are never rehearsed or properly stress tested. Indeed, envisaging such events causes many of us physical discomfort: by implementing an emergency procedure, we are subconsciously accepting that it is a real possibility that could happen to us.
Scientists and engineers in high-tech industries are nowadays accustomed to working with potentially deadly substances and materials, whether radioactive sources or genetically modified retroviruses, and these experts are usually extremely well educated and trained to manage the risks of their work. Tightly regulated work environments and smart use of monitoring equipment make exposure or loss-of-containment incidents rare, yet when they do occur the consequences are severe and rectifying them is extremely costly and time-consuming, sometimes requiring international governmental efforts.

For these reasons, such industries try, as best they can, to operate in a highly controlled environment, the idea being that systems and staff have multiple checks, measures and in-built redundancies to ensure that the likelihood of a catastrophic event is significantly reduced. Such systems are, however, heavily reliant on operator compliance at critical moments, and past experience suggests that reliance on compliance alone, even in such high-risk environments, is not enough.
Your smartest employees are your biggest liabilities
Yes, you read that right. In highly regulated work environments such as nuclear power, pharmaceuticals and medicine, both the extreme level of safety measures and the seniority and expertise of the workforce can breed complacency. When a threat is invisible, whether radiation or viral particles, it is much harder to maintain vigilance and cognisance of risk. In fact, unlike most industries, where the majority of accidents occur through unpredicted slips or lapses by workers at the operational level, it is wilful acts of non-compliance, or violations, that are of most serious concern in these high-tech sectors, particularly among mid-level to senior engineers and R&D staff.
These people are not your typical ‘rebel’ employees; they do not delight in breaking rules or wish to rebel against them. They genuinely believe that they are making a better decision than that advised by company protocols and SOPs, and they generally have the leadership and influencing skills to convince others that they know best in these situations. These are the colleagues you probably look up to, the high fliers who make big decisions and have received a steady stream of positive feedback and accolades for their actions throughout their careers. These are perhaps the biggest dangers to high-tech organisations in emergency situations.
It is not so easy to protect your organisation from these kinds of individuals, because they often have the inherent authority and confidence to override any checks and balances. Indeed, given their seniority and expertise, warnings and consequences are even less effective on them than on lower-level employees. They have less to fear: even if they are fired for a wilful violation, they will be confident of finding new employment relatively easily.
Bucking the system
An example of how this phenomenon has manifested in other areas of risk is the team of investment bankers who manage the $39 billion endowment of Harvard University. These are some of the brightest financial minds around, operating at the direction of one of the world’s most prestigious centres of learning. You would expect such a team of experts to make excellent investment decisions that perform significantly better than the returns gained by simply investing in an index fund, i.e. just betting on the whole market rather than carefully handpicking which stocks to invest in.
Yet over the last decade, whilst the Harvard endowment saw a 4.1% return, the more modest index fund option, available to literally anyone, returned 8.1% over the same period. Even in the medical field, there has been a shift towards a ‘systems approach’ to diagnosis and treatment, providing a framework for doctors to work through in order to reduce the effects of human bias and overconfidence.
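To put those endowment figures into perspective, consider a rough worked example (assuming, purely for illustration, that both figures are average annual returns compounded over the decade; the figures above do not state this explicitly):

$$(1 + 0.081)^{10} \approx 2.18 \qquad \text{versus} \qquad (1 + 0.041)^{10} \approx 1.49$$

On that assumption, every dollar passively tracking the market would have grown to roughly $2.18, against roughly $1.49 under the expert management, around 45% more for the option that required no expertise at all.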
Indeed, some of the world’s most dangerous incidents have arguably been caused by the overconfident actions of specialists and experts; for example, the Chernobyl disaster in 1986 and, more recently, the failure to contain the Ebola outbreak of 2014, which allowed further spread of a disease that has yet to be eradicated. Both incidents involved the failings of well-trained senior staff, whose overconfidence and wilful violations of procedures led to potential catastrophe.
The implementation of carefully considered collaborative plans across multiple NGOs can only be as robust as the discipline of the employees on the ground making decisions. It is important to recognise that, unlike your typical ‘rebel’ employees who might eschew their lab coats or hard hats, some of the individuals involved in these incidents genuinely acted selflessly and with the best of intentions.
Altruistic acts

Back in 2014, nurse Pauline Cafferkey selflessly travelled from Scotland to assist with the treatment of Ebola patients in West Africa, where she herself contracted the disease. That she became infected whilst working with patients was of course an innocent human error, a slip or lapse that must have placed her in direct contact with the virus. However, during screening for potential Ebola infection on her return to the UK, a doctor and a nurse who had accompanied her on the trip falsified her body temperature reading, which, if reported correctly, would have triggered a closer examination and quarantine procedures.
“by implementing an emergency procedure, we are subconsciously accepting that it is a real possibility that could happen to us”
At the resulting tribunal, both medical professionals indicated that they took the risk of falsifying the reading because they believed that the infected nurse was not unwell, despite the clear data and procedures indicating that they should have reported it. Although Cafferkey was cleared of any wrongdoing, both the accompanying nurse and doctor were given short suspensions from practice. This is a clear example of how the overconfidence of highly trained, expert and well-intentioned employees can pierce straight through even a carefully planned system and the layers of safety nets an organisation might put in place.
Core behavioural principles
Most organisations want to hire and then keep top talent, the creative thinkers and natural leaders who care for those around them. The paradox is that in an emergency involving radioactive or biological contamination, it is often these people and their well-intentioned actions that put more lives at risk, spread contamination and make containment and future decontamination much more difficult and dangerous. For example, when an emergency occurs, people rushing in to help colleagues often become victims themselves. This is a particular problem in gas leaks, where an employee may see a colleague faint, rush in to help them and suffer the same fate. Equally, in the event of a biomedical spill such as contaminated patient blood, it can be preferable to contain the spill rather than open doors and expose more of the facility to the contamination.
“how can we craft effective emergency procedures when the wisdom of how to respond is locked inside the minds of just a few employees?”
Similarly, depending on the type of emergency, it could be better either to stay in place and await trained response teams, or to evacuate immediately. In the average workplace, workers only really need to be aware of a fire alarm and potentially a carbon monoxide detector; in both cases it is easy to train them to evacuate. But in more complex industries the response needed can be highly variable and context-specific. For these reasons, the core behavioural principles of effective emergency protection and decontamination are trust in the system; self-discipline and the ability to work methodically during an incident; and strict adherence to SOPs.
In order to prevent incidents and to react to emergencies in an optimal manner, it is vital that organisations define clear boundaries for employees at all levels. This is especially important with mid- to senior-level employees. These are your most effective problem solvers and creative thinkers, yet in an emergency it is best to give these people the ‘freedom to be constricted’ by the organisation’s SOPs and to switch their usual ‘I know what to do’ attitude into ‘I know what I am supposed to do’. In emergency situations, we do not want to engage conscious thought; we want to activate subconscious, automated responses and adhere to what we have trained these employees to do in a given situation.
Clearly, putting our most valuable employees through long, unengaging and tedious training sessions, for events that are unlikely to ever occur during their career, is not an optimal use of their time or the company’s resources. Neither is trying to shock or warn these employees about the consequences of not following SOPs in an emergency.
Instead, by instilling core values in employees and generating a self-perpetuating organisational culture, we can create an environment in which emergency SOPs are respected and viewed as something that not only keeps individuals and the organisation safe, but also alleviates the pressure on senior employees to react to incidents, so that they feel empowered by these SOPs rather than limited or constricted by them.

A core component of this is trust, which is best fostered by good leadership at all levels: once employees trust the systems and procedures in place to deal with emergencies, they become much more likely to follow them. This brings us back to our original point regarding tacit knowledge. How can we craft the most effective emergency protection and decontamination procedures when the best wisdom of how to respond is locked inside the minds of just a few employees?
In the same way, we hope that this article has provoked your own thoughts on how you would respond to these kinds of emergency situations, should they arise.