People are often bad at thinking about risk. We see this in daily life, and risk professionals experience it in their jobs on a regular basis. There is a great analysis by Owen Shen on “Charting Death” that measures perceived threats of death via Google searches and compares them with the Centers for Disease Control and Prevention’s (CDC) data on actual deaths in the United States, as well as with reporting in The New York Times and The Guardian. Aaron Penne created this amazing diagram from the data:
This makes the flaws in how people perceive and deal with risk painfully obvious. I would like to call out three key biases that are evident from the data and explain how they translate into challenges in managing Operational Risk, Technology Risk, Cyber Risk and similar risk categories within an organisational context.
1. Information Bias
Clearly we are influenced by the data we are confronted with most. Nearly one third of the media reporting is focussed on terrorism, while actual deaths through terrorism do not even register as a percentage on the graph. With more than 5% of searches relating to terrorism, people evidently perceive this threat as drastically greater than it is.
For risk professionals this is a challenge. When running Risk Control Self Assessment (RCSA) processes, we will likely gather a drastically exaggerated representation of the risks with the most supporting information. For example, if an Intrusion Detection System (IDS) sends process owners regular notifications about blocked intrusion attempts, the probability of risks relating to network intrusions will be significantly over-represented in risk assessment outcomes from this audience. Risk managers need to ensure the objectivity of the gathered risk information.
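The perception gap underlying the “Charting Death” comparison can be sketched as a simple ratio of attention share to actual share. The figures below are illustrative placeholders, not the actual CDC or search data from the analysis:

```python
# Sketch of the perception-gap idea: compare the share of attention a cause
# of death receives (here: search volume) with its share of actual deaths.
# All numbers are hypothetical, chosen only to mirror the article's pattern.
actual_share = {                      # share of actual deaths (illustrative)
    "heart disease": 0.30,
    "lower respiratory disease": 0.10,
    "terrorism": 0.0001,
}
search_share = {                      # share of searches (illustrative)
    "heart disease": 0.04,
    "lower respiratory disease": 0.001,
    "terrorism": 0.05,
}

def perception_gap(cause: str) -> float:
    """Ratio of attention to actual risk; > 1 means the risk is over-perceived."""
    return search_share[cause] / actual_share[cause]

for cause in actual_share:
    print(f"{cause}: {perception_gap(cause):.2f}x")
```

With these placeholder numbers, terrorism comes out hundreds of times over-perceived while heart disease is under-perceived, which is the same shape of distortion an RCSA fed by IDS notifications would show for network intrusion risks.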
2. Comfort Bias
Heart disease is still one of the main causes of death in the United States. While there is a strong genetic predisposition to suffering from heart disease, diet and exercise are the leading factors in preventing heart issues. A very simple and uncomfortable truth that few are willing to embrace in daily life. Consequently, fewer than 5% of searches relate to heart disease, and media coverage is even lower.
In the organisational context there may be obvious truths as well, where the impact or the potential mitigations are so uncomfortable or undesirable that they are underrepresented in risk assessment outcomes. Think of the banking world’s risk assessments prior to the collapse of Lehman Brothers. There were certainly many who saw the Key Risk Indicators move, but the consequences of a systemically relevant breakdown were so dire that they were not appropriately reported or factored into risk exposures. Which risks were so uncomfortable that they were not reported in your last RCSA?
3. Knowledge Bias
To quote the infamous press conference held by then-US Secretary of Defense Donald Rumsfeld, there are “unknown unknowns”, creating a knowledge bias in risk results. In our initial “Chart of Death”, lower respiratory disease is responsible for more than 10% of deaths, yet hardly anyone searches for it and hardly anyone reports on it. Without googling it myself, I do not know what this disease is, so I probably would not put it in a risk register of things that are likely to kill me.
The same is true for your operational risk management. Risks that you are not immediately aware of will not be represented in the risk register. This is, of course, a fundamental flaw in risk management processes.
4. Removing Bias from Risk Management
So how do we move closer to a bias-free risk evaluation? At Alyne we have put quite a bit of thought into this. Risk processes should be a continuous activity that happens without creating huge amounts of effort, yet the quality of the risk information needs to remain high. This is how we address bias in our risk process.
- Fighting the information bias
Information bias is more likely when the format of the risk assessment is unstructured, i.e. everyone is asked to contribute any risk they can think of. In Alyne you can capture free-text risks, but you can also select from a risk universe of more than 750 pre-defined risks. Based on the assessment outcomes, suggestions as to which risks might exist are presented to the risk manager. While a person may still actively raise a risk based on information bias, the user is also presented with additional suggestions to complete their view and counteract the bias.
- Fighting the comfort bias
In Alyne’s risk analytics, we provide a pre-qualification of risks based on risk assessment outcomes. While you can naturally remove risks that are not applicable, we also require the user to provide a reason for removing a risk from the selection. A user may still choose to remove a risk out of “comfort”, but at least an auditable reason must be given. The comfort bias in “free text risk raising” is much higher.
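The mechanism of forcing an auditable reason for every removal can be sketched as follows. This is a minimal in-memory illustration of the principle; the class, field names and risks are hypothetical and not Alyne’s actual data model:

```python
# Minimal sketch of auditable risk removal: a risk can only leave the
# pre-qualified selection if a non-empty reason is recorded in an audit log.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class RiskRegister:
    risks: set                      # pre-qualified risks still under review
    audit_log: list = field(default_factory=list)

    def remove(self, risk: str, user: str, reason: str) -> None:
        """Remove a risk from the selection; refuses removal without a reason."""
        if not reason.strip():
            raise ValueError("An auditable reason is required to remove a risk")
        self.risks.discard(risk)
        self.audit_log.append({
            "risk": risk,
            "user": user,
            "reason": reason,
            "at": datetime.now(timezone.utc).isoformat(),
        })

register = RiskRegister({"network intrusion", "vendor insolvency"})
register.remove("vendor insolvency", "jane.doe", "No critical single-source vendors")
```

The point of the design is not to make removal impossible, but to make comfort-driven removal visible: every deletion leaves a reviewable trail.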
- Fighting the knowledge bias
“Unknown unknowns” are more likely if the individual does not have visibility or knowledge of the full context of a risk. The more complex the organisation gets, the more difficult it becomes to have full insight into the context of any given risk. At Alyne we have addressed this by modelling so-called “Risk Trees” that articulate the causal effects of a control weakness or an elevated risk. One risk can lead to multiple other risks until the tree reaches its final “leaves” of ultimate risks. Users may see risks that were previously “unknown unknowns” and can now be evaluated in context.
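The idea of walking from a control weakness down to the ultimate “leaf” risks can be sketched with a simple tree traversal. The structure and risk names below are illustrative assumptions, not Alyne’s actual Risk Tree model:

```python
# Hedged sketch of a "Risk Tree": each risk maps to the downstream risks it
# can trigger; risks with no children are the ultimate ("leaf") risks.
RISK_TREE = {
    "weak access controls": ["unauthorised access"],
    "unauthorised access": ["data breach", "fraudulent transactions"],
    "data breach": ["regulatory fine", "reputational damage"],
}

def ultimate_risks(risk: str, tree: dict) -> set:
    """Walk the tree from a given risk down to its leaf (ultimate) risks."""
    children = tree.get(risk, [])
    if not children:              # no downstream risks: this is an ultimate risk
        return {risk}
    leaves = set()
    for child in children:
        leaves |= ultimate_risks(child, tree)
    return leaves

print(ultimate_risks("weak access controls", RISK_TREE))
```

Starting from a weakness such as “weak access controls”, the traversal surfaces downstream risks like regulatory fines that the assessor might otherwise never have connected to the original control, turning a few “unknown unknowns” into risks that can be evaluated in context.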
The nature of operational risk will always remain dependent on expert opinion and people participating in risk processes to the best of their knowledge. It will also remain human nature to be bad at evaluating risk intuitively. As risk professionals it is our task to understand these biases and support people with tools and expertise to develop a more unbiased and accurate representation of operational risk.