Issue 5.12 | May 2020

In this Article: how to make good leadership decisions within the confusion of a crisis. The second in a series on Leadership in a Time of Crisis.

by Jonathan Wilson

We crouched alone in our eagle’s eyrie, cowed before the noise and chaos of the earthquake that had struck barely a minute before. Even as the ground heaved, and I held on for dear life, I stared silently at the maelstrom of nature’s violence, as if patience would provide me with an insight. Until fear wrenched a cry out of my throat. It was that fear that drove us to safety; but as I described in Issue 5.11, when crisis hits, fear tunnels our vision, drastically shortens our sense of time, and reduces our awareness of others (or at least those outside our immediate group). It focuses each of us on our own survival.

To lead our organizations effectively through a crisis, we need to counter the tunnelling and isolationist effects of fear. We need to find a way to the kind of clarity that brings calm, and that leads to better decisions. This requires us to engage in two totally counter-intuitive tasks. The first is to slow down and think. In fact, a silent stare is exactly where we and our teams need to start. The second is to seek to understand the present, rather than try to predict the future. And, as you will see, we need to set aside our love for big data and statistical modelling. This is not its time.

One Brain, Two Systems

In any situation we face, our brain provides us with two ways of thinking. Daniel Kahneman, Nobel prize-winning economist and psychologist, calls these System 1 and System 2.* 

System 1 is fast. It is our intuition. It is a specialist in pattern recognition, which it performs fluidly and quickly, drawing on previous observations and experiences our conscious mind may not even remember. It operates efficiently and places low demands on our energy reserves. 

System 2 is slow. It is the part of our brains that deliberates, reflects and reasons its way to a decision. It can find patterns too, but only by sheer dint of effort. To be effective, System 2 requires self-control and focus. This intense concentration of the mind demands a lot of energy from us, so we tend to be lazy with System 2. Thinking hard is just too much work.

System 1 is very effective when it reads signals and patterns accurately. In a crisis it is our brain’s “first responder”. It is the very nature of a large crisis, however, that information comes at us in a deluge of confusing signals. Some signals correspond well to past experience and knowledge, some are quite alien. The toughest are those signals that correspond in some way to what we have seen before — but not quite. Such signals have a very jarring effect on us. We call this cognitive dissonance. 

An example of this is the story I tell here, of a Yali tribesman who, living in roadless mountains, had only ever seen small planes. He had flown in them too, between mountain villages. Then, on his first trip to a town, he saw a passenger-carrying vehicle with windows, doors, an engine, seats, and wheels. All these familiar signals told him he was seeing a plane. Except that he wasn’t. And when the car, which is what it was, drove down a twisting road with him in the back seat, and never got airborne, he became increasingly alarmed. Cognitive dissonance (and fear!) gripped his mind as familiar signals came together in a disconcertingly unfamiliar pattern.

In a crisis we experience a deep-seated drive to reduce dissonance and ease the discomfort, and often distress, that comes with it. We want to reacquire the control we thought we once had, and we want to do it quickly. We want speed and certainty. Crisis has, by its very nature, made both impossible. 

This drive to regain control is so powerful that we are at great risk of taking not one but two very appealing, and very dangerous, false turns. 

Running Fast Downhill 

 

Fast thinking takes us faster out of control, not further into control.

The first false turn is to follow our instincts and depend on fast thinking to get fast results — to make snap decisions based on our intuitions. 

There is a place for fast thinking. For example, to an experienced fire crew a burning building is an emergency, but it is not a crisis per se. They have encountered fires before, and train incessantly to respond to them. A burning building may be a dangerous, even complex, situation, but it is a familiar one to those with experience. In such a situation, the fire crew’s intuitions are extremely valuable and, more likely than not, dependable. 

However, a major crisis pushes us into large amounts of unfamiliarity. In these situations our intuition delivers, at best, mixed results. Now is the time for System 2 to show its strengths. Now is the time for slow thinking.

Unfortunately, the jarring signals that come with a crisis, and the cognitive dissonance that follows, pull us away from slow thinking. Our mind is already tired from wrestling with cognitive dissonance. It doesn’t want any more work. Under this pressure we push ourselves towards a quick assessment and a quick solution that eases our dissonance. 

In my mid-twenties I led a party of Canadians on an adventure in a remote and mountainous part of Indonesia. After an overnight trek, one of them, whom I’ll call Sally, regaled us with a tale that, only because it ended well, left us howling with laughter. 

Descending from a high pass, Sally found herself quickly picking up momentum. This alarmed her, as the trail was steep and gravelly. She needed to keep her footing. Her guide quickly came to her aid and said, with some urgency, “Sally, pelan-pelan!” Surprised by this instruction, Sally increased her pace. The guide jogged beside her, obviously concerned. “Sally, pelan-pelan!” So Sally ran even faster, now frightened at the dangerous pace of the descent and not understanding why speed should help her on the steep downhill. To make it worse, the guide was now hurtling downhill alongside her, shouting: “Pelan-pelan, pelan-pelan!” Relief only came when the trail levelled out. Naturally, this hair-raising, high-speed descent left Sally puzzled and distressed. 

I asked Sally, “What did you think the guide was telling you to do?” 

“He was telling me to go faster, right? Doesn’t pelan-pelan mean ‘faster’ in Indonesian?” 

Poor Sally. The group roared with laughter: “No, Sally! Pelan-pelan means ‘slowly’! The guide was telling you to slow down!”

When we allow our distress to lead us into fast thinking, we do the cognitive equivalent of running fast down a steep and dangerous trail. As in Sally’s story, this takes us faster out of control, not further into control. 

Gaining Control by Slowing Down

To avoid this false turn, we need to consciously press pause on our intuition. It’s a great tool, but this is not its moment. Slowing down gives us:

  1. The time to disconnect from noisy signals that demand our attention, but which don’t give us accurate insight. The media — whether social media or the press — is a notorious contributor to this noise, as it is biased towards the novel and sensational. For example, when the media tells harrowing stories of death due to COVID-19, it captures our attention but gives us little useful insight into the pandemic and how to respond to it. After all, every day, citizens die in other gruesome ways, such as in car accidents, but we don’t get blow-by-blow accounts, because the causes are familiar.
  2. The time to gather adequate data, analyze it and act upon it. If we make long-term decisions in short time-frames, we place a load on our fast thinking that it cannot bear, and we will make decisions in which we treat cars as if they were planes, where assumptions based on past patterns govern our response to new patterns. We will make decisions in which we over-simplify in an attempt to soothe our dissonance. Without meaning to, we will accelerate further into fear-based thinking and behaviour.
  3. The time to pull in key stakeholders, and not rush them. Short-cut the emotional and relational component of decision-making and you simply set up landmines for yourself in the future. 
  4. The time to access diverse sources of insight, and deliberately wrestle with “incompatible interpretations” (as Kahneman puts it). These introduce rigour to our thinking and strengthen our analysis. In the current pandemic it is incredible to see people hitting out at each other’s ideas about how to best respond to the crisis. Five months in, the situation remains far too complex for anyone to claim certainty about their solutions. There needs to be a readiness for a long road of learning, given the overwhelmingly complicated interplay of virological, socio-cultural, economic, and political factors, let alone multiple other variables. There needs to be a tolerance of the uncertainty that inevitably comes while we wait for the debate that drives learning to do its work. Time is required for our understanding to improve.
  5. The time to process complex factors. As Kahneman says, “You think with your body, not only with your brain.” Our brains can’t instantly unscramble or synthesize in a situation of complexity. To think well — reflectively, thoughtfully, and with powerful outcomes — we need sleep, nutrition, exercise, and time to reflect. We need creative stimuli, debate and discussion. Only when all of these are given their place will our slow thinking do its work well, and will clarity eventually come.

Which brings us to the next point.

Looking for Certainty in All the Wrong Places

 

The leader’s task is to make sense of what’s going on in the present. When we do so, we prepare our intuition for the work it will have to do when the future arrives.

In a crisis, desperate for certainty, we seek the seemingly solid ground of prediction and probabilities. This is where the next false turn lies. There is no such solid ground. It has been widely noted that the statistical modelling in the current pandemic has been consistently unreliable across the board (and nobody who specializes in modelling is surprised by that). A type of certainty is available to us, but it cannot be offered by predictions based on data analysis. 

It should be clear by now that, in a major crisis, regaining control over external factors is a fool’s errand (nor is it the time to initiate a “disruption”; we are too late, because the crisis is the disruption). The quest for predictions and probabilities, for control over the future, is equally mistaken. This does not mean we should neglect forward thinking. Instead, we should come at it from an unexpected angle. The best way to be ready for the future is to understand the present, and the past that led to it. 

In their incredibly timely, just-published book, Radical Uncertainty, a leading British economist, John Kay, and a former governor of the Bank of England, Mervyn King, argue that a major crisis is not the time to try to function like computers, which are designed to work with known variables (what statisticians call stationary factors) and deliver optimal solutions. That’s fine for sending probes to the outer solar system, they argue, because in such situations we actually know the necessary variables and can factor them in mathematically. To predict the future, they point out, is like trying to estimate the probability of the invention of the wheel: to come up with that probability, you would effectively have to invent the wheel first. We can’t predict what lies outside the scope of our imaginations. 

Instead, say Kay and King, we need to focus our slow thinking on understanding the present — or as they put it, we need to figure out “what is going on”. Unlike computers, human brains are primed for adaptation, and that’s the work we should put them to. Humans do not rationally calculate for optimal and maximal outcomes, as economists used to think; instead we seek and make meaning out of circumstances. Some speak of our tendency to make meaning as if it deserves condescension or pity. Kay and King argue that this is what makes us exceptional among species.

Thus the leader’s task is to make sense of what’s going on in the present. When we do so, we are in fact developing a deep familiarity with the very realities from which the future will emerge. This deep insight prepares our intuition for the work it will soon have to do as the future arrives.

As Kay and King say, “We can consult statistics on the number of pedestrians killed crossing the road …. but that does not help us much in deciding whether to cross the road.” For one, a statistical model can’t account for the enormous number of variables contained in a person’s decision to cross the road. Paying attention to what is going on, we note, perhaps, that the decision-maker is elderly, female and unfit. Perhaps she is in a strange city, perhaps the road is dry, perhaps it is wet, perhaps it is empty because it’s midnight. On their own, however, even these observations don’t help us anticipate what will happen. We need to understand her motives — the human meaning behind what she is doing. 

As Kay and King point out, when making life decisions, humans are not calculating for simple optimization or maximization. In deciding whether or not to cross the road on a Friday afternoon, the elderly woman may indeed factor in her age, health and mobility, and the fact that the home-bound traffic is heavy, but if it gets her to her long-lost brother, she may well cast prudence to the wind and hobble across.

Preparing for the Future by Making Sense of the Present

To discern meaning in the present, I recommend two particularly powerful exercises. The first uncovers the meaning of who you are as an organization. The second uncovers the meaning of our times.

Uncovering Who You Are

Draw recent and seasoned employees into a study of your organization’s story. The present has a story behind it. Kay and King advocate a method I have used for twenty years in strategy work, with consistently powerful results: narrative analysis. In a workshop, have a cross-section of staff recall key events or themes from previous years — covering at least ten years, if possible. Post the recollections on a timeline. Then dig into the story you see there. Look for patterns that reveal truths about your organization by answering these questions:

  • What were we consistently good at?
  • What have we been consistently weak at?
  • What notable factors were in place whenever we achieved a major success (cultural, operational, market conditions, networks, capabilities, etc.)?

This method lends itself to reliable insights, partly because it circumvents the bias that interferes when we dive straight into diagnosis, e.g. in a SWOT exercise. More importantly, when we are looking for solid insight, history is much clearer than the future. Hindsight is always 20/20, as the saying goes. The saying, however, usually implies that hindsight comes too late to be useful. Not true. The past tells you both inspiring and uncomfortable truths about who you are as an organization, and it tells you how these set you up for what’s coming. Importantly, your history also reveals what your organization, up to this point, has thought was important. It provides incontrovertible evidence as to what meaning you were trying to make. To get at this, ask these questions of your story:

  • What did we sacrifice time, effort, money and even relationships to achieve?
  • What did these choices reveal about what we believed was most important — what do they reveal about what we actually valued (as opposed to whatever is listed on our boardroom posters or website’s About page)?

In a crisis you now have the opportunity to ask yourselves, are we happy with what we have been working for? Do we want to be motivated by the same assumptions and convictions going forward? Or is it time to articulate a better reason for existing?

Uncovering the Meaning of Our Times

Study the story of your organization’s environment. With a structured approach this can be a less daunting task than you may expect. A common structure for such environmental scans, as they are called, is the PESTEL framework, which studies an organization’s context using the following categories: political, economic, social, technological, ecological and legal (regulatory). A simpler alternative is to focus only on the categories of industry and market. The usual approach is to identify key themes and trends in each category. The resulting list, however, is not particularly useful until we uncover the significance of the trends we identified: in other words, until we have discerned their meaning. To do this, we have to ask questions that focus on human behaviours:

  • What human behaviours, whether individual or collective, personal or institutional, characterize each theme or trend within the PESTEL framework? 
  • To what external forces are these behaviours responding?
  • What is the history behind those behaviours, and does that history reveal a trend or trajectory?
  • What internal assumptions — what individuals or society believe is important or true — do these behaviours reveal?
  • What patterns, trajectories, or divergences in human behaviours can be observed across the different areas, e.g. political, social, regulatory, etc.?

When we answer these questions, we come closer to finding out why people are behaving the way they do, which gives us much greater clarity about how they are likely to behave in the months and years to come.

Enough With Slow Thinking, When Can We Use Our Computers?

 

The less time we spend articulating and refining the underlying assumptions that inform our models, the less useful they are.

When we take the time to think carefully about the meaning of things — the significance of how we live our lives, individually and organizationally, and the significance of our times — we root ourselves in the most stable ground available in a time of crisis. This may leave us wondering about the proper role of the more glamorous crisis-management tools we like to go to — scenario planning and statistical modelling. Both are in fact extremely valuable, but only when we understand what they are for. 

Scenario planning is not a predictive tool. It can’t be. Instead, it trains your team to think beyond what it normally considers, and thus prepares it for change. To be relevant, however, scenario planning needs to be preceded by the kind of deep analysis described above. The stronger your analysis of the present, the stronger your extrapolations from the present into the future. Used in this way, scenario planning enhances your team’s adaptive capabilities. 

Similarly, statistical modelling is a powerful tool, but its power depends on how certain we are about the data and parameters we put into it. Modelling is simply computer-powered scenario planning; in a crisis, it too serves only a preparatory, not a predictive, purpose. 

It is crucial that we understand, however, that statistical models are built not only upon assumptions about, for example, how a virus transmits, but also upon our prior, value-laden assumptions about what matters, about what’s important, in life. This is why we are witnessing, at the popular level, online squabbles over “lives vs economics” and lockdowns vs social distancing. Social media is awash with calls to depend only on “science”, but there isn’t enough science to be definitive about pandemic interventions, and there won’t be for some time: because, before it can offer a library of “facts”, science is a method. It will take considerable time for the scientific method to complete the task of understanding the virus and how it works, let alone to find a vaccine; we may never know all we’d like to, and may never find a vaccine at all. In the meantime, authorities make decisions rooted in what they assume is important or valuable, and the rest of us form our opinions based on what we assume to be important and valuable.

The less time we spend articulating and refining the underlying assumptions that inform our models, the less useful they are. For example, when a company wants to measure the risk of a new venture, or a government wants to measure the risks associated with its pandemic strategy, there is no such thing as an objective standard of risk to incorporate into those models. Risk is whatever a given social group, in its particular culture and circumstances, thinks risk is (as per the example of the elderly lady wanting to cross the road). This is why it is so important that we learn to think deeply about the meaning of things. What a model measures as risk is only as good as the rigour we have put into considering what needs protection, and why. 

The current crisis vividly illustrates this. A case can be made that one major motivator behind the West’s lockdowns has been a cultural assumption that the risk of pain and death, especially from a cause so strange and horrible, is to be avoided at any cost (fear, and not just culture, of course plays its part in that motivation). We can take this option seriously only because we are, collectively speaking, immensely wealthy. In doing so, however, we are quickly finding that “any cost” includes not just massive government indebtedness, but millions who may die because their non-COVID-19 illnesses have not been caught in time (150,000 per month in the US from cancer, according to this article), while in the global South, ruined supply chains and under-resourced health and food infrastructures push entire regions into disenfranchisement, poverty, disease and starvation. To date, at least 360,000 have died with or from COVID-19, while millions have been saved through various interventions. Yet it is also the case that millions have, by some of these same interventions, been sentenced to premature death, and many millions more, to misery. 

Only time will tell what interventions were necessary and why, but whatever the outcome, it will demonstrate how profoundly our assumptions inform our choices. It is not science, but humans working with their assumptions, and funnelling as-yet-incomplete scientific data through those assumptions, that has taken each country down its particular path. That is unavoidable. Business leaders across the world are in the same boat. In a crisis, we make high-risk decisions in our effort to tackle a highly uncertain future. There is no way of reducing this uncertainty by gazing into the future more intently. Instead, we need to root our decisions in a sober assessment of the present and what it actually means, with clarity about what we believe is most important, rather than operating by ill-prepared instinct. Now is a time for slow thinking.

Beyond the crisis, when time eventually reveals the effectiveness of our strategic decisions, it will not tell us how good our modelling was, but how wise we were in deciding what to model. It will not tell us how nimble we were, but how stable we were — as leaders, as teams, and, in the case of the pandemic, as a society.

*****

Another leadership insight from www.leadbysoul.com.

Leadership by Soul™, Trademark and © Jonathan Wilson, All Rights Reserved.


*See Thinking, Fast and Slow, by Daniel Kahneman.

Photo Credit: iStock
