This Advisor enumerates classic leadership mistakes made in turbulent times: approaches and behaviors that are easily fallen into but must be avoided. It also provides guidance on what leaders can do instead to adapt to this turbulence.
Avoiding Classic Mistakes
Turbulent times don’t always bring out our wisdom; sometimes just the opposite:
Paralysis. Two classic mistakes often go side by side: stopping the act of making decisions and staying the course on decisions already made. It’s the epitome of intellectual laziness and reflects the mistaken belief that any change is inherently riskier than stasis. It’s risk non-management trying to disguise itself as risk minimization. Back in the boat, it’s staying on course for your original destination without looking for an intermediate harbor and keeping all sails set even as the shifting winds pick up to gale force.
The sunk-cost trap. There’s a natural human instinct not to walk away from something you’ve already put a lot of resources and effort into, even if the likelihood of achieving the goal falls precipitously. It’s a trap; the time and money spent are forever gone. Even if you only need another X to get a return of Y: if you wouldn’t now start something new that cost X to get Y, you should cut your losses. It’s hard psychologically and can be politically fraught, but it’s correct. People who don’t understand this, or are loath to apply it, are always welcome at friendly poker games.
Indiscriminate belt-tightening. It makes sense to get down to fighting weight when you need to be nimble, and every organization puts on weight over time when things have gone well. But across-the-board cuts (“We all must share the burden …”) are like losing weight by getting rid of both fat and muscle. We know better for our bodies, but we don’t always seem to for our organizations — again, intellectual laziness.
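The sunk-cost rule above can be sketched as a one-line decision function. This is a back-of-the-envelope illustration, not anything from the Advisor itself; the function name, parameters, and dollar figures are all hypothetical.

```python
def should_continue(remaining_cost: float, expected_return: float,
                    hurdle_rate: float = 0.0) -> bool:
    """Decide whether to keep funding an in-flight project.

    Note what is deliberately absent from the signature: money already
    spent. Sunk costs cannot be recovered and must not influence the
    decision. The only question is whether spending another
    `remaining_cost` (X) to get `expected_return` (Y) would clear the
    hurdle if this were a brand-new project starting today.
    """
    return expected_return > remaining_cost * (1.0 + hurdle_rate)

# The $2M already spent never enters the calculation; only the
# forward-looking numbers count.
print(should_continue(remaining_cost=1_000_000, expected_return=900_000))    # → False
print(should_continue(remaining_cost=1_000_000, expected_return=1_500_000))  # → True
```

The point of the sketch is the signature, not the arithmetic: if a past expenditure can’t appear as an argument, it can’t bias the answer.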
Even when we avoid these obvious traps, others litter our path:
Freezing up in the face of seemingly overwhelming disaster. The hyper-urgent actions taken to keep the financial system from imploding in 2008 were often things the principals would never have wanted, or even dreamt of, doing, but they emerged as the least awful alternatives. “Extraordinary times call for extraordinary measures,” as then-US Federal Reserve Chair Ben Bernanke said.
Overly rigid adherence to abstract principles or standard procedures. The words “always” and “never” can be dangerous when taken too literally. Sometimes, decision makers need to be pragmatic, possibly even devious, though not unethical. This is particularly true in dire situations, where “I did it by the book” is no defense for letting a disaster get worse. As boxer Mike Tyson said, “Everybody has a plan until they get punched in the mouth.”
Analysis paralysis. This may be a slight exaggeration, but I suggest that spreadsheet software is both the best and the worst thing that ever happened to decision making. Why it’s the best is obvious. It allows us to answer what-if questions by easily building straightforward models of complex situations. Why it can be the worst is what happens when we turn the spreadsheet jockey loose before we think through the model. (Yes, it’s like computer programming that way; it is computer programming!) Mucking about with the spreadsheet sucks up all the energy; there are only a few scenarios worth modeling, but the ability to model thousands provides a kid-in-a-candy-store experience.
Turbulence and the need for quick decisions don’t change the need for clear thinking, but they modify the goal and style. The thoroughness needs to focus on:
Favoring future-proof decisions. When turbulence obscures the future, the primary goal of investments is that they pay off under a broad set of eventualities. A high-payoff investment that seemed like a no-brainer under stable conditions may not look so good if the payoff depends on too many uncertain things going right.
Reducing the scope of decisions. Plans for investments need intermediate checkpoints to the greatest extent practical, with a Plan B or C thought out and ready to be embraced with minimal disruption if changing circumstances suddenly make Plan B or C look better than Plan A. There might also be a Plan Z to stop cold turkey, or a Plan Y to pause and mothball what’s been done in an orderly way for possible revival. Such course corrections are also teachable moments. Could we reasonably have been expected to anticipate the new circumstances, and if we didn’t have a Plan B, why not? Is there a pattern of analysis or behavior that could be improved for the future?
Revisiting prior decisions. Economist Paul Samuelson nailed this when criticized for changing his mind over time: “When the facts change, I change my mind. What do you do, sir?” No one can predict which facts change and when, but that’s no excuse for not paying attention when they do. Some prior decisions will stand. Others will become very questionable, raising the possibility of abandoning, limiting, or modifying them. All are fair game; there should be no sacred cows.
Proactively seeking exposures. Assuming there’s already attention paid to competitors, customers, and technologies (i.e., the risk sources that apply in placid times as well as turbulent), turbulent times demand attention to the sources of turbulence described in the introduction. This attention should not be restricted to investments in progress. Decisions made years or decades ago could come back to haunt us. Examples include critical facilities in areas where hundred-year floods have become, if not annual, a lot more frequent; water-dependent operations in long-standing drought-stricken areas; and critical components procured from countries with unstable and potentially hostile governments.
In the short term, hope may be the only recourse, which is exactly why contingency plans and workarounds need to be developed. Back in the boat, this is like paying attention to the weather forecasts and, when gales are coming, plotting a new course that brings you closer to harbors in case you need them, even if they add time to the voyage.
Software suites like enterprise resource planning and supply chain management have enabled levels of optimization unthinkable not that many years ago, but there is such a thing as overoptimization. It’s when the optimal result requires a lot of separate steps to go right and there aren’t straightforward workarounds when they don’t. This is particularly applicable to supply chains, as we learned during the pandemic. But other events can throw a monkey wrench into an optimized process: strikes, fires, and acts of God such as storms or earthquakes. These can happen to upstream vendors we may not even be aware of, as we learned with the Fukushima nuclear disaster in 2011.
A bit of redundancy or inefficiency can be thought of as a form of insurance. No sane person bemoans what they spent for fire insurance last year because their house didn’t burn down. From a more mathematical viewpoint, we should think of optimizing the expected value rather than the best case when probabilities of problems and glitches increase, as they do in turbulent times.
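The expected-value point can be made concrete with a toy comparison of two supply setups, one lean and fully optimized, one carrying redundancy as insurance. The configuration names, probabilities, and payoffs below are all hypothetical illustrations, not figures from the Advisor.

```python
# Each configuration is a list of (probability, payoff) outcomes.
# "Lean" is best when everything goes right; "buffered" pays a small
# insurance premium (a slightly worse best case) to cap the downside.
lean     = [(0.80, 100.0), (0.20, -150.0)]  # optimized: great unless it breaks
buffered = [(0.80,  85.0), (0.20,   60.0)]  # redundant: disruption still pays off

def expected_value(outcomes):
    """Probability-weighted average payoff across all outcomes."""
    return sum(p * payoff for p, payoff in outcomes)

def best_case(outcomes):
    """Payoff if everything goes right."""
    return max(payoff for _, payoff in outcomes)

# Judged by the best case, the lean setup wins (100 vs. 85) ...
print(best_case(lean), best_case(buffered))
# ... but once the 20% chance of disruption is priced in, the buffered
# setup comes out well ahead (expected value ≈ 80 vs. ≈ 50).
print(expected_value(lean), expected_value(buffered))
```

As the probability of glitches rises, the gap between the two criteria widens, which is the sense in which turbulent times shift the right objective from best case to expected value.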
[For more from the author on this topic, see: “Navigating Business Through Turbulent Waters.”]