Advisor

The Ingredients of Good Decision Making — and How They Can Go Wrong

Posted November 17, 2022 | Leadership

High winds and waves are coming at us from every direction at once. But we still have to sail our boats. It’s just different and harder. More decisions are required, even as the bases for making them can shift in a moment, meaning we need flexibility and nimbleness as never before. We need leaders with more than just knowledge, insight, and skills; they need the temperament to remain calm and collected as they apply analysis and judgment to high-stakes decisions. This Advisor reviews the ingredients of good decisions and explores how decision making can go wrong.

Ingredients of Good Decisions

The objective of any decision in times placid or turbulent is to achieve the best possible outcome based on what we can reasonably be expected to have known and understood at the time we made the decision. Getting to a good decision requires a mix of science and art with a touch of alchemy. Let’s deal with the science part first:

  • Facts, obviously, but they must be carefully distinguished from opinions, beliefs, assumptions, conventional wisdom, and hopes. Not that those aren’t sometimes useful, but we need to recognize them for what they are and treat them accordingly. Specific facts we need include the situation, our options, risks, rewards, and constraints, as well as urgency and the consequences of inaction.

  • Insights developed from personal and organizational experience and knowledge of historical analogues and precedents.

  • Logic and reasoning are essential in making sense of how the things we know, believe, and assume interrelate and apply to the situation at hand. Game theory, a branch of mathematics, can be helpful when several options are available and you want to, for example, minimize how bad the worst case could be (a maximin approach), which is rarely the same choice as maximizing how good the best case could be; the brief sketch after this list illustrates the difference.
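
To make that maximin-versus-maximax distinction concrete, here is a minimal sketch in Python built on an entirely hypothetical payoff table (the option and scenario names and numbers are invented for illustration, not drawn from any real case). It shows that the option protecting the downside is usually not the option offering the biggest upside.

```python
# A minimal, hypothetical sketch: a made-up payoff table for three options under
# three scenarios, showing that the option with the least-bad worst case (maximin)
# differs from the option with the best best case (maximax).

payoffs = {
    "bold expansion":  {"boom": 90, "flat": 10, "recession": -40},
    "steady course":   {"boom": 40, "flat": 25, "recession": 5},
    "aggressive cuts": {"boom": 15, "flat": 15, "recession": 10},
}

# Maximin: pick the option whose worst-case payoff is highest (protect the downside).
maximin_choice = max(payoffs, key=lambda opt: min(payoffs[opt].values()))

# Maximax: pick the option whose best-case payoff is highest (chase the upside).
maximax_choice = max(payoffs, key=lambda opt: max(payoffs[opt].values()))

print("Maximin choice:", maximin_choice)  # "aggressive cuts" -- worst case is +10
print("Maximax choice:", maximax_choice)  # "bold expansion" -- best case is +90
```

With these made-up numbers, protecting the downside points to "aggressive cuts," while chasing the upside points to "bold expansion." Which decision rule to apply is itself a judgment call about risk.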

These ingredients are necessary but not sufficient for good decision making, and this is where art gets into the mix. Emotions matter, our own and those of others. There’s a reason Captain Kirk was in charge of the Enterprise rather than Mr. Spock. Some art ingredients to consider:

  • We need to know ourselves — our tendencies, style, biases, self-image, and the image we want to project, plus our own culture’s beliefs, norms, and expectations and how those factors may affect our decisions for good or ill.

  • We need to understand our environment — the people whom our decision will affect, the cultures in which those people live and work, and the politics (i.e., stakeholders who may or may not be willing to influence people and nudge culture).

  • We need to exercise judgment, which is what good decision makers use to bring all these ingredients (calling them “tools” would suggest they’re more mechanistic than they really are) together in the right proportion. The critical element is pragmatism, which draws on both science and art; inquisitiveness and healthy skepticism are essential in making judgments. Like any art (and it is one), good decision making is hard to teach. Some fortunate people are naturals at it; others can learn from mentors, examples, and mistakes; still others never quite get the knack.

How Decision Making Goes Wrong

Decision making in practice is not just about doing things right; it’s also about avoiding common ways of going wrong. The following are typical pitfalls:

  • Failure to verify supposedly factual information — garbage input to decision making, garbage decisions out. The fact that IT can make information look highly authoritative, replete with clever graphics, doesn’t make that information any more intrinsically reliable than a scribble on the back of an envelope.

  • Failure to challenge received opinions, assumptions, and beliefs — our own or those of others. This could fill several books, and has, but here are a few examples: the weapons of mass destruction that weren’t in Iraq, Microsoft failing to recognize the central importance of the Internet even as late as 1995,1 and the reliance on “blue wall” states coming through for US presidential candidate Hillary Clinton in 2016.

  • Confirmation bias, meaning subconsciously screening out information that doesn’t agree with what we believe to be true. This is a particularly insidious form of the previous pitfall, because we’re typically not even aware of doing it. No one is immune; the only counter is constant examination and reflection to sort what we really know from what we or others thought true.

  • Closing off options by deciding prematurely (or tardily). Decisive people fear tardiness much more than the opposite, making prematurity their more likely pitfall. We’ve all done things that have gone wrong and then said, “If only I’d known.” Too often, we could have known but did not want to take the time to learn more, or perhaps we at some level didn’t want to learn something that would dissuade us from making the decision we wanted to make — a conscious form of confirmation bias. (It could even be because the meeting called to decide ran out of time before it ran out of arguments.) Iraq comes to mind, where the haste to have the war over before the brutal Iraqi summer caused the premature cessation of the United Nations inspection, which had up to then revealed no weapons of mass destruction, as in reality none existed. Robert Rubin did not get to run Goldman Sachs or become US Treasury Secretary by being indecisive, but he always avoided making decisions before he had to.

  • Failure to learn from history and overlearning from history. “This time it’ll be different” and “This time it’s just the same” are two sentences rarely heard from good decision makers. This time is this time. Nuances matter.

  • Machismo, meaning making decisions quickly to demonstrate strength and cojones. Bullying and browbeating are often involved as someone exercises political clout to force his or her will, declaring that “failure is not an option.”

  • Groupthink, in which participants get caught up in mutually reinforcing enthusiasm, drowning out questions and voices of caution.

  • Doubling down. There’s a saying that when you find yourself at the bottom of a hole, you should stop digging. Too often, we instead switch to a bigger shovel (e.g., adding staff to a troubled project, where they will only trip over one another) rather than understanding and adapting to the new reality.

  • Keeping decisions too close to the vest even when secrecy is not critical, thus losing out on potential sources of knowledge and insight and reducing the breadth of ownership.

Unfortunately, an optimally made decision is not enough to guarantee a good outcome, especially when times get turbulent. There’s also luck. Some well-made decisions inevitably prove wrong for reasons that could not have been anticipated, and some badly made ones back their way into fortuitous success. Both present “teachable moments” if we let them. More often, we punish those responsible for well-made decisions that don’t pan out and reward the alchemists lucky enough to have gold paint spill on their lead.

Note

1 Bill Gates’s book The Road Ahead, published that year by Viking Penguin, barely mentioned the Internet.

[For more from the author on this topic, see: Navigating Business Through Turbulent Waters.]

About The Author
Paul Clermont
Paul Clermont is a Cutter Expert. He has been a consultant in IT strategy, governance, and management for 40 years and is a founding member of Prometheus Endeavor, an informal group of veteran consultants in that field. His clients have been primarily in the financial and manufacturing industries, as well as the US government.