The chief executive of a large multinational was trying to decide whether to undertake an enormous merger—one that would not only change the direction of his company but also transform its whole industry. He had gathered his top team for a final discussion. The most vocal proponent of the deal—the executive in charge of the company's largest division—extolled its purported strategic advantages, perhaps not coincidentally: if the deal went through, he would run an even larger division and could position himself as the CEO's undisputed successor. The CFO, by contrast, argued that the underlying forecasts were highly uncertain and that the merger's strategic rationale wasn't financially convincing. Other members of the top team said very little. Given more time to decide and less worry that news of the deal might leak, the CEO doubtless would have requested additional analysis and opinion. Time, however, was tight, and in the end the CEO sided with the division head, a longtime protégé, and proposed the deal to his board, which approved it. The result was a massive destruction of value when the promised strategic synergies failed to materialize.

Does this composite of several real-life examples sound familiar? These circumstances certainly were not ideal for basing a strategic decision on objective data and sound business judgment. Despite the enormous resources that corporations devote to strategic planning and other decision-making processes, CEOs must often make judgments they cannot reduce to indisputable financial calculations. Such big decisions frequently depend, in no small part, on the CEO's trust in the people making the proposals.

Strategic decisions are never simple to make, and they sometimes go wrong because of human shortcomings. Behavioral economics teaches us that a host of universal human biases, such as overoptimism about the likelihood of success, can affect strategic decisions. Such decisions are also vulnerable to what economists call the "principal-agent problem": when the incentives of certain employees are misaligned with the interests of their companies, those employees tend to look out for themselves, sometimes in deceptive ways.

Most companies know about these pitfalls. Yet few realize that principal-agent problems often compound cognitive imperfections to form intertwined and harmful patterns of distortion and deception throughout the organization. Two distinct approaches can help companies come to grips with these patterns. First, managers can become more aware of how biases can affect their own decision making and then endeavor to counter those biases. Second, companies can better avoid distortions and deceptions by reviewing the way they make decisions and embedding safeguards into their formal decision-making processes and corporate culture.
