Chief Emotional Officer

“Think Again,” by Sydney Finkelstein, Jo Whitehead and Andrew Campbell (Harvard Business Press, 2009)
“Think Again” explores how we make decisions and how we can make them better.
Organizational culture reveres and rewards rationality. A successful leader’s stock in trade is rigorous analysis of data and information, careful consideration of multiple options and clear-eyed strategic planning, leavened with a touch of the maverick’s intuition. Well, that is a widely held view anyway. The truth, however, is nearly the opposite, according to Sydney Finkelstein of the Tuck School of Business, and Jo Whitehead and Andrew Campbell, both of the Strategic Management Centre at Ashridge Business School. As they explain in their book, “Think Again” (Harvard Business Press, 2009), when seeking a solution amidst incomplete data and multiple uncertainties, the human mind’s initial gambits are predominantly unconscious, intuitive and emotional. In other words, in “Star Trek” terms, we are all more Captain Kirk than Mr. Spock.

How the Brain Makes Decisions

Evolution, say the authors, has given us the ability to quickly assess and react to the complex world around us. We continually, unconsciously, seek to recognize complex patterns — visual, structural and sequential — in our experience. We then file the patterns
away so we can take action based on our past experience the next time we encounter situations that fit those patterns. The brain also associates each of these stored patterns with an emotion. This “emotional tagging,” as it has been called, operates as a fast-access filing system in our thought process, further guiding us to the kind of rapid response that rational analysis alone could not trigger.

Emotions are advantageous in that they often help us get to the right answer effortlessly and efficiently. However, because pattern recognition requires the brain to fill in blanks and guess when there is incomplete information, the brain can make connections between past and current situations that do not exist. The authors call these “misleading experiences.” And the strong emotional tags linked to past experiences can drive us to inaccurately size up current circumstances, what the authors call making “misleading prejudgments.”

Unfortunately, once a misread pattern or a misleading emotional tag has set a bad plan in motion, human nature is not likely to lead us to review or reverse our course. Research shows — and the authors’ field work reinforces — that, contrary to what we believe about ourselves, human beings do not normally do much conscious analysis, such as identifying and comparing options or challenging assumptions and initial assessments. In the interest of quick, decisive action, our brain has evolved to arrive at a plan and not actively search for disconfirming information. To the contrary, it seeks input that will confirm its original course and reevaluates only if the plan proves flawed — and sometimes not even then.

Such misreadings occur all the time in the business world. In 1994, William D. Smithburg, as CEO of Quaker Oats, spearheaded the acquisition of Snapple Beverage, believing that its declining brand could be revived with the same marketing tactics that had worked with the Gatorade Company, which Quaker Oats had acquired under Smithburg’s leadership a decade earlier. Smithburg failed to recognize that, beyond superficial similarities, Snapple was not like Gatorade. It was an offbeat brand, driven by an entrepreneurial sensibility, grass-roots marketing and a cultlike affinity with its customers and distribution channels. In the end, the Snapple business lost $75 million in 1995, suffering a 5 percent decrease in sales.
More Red Flags

The authors identify two additional signs that a disastrous decision is likely to be made: inappropriate self-interest and inappropriate attachments. The notion that self-interest can work for the common good forms the basis of much of economic theory and regulates society at large. But self-interest and personal attachments to people, places or things are counterproductive when they do not align with the interests of the organization. Inappropriate self-interest — financial gain, ego, personal enmity, personal preference and even nostalgia — was found to be a factor in two-thirds of the cases of failed decisions studied by the authors.
Designing Safeguards

Because red-flag states of mind for the most part reside in the unconscious, Finkelstein, Whitehead and Campbell conclude that relying on self-awareness or self-regulation is an unreliable strategy for making sound decisions. Instead, they recommend that individuals and organizations counterbalance human biases with planned, systemic interventions that they call “safeguards.” These safeguards support decision makers by providing them with more experience, data, analysis and formal venues for debate and challenge.

While these kinds of safeguards are likely to minimize the chances of disastrous decisions being made, it is debatable whether they will ultimately lead to better decisions. After all, any process designed to smooth out lows is also likely to dampen highs, a kind of organizational Valium. The authors concede that what they call “toxic levels” of safeguards can be paralyzing and have to be carefully administered.

Ultimately, however, the question is whether we want our leaders to be stewards or avatars. If the latter, we may just have to ride the roller coaster, red flags and all. When you think about it, there was probably a good reason that Kirk, and not Spock, captained the Enterprise.
the korn/ferry institute