ISSUE 01 • MAY 2013
Who am I? THE SELF IN SELF-HELP
Why wait? THERE’S NO TIME LIKE THE FUTURE
What’s in a face?
JUDGING A BOOK BY ITS COVER
MENTAL PLANS

3 In Defense of Risks
BY BRIAN DANIELS
How to be a positive rebel

5 Pursuit of Knowledge
BY PEGGY SILVER
Learning from history

12 Finding Your Bliss
BY STELLA JORDAN
So much depends on passion

16 Laugh It Out
BY PATRICK FINN
Using humor to heal

21 Mental Music
BY ALEX JACOBSON
Connecting thought and sound

BY ARIEL HUNTER
Memories of people and places
LETTER FROM THE EDITOR
Welcome to the premiere issue of Thinker, a psychology magazine that brings you psych news and features the way you want them. It’s friendlier than an academic journal but more scholarly than the too-friendly self-help guides fighting to fix and improve and enhance every aspect of our lives.
FEATURES
32 COVER STORY
The Self in Self-Help
BY JENNA PINCOTT
With the constant promise of self-improvement, you have to wonder: how can we help the self if we don’t know what the self is?

37 JUDGING A BOOK BY ITS COVER
What’s in a Face?
BY KATHRYN SCHULZ
Sometimes you can judge a book by its cover. Appearance predicts behavior in surprising ways... some of the time.

54 There’s No Time Like the Future
INTERVIEW BY MEAGHAN KING
In his new book, Wait: The Art and Science of Delay, Frank Partnoy explains the untapped power of procrastination.
4 M AY 2 0 1 3
We have no idea what a self is. So how can we control it? By Kathryn Schulz
The corollary is more prosaic but, regrettably, at least as true: We humans never become most of the things we pretend we will someday be. Nevertheless, last Monday, you and I and several billion other incorrigible optimists raised our glasses and toasted all the ways we will be different in 2013.

It’s easy to understand why we want to be different. We are twenty pounds overweight; we are $20,000 in debt; we can’t believe we slept with that guy; we can’t believe we didn’t. What’s harder to understand is why transforming ourselves is so difficult. Changing other people is notoriously hard; the prevailing wisdom on that one is Don’t hold your breath. But it’s not obvious why changing oneself should present any difficulty at all. And yet, demonstrably, it does.

The noted self-help guru Saint Augustine identified this problem back in the fourth century A.D. In his Confessions, he records an observation: “The mind gives an order to the body and is at once obeyed, but when it gives an order to itself, it is resisted.”

I cannot improve upon Augustine’s insight, but I can update his examples. Say you want to be skinny. You’ve signed on with Weight Watchers, taken up Zumba, read everything from Michael Pollan to French Women Don’t Get Fat, and scrupulously recorded your every workout, footstep, and calorie on your iPhone. So whence the impulsive Oreo binge? Or say you are a self-identified codependent. You know your Melody Beattie, listen to your therapist, and tell yourself every morning, quite firmly, just what you will and will not do that night. So what are you doing back in bed with that man? Or say you are a professional writer who values being conscientious, respects her editors, and passionately believes that good writing requires time. So—well, let’s drop the pretense. Why am I sitting here typing this at 4 a.m., two days past deadline?

I don’t know, but misery loves company, and such acts of auto-insubordination happen all the time. They go some way
toward explaining the popularity of the self-help movement, since clearly we need help, but they also reveal a fundamental paradox at its heart. How can I want to achieve a goal so badly that I will expend considerable time, energy, and money trying to reach it while simultaneously needing to be coaxed, bribed, tricked, and punished into a compliance that is inconsistent at best? This is where the cheerfully practical and accessible domain of self-help bumps up
against one of the thorniest problems in all of science and philosophy. In the 1,600 years since Augustine left behind selfhood for sainthood, we’ve made very little empirical progress toward understanding our own inner workings. We have, however, developed an $11 billion industry dedicated to telling us how to improve our lives. Put those two facts together and you get a vexing question: Can self-help work if we have no idea how a self works?

I know people who wouldn’t so much as walk through the self-help section of a bookstore without The Paris Review under one arm and a puzzled oh-I-thought-the-bathroom-was-over-here look on their face. I understand where they’re coming from, since some of the genre’s most persistent pitfalls—charlatanism, cheerleading, bad science, silver bullets, New Age hooha—are my own personal peanut allergies: deadly even in tiny doses. And yet I don’t share the contempt for self-help, not least because I have sought succor there myself. The first time was for writer’s block—which is, I realize, a rarefied little issue, sort of the artisanal pickle of personal problems. (I got over it: QED.) The second time was for its very nasty older brother, depression—of which more anon. In both cases, I ventured into the self-help section for the usual reason: the help. Last month, though, I went back to investigate the other half of the equation: the self.

If, like me, you have read your way through sober Stephen R. Covey (The 7 Habits of Highly Effective People) and godly Norman Vincent Peale (The Power of Positive Thinking), through exuberant Tony Robbins (Unleash the Power Within) and ridiculous Rhonda Byrne (The Secret), through John Gray who Is From Mars and Timothy Ferriss who has a four-hour everything and Deepak Chopra who at this point really is one with the universe (65 books and counting)—anyway, if you, too, have reckoned with the size and scope of the self-help movement,
you probably share my initial intuition about what it has to say about the self: lots. It turns out, though, that all that surface noise is deceptive. Underneath what appears to be umptebajillion ideas about who we are and how we work, the self-help movement has a startling paucity of theories about the self. To be precise: It has one.

Let us call it the master theory of self-help. It goes like this: Somewhere below or above or beyond the part of you that is struggling with weight loss or procrastination or whatever your particular problem might be, there is another part of you that is immune to that problem and capable of solving it for the rest of you. In other words, this master theory is fundamentally dualist. It posits, at a minimum, two selves: one that needs a kick in the ass and one that is capable of kicking.

This model of selfhood is intuitively appealing, not least because it describes an all-too-familiar experience. As I began by saying, all of us struggle to keep faith with our plans and goals, and all of us can envision better selves more readily than we can be them. Indeed, the reason we go to the self-help section in the first place is that some part of us wants to do something that some other part resists.

Of course, intuitive appeal is a poor indicator of the merits of a model; the geocentric universe is intuitively appealing, too. But even though this master theory of self-help is coarse, misleading, none too useful, and probably just plain wrong, it does capture something crucial about the experience of being human. One of the strange and possibly unique facts about our species is that we really can intervene on ourselves. Get a lab rat addicted to alcohol and you will have yourself an addicted rat. Get a teenager addicted to alcohol and eventually you might find yourself celebrating his 30th year of sobriety. It isn’t consistent, it isn’t predictable, and God knows it isn’t easy—and yet somehow, sometimes, we do
manage to change. The self really can help itself. The question is: How? Master theories—of self-help or anything else—don’t really answer questions like that. Instead, they dictate the shape an answer must take. Consider, for example, the way language works. English is a subject-verb-object language, meaning that the sentences we produce must all conform to that grammatical pattern. Within that constraint, however, the number of
sentences we can generate is infinite: “We have not yet begun to fight.” “A screaming came across the sky.” “I’m intercontinental when I eat French toast.” The master rule controls the form, but it’s completely agnostic about the content. So too with the master theory of self-help: It mandates a conflict between two parts of the self, but beyond that, it makes no particular demands and answers no particular questions. Who is divided against whom, who has the power and who is powerless, how to ensure that the “right” part of yourself winds up in charge: All this is up for grabs. Accordingly, self-help strategies distinguish themselves from one another—and pledge to solve your problems—by carving up the self at different joints: a mind and a brain, a consciousness and an unconscious, an evolved self and a primitive self—you get the picture. Such distinctions inevitably reflect different beliefs about what kind of creatures we are and often reflect different beliefs about our place in the universe. That makes them philosophically interesting—but, alas, it does not make them particularly useful.

To see why not, consider two examples. In self-help programs that draw on religious or spiritual practices, the locus of control is largely externalized; the real power belongs to God (or a supreme being, a universal consciousness—whatever you care to call it). But these programs also posit a part of the self that is receptive to or one with that external force: an internal fragment of the divine that can triumph over human weakness. This is pretty much the oldest kind of dualism in the book: your sacred soul against your mortal flesh. You can see it at work in 12-step programs, where addicts begin by admitting they are powerless to control their addiction and then make “a decision to turn our will and our lives over to the care of God.” But think about that for a moment: How do recovering addicts simultaneously exercise and abdicate their right
to make decisions? How do they choose to let a higher power do the choosing—not just once but every time temptation comes along? Twelve-step programs are reputed to be one of the more effective ways to treat addiction, yet how their followers pull off this sleight-of-self remains a mystery.

Now consider what seems, at first, like a completely different model of selfhood. “Everything you and I do, we do either out of our need to avoid pain or our desire to gain pleasure,” Tony Robbins writes in Awaken the Giant Within. Robbins’s vision of the self is Skinnerian rather than spiritual: We are conditioned, like dogs to a whistle or unluckier dogs to a kick, to certain habits of thought and action. How, then, are we supposed to change? “The most effective way,” he tells us, is to “get your brain to associate massive pain to the old belief.” Well, wait a second: Who is the “you” who gets “your brain” to rewire, and how does it do so? Through “the power of decision,” Robbins says, which “gives you the capacity to get past any excuse to change any and every part of your life in an instant.” But if we are creatures of conditioning, how did
this one part of ourselves remain independent? Where did it hide while we were being conditioned, and how will it emerge, and by what mechanism will it make decisions for the rest of us?

You see the problem. The self-help movement seeks to account for and overcome the difficulties we experience when we are trying to make a desired change—but doing so by invoking an immortal soul and a mortal sinner (or an ego and an id, a homunculus and its minion) is not much different from saying that we “are of two minds,” or “feel torn,” or for that matter that we have a devil on one shoulder and an angel on the other. These are not explanations for the self. They are metaphors for the self. And metaphors, while evocative and illuminating, do not provide concrete causal explanations. Accordingly, they are not terribly likely to generate concrete solutions. True, self-help literature is full of good advice, but good advice is not the issue; most of it has been around for centuries. The issue is how to implement it. In the words of the emphasis-happy Robbins, “Lots of people know what to do, but few people actually do what they know.”
When it comes to solving that problem—which is the problem—all self-help literature offers is a kind of metaphysical power of attorney for our putative better halves. But if you identify with the above-mentioned Oreo-eater or healthy-relationship saboteur or procrastinator, you yourself are evidence that this is a nonsolution. If giving your better half executive control by fiat could change your life, sales of self-help material would plummet overnight. It is a somewhat beautiful fact that the underlying theory of the self-help industry is contradicted by the self-help industry’s existence.

But, in the spirit of being a better person, I should not be so hard on self-help. The fact is, selves are profoundly difficult to understand. “There is nothing that we know more intimately than conscious experience,” the contemporary philosopher David Chalmers observes, “but there is nothing that is harder to explain.” Part of why we can’t explain the self is that we can’t even find it. Here’s William James, an exceptionally acute internal observer, giving it a try. “My present Me is felt with warmth and intimacy,” he wrote in Psychology: Briefer Course. “The heavy warm mass of my body is there, and the nucleus of the ‘spiritual me,’ the sense of intimate activity is there. We cannot realize our present self without simultaneously feeling one or other of these two things.” That was as close as James ever got to figuring out how to find a self: on the basis of a warm fuzzy feeling, emphasis on fuzzy. David Hume, meanwhile, couldn’t find himself at all. “When I enter most intimately into what I call myself,” he wrote, “I always
stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe any thing but the perception.” If there was an essential “I” beneath all that, Hume couldn’t find it. Ultimately, he proposed that it doesn’t exist—that we are not a sum, only parts: “nothing but a bundle or collection of different perceptions.” That idea poses a major problem for the master theory of self-help, with its internal governor, its you ex machina. Apparently, self-help has assigned the lead part in our show to an actor who is nowhere to be found.

Nor has science made much progress in locating the self, let alone explaining it. These days, most people who study the mind believe that our sense of having an “I” somehow arises from cognitive processes like the ones Hume described. That rules out Descartes’s theory that our inner essence was rooted in the pineal gland, but it still leaves us intellectual light-years from anything like a fully developed scientific theory of the self. To put the problem in perspective, consider that, three centuries after Isaac Newton pioneered the study of optics, vision scientists are just starting to understand how our brain handles the problem of recognizing faces. Those discoveries are interesting and admirable on their own merits. But it is a very long way—probably many more centuries—from understanding how the mind sees faces to understanding how the mind sees itself. In the meantime, perhaps we should start looking beyond the constraints of the master theory of self—and, indeed, beyond the self entirely—for ways to improve our lives.

The expression “self-help” comes from a book of that name, published in 1859 by the great-grandfather of the modern movement, one Samuel Smiles. (I kid you not.) These days, the phrase is so commonplace that we no longer hear the ideology implicit in it.
But there is one: We are here to help ourselves, not to get help from others nor lend it to them. Unlike his contemporary Charles Dickens, Smiles was unmoved by appalling social conditions; on the contrary,
he regarded them as a convenient whetstone on which to hone one’s character. As a corollary, he did not believe that altering the structure of society would improve anyone’s lot. “No laws, however stringent, can make the idle industrious, the thriftless provident, or the drunken sober,” he wrote. “Such reforms can only be effected by means of individual action, economy, and self-denial; by better habits, rather than by greater rights.” Smiles was Scottish, but it makes sense that his ideas received their most enthusiastic and enduring reception in the United States: a nation founded on faith in self-governance, belief in the physics-defying power of bootstraps, and the cheery but historically anomalous conviction that we all have the right to try to be happy. But this now-ubiquitous model of self-help might do an injustice to both the source of our problems and their potential solutions. We are social creatures, and we function (and dysfunction) in context. All of us know that we are notably different from one environment (Grandma’s assisted-living facility) to the next (Pyramid Club, East Village, 3 a.m.). What none of us knows is who we would be—or could be—if our context were altered in crucial ways at critical times. It’s entirely possible that socioeconomic background and current community exert a more powerful influence over us than
our ostensibly independent inner selves. In that case, the best self-improvement effort would be to better society.

The larger point is this: God knows we all need more help, but possibly we need less self. That has long been the political response to the self-help movement, and it is also, in a different sense, what Buddhists believe. Curiously, Buddhism is simultaneously a burgeoning influence on the Western self-help movement and entirely at odds with it: anti-self, and anti-help. It is anti-help insofar as it emphasizes radical self-acceptance and also insofar as it emphasizes remaining in the present. (Improvement, needless to say, requires you to focus on the future.) It is anti-self in that it treats thoughts as passing ephemera rather than as the valuable products of a distinct and consistent mind.

The journalist Josh Rothman once wrote a lovely description of what a cloud really is: not an entity, as we perceive it, but just a region of space that’s cooler than the regions around it, so that water vapor entering it condenses from the cold, then evaporates again as it drifts back out. A cloud is no more a thing, Rothman concluded, than “the pool of light a flashlight makes as you shine it around a dark room.” And the self, the Buddhists would say, is no more a thing than a region of air with thoughts passing through. •
IN HIS NEW BOOK, FRANK PARTNOY EXPLAINS THE UNTAPPED POWER OF PROCRASTINATION. INTERVIEW BY MEAGHAN KING
T H I N K E R 11
SOMETIMES LIFE SEEMS TO HAPPEN AT WARP SPEED. BUT DECISIONS SHOULD NOT.
When the financial market crashed in 2008, Frank Partnoy, a former investment banker and lawyer who is now a professor at the University of San Diego, turned his attention to the literature on decision-making. “Much recent research about decisions helps us understand what we should do or how we should do it, but it says little about when,” he says. In his book, Wait: The Art and Science of Delay, Partnoy claims that when faced with a decision, we should assess how long we have to make it, and then wait until the last possible moment to do so. With his advice on how to “manage delay,” he promises, we’ll live happier lives.

Thinker: The cover of Wait shows a dog balancing a treat on its nose. Explain the connection to your research.

Frank Partnoy: In this modern age of technology, where we all feel the crush of emails and social media and 24-hour news, we need a role model, someone who can show us how to wait, and I
think Maggie is doing a pretty good job. Dogs are capable of delaying gratification and so she’s showing us how to do it. Some dog experiments show they are capable of delaying gratification for a surprisingly long time. There was a study from earlier this year that showed that some dogs could hold a chicken-chew treat in their mouth for ten minutes while waiting for a chance at a bigger treat.
It is not surprising that the author of a book titled Wait is a self-described procrastinator. In what ways do you procrastinate? I’ll admit it. I’m actually a little bit proud of it and I think that one of the things I want to try to do in “Wait” is to persuade people who feel like they are procrastinators and have been made to feel guilty about it, to persuade those people that actually they’re doing exactly the right thing, that their approach to decision-making is precisely the right approach and that we should be embracing delay. I procrastinate in just about every possible way and always have,
since my earliest memories, going back to when I first started going to elementary school and had these arguments with my mother about making my bed. My mom would ask me to make my bed before going to school. I would say, no, because I didn’t see the point of making my bed if I was just going to sleep in it again that night. She would say, well, we have guests coming over at 6 o’clock, and they might come upstairs and look at your room. I said, I would make my bed when we know they are here. I want to see a car in the driveway. I want to hear a knock on the door. I know it will take me about one minute to make my bed, so at 5:59, if they are here, I will make my bed.

I procrastinated all through college and law school. When I went to work at Morgan Stanley, I was delighted to find that although the pace of the trading floor is frenetic and people are very fast, there were lots of incredibly successful mentors of procrastination. Now, I am an academic. As an academic, procrastination is practically a job requirement. If I were to say I would be submitting an academic paper by September 1, and I submitted it in August, people would question my character.
It has certainly been drilled into us that procrastination is a bad thing. Yet you argue that we should embrace it. Why? Historically, for human beings, procrastination has not been regarded as a bad thing. The Greeks and Romans generally regarded procrastination very highly. The wisest leaders embraced procrastination and would basically sit around and think and not do anything unless they absolutely had to. The idea that procrastination is bad really started in the Puritanical era, with Jonathan Edwards’s sermon against procrastination and then the American embrace of “a stitch in time saves nine” and a work ethic that required immediate and diligent action.

We all feel this tremendous pressure to speed up our lives, to go faster and faster. Some of it started in the 1970s, when a kind of anti-procrastination industry first developed to tell us to speed up, to get things done now. Peter Drucker and other management consultants advocated speed in business, and then, particularly over the last few years, came email and social media, Twitter. We’ve all gotten faster and faster. We see it every day: we’re getting news and snap-reacting. With the Supreme Court’s decision in the health care case, we saw the news media react and get the case wrong in the first few seconds. All of us are feeling this pressure, and I hope that people will, particularly as the election approaches, take a step back and think.

But if you look at recent studies, managing delay is an important tool for human beings. People are more successful and happier when they manage delay. Procrastination is just a universal state of being for humans. We will always have more things to do than we can possibly do, so we will always be imposing some sort of unwarranted delay on some tasks. The question is not whether we are procrastinating, it is whether we are procrastinating well.

When does it cross from good to bad? Some scientists have argued that there are two kinds of procrastination: active procrastination and passive procrastination. Active procrastination means you realize that you are unduly delaying mowing the lawn or cleaning your closet, but you are doing something that is more valuable instead. Passive procrastination is just sitting around on your sofa not doing anything. That clearly is a problem. I think one of the keys to understanding how we procrastinate, and when it’s good and when it’s bad, when it’s what psychologists call active procrastination versus passive procrastination, is the management of time. It’s thinking explicitly about how long you have. The classic example is a student in college who waits until the last minute and then crams for an exam or writes the paper at the last minute. Well, you have to understand how quickly you can study for an exam, how quickly you can write the paper. Take in information during the semester and then wait until the last minute. But if you wait until two days before, you’ve waited too long. So it’s a hugely important question. You have to know your own capacity for delay.
What made you want to take a closer look at the timing of decisions? I interviewed a number of former senior executives at Lehman Brothers and discovered a remarkable story. Lehman Brothers had arranged for a decision-making class in the fall of 2005 for its senior executives. It brought four dozen executives to the Palace Hotel on Madison Avenue and brought in leading decision researchers, including Max Bazerman from Harvard and Mahzarin Banaji, a well-known psychologist. For the capstone lecture, they brought in Malcolm Gladwell, who had just published Blink, a book that speaks to the benefits of making instantaneous decisions and that Gladwell sums up as “a book about those first two seconds.” Lehman’s president Joe Gregory embraced this notion of going with your gut and deciding quickly, and he passed copies of Blink out on the trading floor. The executives took this class and then hurriedly marched back to their headquarters and proceeded to make the worst snap decisions in the history of financial markets. I wanted to explore what was wrong with that lesson and to create something that would be the course that Wall Street should have taken and hopefully will take.
T H I N K E R 13
TH PR AR
mories going back to when I first starting going to elementary school and had these arguments with my mother about making my bed. My mom would ask me to make my bed before going to school. I would say, no, because I didn’t see the point of making my bed if I was just going to sleep in it again that night. She would say, well, we have guests coming over at 6 o’clock, and they might come upstairs and look at your room. I said, I would make my bed when we know they are here. I want to see a car in the driveway. I want to hear a knock on the door. I know it will take me about one minute to make my bed so at 5:59, if they are here, I will make my bed. I procrastinated all through college and law school. When I went to work at Morgan Stanley, I was delighted to find that although the pace of the trading floor is frenetic and people are very fast, there were lots of incredibly successful mentors of procrastination. Now, I am an academic. As an academic, procrastination is practically a job requirement. If I were to say I would be submitting an academic paper by September 1, and I submitted it in August, people would question my character.
It has certainly been drilled into us that procrastination is a bad thing. Yet, you argue that we should embrace it. Why? Historically, for human beings, procrastination has not been regarded as a bad thing. The Greeks and Romans generally regarded procrastination very highly. The wisest leaders embraced procrastination and would basically sit around and think and not do anything unless they absolutely had to. The idea that procrastination is bad really started in the Puritanical era with Jonathan Edwards’s sermon against procrastination and then the American embrace of “a stitch in time saves nine,” and this sort of work ethic that required immediate and diligent action. We all feel this tremendous pressure at speeding up our lives, making us feel faster and faster. Some of it started to happen in the 1970s when a kind of anti-procrastination industry first developed to tell us to speed up, to get things done now. Peter Drucker and other management consultants advocated speed in business and then particularly over the last few years email and social media, Twitter.
But if you look at recent studies, managing delay is an important tool for human beings. People are more successful and happier when they manage delay. Procrastination is just a universal state of being for humans. We will always have more things to do than we can possibly do, so we will always be imposing some sort of unwarranted delay on some tasks. The question is not whether we are procrastinating, it is whether we are procrastinating well.
When does it cross from good to bad? Some scientists have argued that there are two kinds of procrastination: active procrastination and passive procrastination. Active procrastination means you realize that you are unduly delaying mowing the lawn or cleaning your closet, but you are doing something that is more valuable instead. Passive procrastination is just sitting around on your sofa not doing anything. That clearly is a proble. I think one of the keys to understanding how we procrastinate and when it’s good and when it’s bad, when it’s what psychologists call active procrastination versus passive procrastination, is the management of time, right. It’s thinking explicitly about how long you have. So for example the classic example is a student in college who’s waiting until the last minute and then crams for an exam or writes the paper at the last minute. Well, you have to understand how quickly you can study for an exam, how quickly you can write the paper. Take in information during the semester and then wait until the last minute. But if you wait until two days before, you’ve waited too long. So it’s a hugely important question. You have to know your own capacity for delay.
What made you want to take a closer look at the timing of decisions? I interviewed a number of former senior executives at Lehman Brothers and discovered a remarkable story. Lehman Brothers had arranged for a decision-making class in the fall of 2005 for its senior executives. It brought four dozen executives to the Palace Hotel on Madison Avenue and brought in leading decision researchers, including Max Bazerman from Harvard and Mahzarin
HE QUESTION IS NOT WHETHER WE ARE ROCRASTINATING, IT IS WHETHER WE RE PROCRASTINATING WELL. We’ve all gotten faster and faster as the selection cycle approaches. We can see every day. We’re getting news and snapreacting. The Supreme Court’s decision in the health care case, we saw that the news media reacted and got the case wrong in the first few seconds. All of us are feeling this pressure and I hope that people will, particularly as the election approaches, take a step back and think.
14 MAY 2013
Banaji, a well-known psychologist. For the capstone lecture, they brought in Malcolm Gladwell, who had just published Blink, a book that speaks to the benefits of making instantaneous decisions and that Gladwell sums up as “a book about those first two seconds.” Lehman’s president Joe Gregory embraced this notion of going with your gut and deciding quickly, and he passed copies of Blink out on the trading floor.
The executives took this class and then hurriedly marched back to their headquarters and proceeded to make the worst snap decisions in the history of financial markets. I wanted to explore what was wrong with that lesson and to create something that would be the course that Wall Street should have taken and hopefully will take. And I just thought there’s something wrong with the thinking on Wall Street. There’s something wrong with the course that Wall Street has designed for itself that says we should go with our snap judgment. I think one of the lessons that we learned from the financial crisis is that people have to think about the longer term. They have to think about much longer than two seconds.
You looked beyond business to decision-making in sports, comedy, medicine, military strategy, even dating. What did you find? I was so surprised to find that this two-step process that I learned from arguing with my mother about making my bed is actually a process that is used by successful decision makers in all aspects of life and in all sorts of time frames. It is used by professional athletes at the level of milliseconds. It is used by the military at the level of minutes. It is used by professional dating services at the level of about an hour. Step one is: what is the longest amount of time I can take before doing this? What time world am I living in? Step two is: delay the response or the decision until the very last possible moment. If it is a year, wait 364 days. If it’s an hour, wait 59 minutes.

For example, a professional tennis player has about 500 milliseconds to return a serve. A tennis court is 78 feet baseline-to-baseline, and professional tennis serves come in at well over 100 miles per hour. Most of us would say that a professional tennis player is better than an amateur because they are so fast. But, in fact, what I found and what the studies of superfast athletes show is that they are better because they are slow. They are able to perfect their stroke and response to free up as much time as possible between the actual service of the ball and the last possible millisecond when they have to return it.

The international dating service It’s Just Lunch advocates that clients not look at photos, because photos lead to snap reactions that take just milliseconds. It asks that they consciously not make judgments about a person when they first meet. Instead, they tell clients to go to lunch, wait until the last possible moment, and then at the end of lunch answer just one question: Would I like to go out on a second date with this person? Just as a few extra milliseconds help a tennis player, someone on a date will make a better decision if they free up extra minutes to observe and process information.
What else surprised you? That our decision-making doesn’t just happen in our brain, but also in our brain stem and in the vagus nerve, the tenth cranial nerve, which comes down from the brain stem, winds around the various organs in our body, most importantly the heart, and varies our heart rate. •
WHAT’S IN A FACE?

Sometimes you can judge a book by its cover. Appearance predicts behavior in surprising ways... some of the time.

WRITTEN BY JENNA PINCOTT
Masked Beauty: Surgeon Dr. Stephen R. Marquardt developed the Phi Mask, composed of line segments and shapes that relate to each other through the Golden Ratio. Faces that match Marquardt’s facial mask are subconsciously recognized as more “human” and, therefore, more attractive.
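The Golden Ratio the Phi Mask is built on is easy to compute and compare against. A minimal sketch (the facial measurements are invented for illustration; a real mask comparison would score many landmark ratios, not one):

```python
import math

# The golden ratio: phi = (1 + sqrt(5)) / 2
PHI = (1 + math.sqrt(5)) / 2

def deviation_from_phi(longer, shorter):
    """Fractional distance of a measured proportion from the golden
    ratio; 0.0 would be a perfect match to the mask's ideal."""
    return abs(longer / shorter - PHI) / PHI

print(round(PHI, 3))                           # 1.618
# Hypothetical face: 200 mm long, 127 mm wide
print(round(deviation_from_phi(200, 127), 3))  # 0.027
```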
Several years ago, a woman named Brooke White appeared on the reality TV show American Idol. White was 24 years old, blond, and strikingly pretty. When she sang her song, “Like a Star,” she struck a familiar chord among some viewers. White said nothing about her religion, but Mormons were certain that she was one of their own.
“She has the Mormon Glow,” one blogger wrote, referring to the belief that the faithful radiate the Holy Spirit. White mentioned that she had never drunk a cup of coffee or watched an R-rated movie—signs of a Mormon-like squeaky-clean lifestyle. But the “glow” clinched it, and it turned out that her fans were right. “I didn’t know I was setting off the Mormon radar,” White remarked later in an interview with The Arizona Republic.

Soon after, psychologists Nalini Ambady, then at Tufts University, and Nicholas Rule, at the University of Toronto, set out to test the Mormon glow. One way to do this is to see whether even non-Mormons can detect it. The psychologists began their experiment by cropping head shots of Mormons and non-Mormons and asking undergraduate volunteers whether they could pick out the Mormons. They certainly could—and in just a glance. While random guessing would yield 50 percent accuracy, as in a coin toss, the volunteers accurately identified Mormon men and women 60 percent of the time. (Mormons themselves were only slightly more accurate.) This means that “Mordar” isn’t foolproof, but it’s statistically significant—about as accurate as the ability to tell if a face looks envious or anxious.

“Thin-slicing” is the term that Ambady and her colleague Robert Rosenthal coined in 1992 to describe the ability to infer something about a person’s personality, character, or other traits after a very brief exposure. Thin-slicing relies on a brain network that includes the fusiform gyrus, which perceives faces, and the amygdala, which filters that information for anything that might be useful or threatening to survival.

To determine what exactly triggers Mordar, Ambady and Rule cropped photos beyond recognition. Some faces had only eyes or hair. Could judges identify Mormons from these features alone? Fail. Others had
only noses or mouths. Nothing. Other faces had no features or even an outer shape. Just a patch of flesh, basically. Success. “What the judges were primarily picking up,” Rule explains, “are cues of health in the skin.” The tone and texture of facial skin reflect overall immune function. “We have a natural system set up to assess others’ health for mate selection and disease avoidance,” Rule says. “This can be co-opted for social purposes as well — such as detecting religiosity.” Mormons don’t drink or smoke. They enjoy community support, which relieves stress. They live 10 years longer than the average American. Holy Spirit aside, their skin may glow because it’s healthier. While the judges likely knew that Mormons are clean-living, they weren’t consciously aware when categorizing faces
that they were associating religious purity with good skin. It was a gut feeling. Over evolutionary time, the ability to quickly extract information from faces has given us an edge in predicting character and behavior. It helps us to discern who’s sick and whom to trust, who’s flirtworthy, and who might blow up at a moment’s notice. To get a sense of others’ religiosity, sexual orientation, promiscuity, aggressiveness, competence, intelligence, and even trustworthiness, you might think that you should focus on how they act, not how they look. But then you’d neglect your swiftest insights.
Snap judgments about faces are not fail-safe, but they are far too accurate to ignore, especially in moments when danger may loom. And the consistent above-chance rate (often 60 percent or higher) at which we correctly extrapolate character traits from facial features taps into core questions of identity: Are appearance and behavior linked by underlying biological processes such as hormones—or do beautiful visages and aggressive miens elicit lifelong treatment that reinforces an individual’s behavior? It is the ultimate chicken-and-egg question, and it plays out on every face you meet.

Attractiveness and Personality

Our assessment of attractiveness is automatic, and strongly influences how we judge the person on a range of other traits, including personality. In a surprising recent experiment at the University of Pennsylvania, neuroscientists Ingrid Olson and Christy Marshuetz asked volunteers to judge pre-rated faces, some beautiful and others homely. The judges claimed that they didn’t see anything; the faces flashed by too quickly. Yet when coaxed to rate the attractiveness of the faces that they thought they hadn’t seen, they were astonishingly accurate. Each face was
exposed for 13 milliseconds, well below the threshold of conscious awareness. That’s how quickly we judge looks. Beauty and health are tightly linked. The closer a face is to the symmetrical proportions of Gwyneth Paltrow or Zac Efron, and to the average face in a standard population, the more it advertises developmental stability, meaning that pathogens or genetic mutations have not adversely affected its owner.

Good looks also confer a well-documented “halo effect”: a beautiful man or woman is consistently evaluated in a positive light. Good-looking people are assumed to be smarter, although there is no correlation between intelligence and appearance above a median level of attractiveness. Appearance also interacts with personality in complicated ways. When volunteers were asked to evaluate faces in a UK study, the most attractive individuals received the highest ratings for extraversion and agreeableness. Yet more than the halo effect is at work, because the owners of those good-looking faces also rated themselves higher on these traits. More impressively, when judges looked at digital composites (averages of faces) made from people
who scored at the extremes for extraversion and agreeableness (and, for women, openness), they gave those faces the highest attractiveness ratings. The judges didn’t know that the composites were made from the faces of exceptionally outgoing and easygoing people. They just thought those faces were better-looking than average. (For men only, facial composites generated from the most conscientious and emotionally stable subjects were also rated as more attractive than those made from subjects with the lowest scores for these attributes.) Clearly, the stereotype that “what is beautiful is good” contains at least a kernel of truth.

Here, then, is the big chicken-or-egg puzzle that runs throughout face perception research. Do the biological blessings behind good looks give rise to a sparkling personality, or do attractive people exhibit the socially desirable traits of extraversion and agreeableness because society treats swans better than ugly ducklings? Or do individuals with attractive personalities develop more attractive faces over time? Whether nature or nurture, the relationship between beauty and “positive” personality traits is real—and readily discernible.

Sex hormones are one clear link between appearance and personality. Testosterone
Golden Glory: During the European Renaissance, artists and architects used an equation known as the “golden ratio” to map out their masterpieces. Centuries later, scientists adopted this mathematical formula — that a ratio of about 1.6 is considered most aesthetically pleasing — to help explain why some people are considered beautiful. The ratios of facial features align in a series of golden spirals, logarithmic spirals based on the golden ratio that show up throughout nature. The more closely features match this classical framework, the more attractive the face is perceived to be.
and estrogen influence facial development as well as behavior. High testosterone shows itself in strong jawbones, darker coloring, and hollower cheekbones. High estrogen reveals itself in smooth skin, a small chin, sparse facial hair, arched eyebrows, and plump lips. We make numerous assumptions about people with high-hormone profiles that conform to gender norms. First, that they’re hot: in a lineup, the high-estrogen Jessica Alba and Beyoncé types receive the highest attractiveness ratings from both genders. Their pretty faces predictably get top ratings for social dominance (high status). As for men, high-testosterone faces are especially desired by women who are ovulating, but women may have a default preference for men with a mix of masculine and feminine features: dominant and cooperative.

At the University of St. Andrews, volunteers of both genders could tell, with above-chance accuracy, whether people were promiscuous (open to one-night stands and sex without love) just by looking at photos of their faces. Among women, high-estrogen feminine faces were accurately rated as the most promiscuous—and the most beautiful. Among men, the Lothario face (a composite of the most promiscuous males) had high-testosterone features: slightly smaller eyes, larger noses, and broader cheekbones. Women accurately judged this face as belonging to a playboy and downgraded it in favor of men who looked—and actually were—more committed and monogamous.

Do highly feminine-looking women and masculine-looking men have hormone profiles that give rise to stronger sex drives, or do their looks simply lead to more sexual opportunities? The likely answer is both: nature and nurture are inseparable. Either way, studies show that the signs are legible. The next time you’re perusing photos on an online dating site and get a suspicious feeling about a person’s romantic trustworthiness, you might listen to that instinct.
Aggression and Criminality

Several years ago, Cheryl McCormick, a neuroscientist at Brock University in Ontario, was listening to a radio program about the effects of testosterone on male skull size during puberty. Under the influence of the hormone, men’s facial width increases in relation to height (the width-height ratio, or WHR), independent of body size. WHR is the distance measured from cheekbone to cheekbone versus the distance between the top of the lip and midbrow. A high WHR is 1.9 or above. Bill Clinton’s is 2.07; John Edwards’s, 2.38; and Richard Nixon’s, 2.02; compared to John Lennon’s, 1.63, and George Washington’s, 1.65. (Now try to resist eyeballing every man’s WHR.)

Intrigued, McCormick and her colleagues, psychologist Cathy Mondlach and graduate student John Carre, wanted to know if a high WHR might be a marker for aggression, which is related to high testosterone levels. They asked volunteers to guess from photos of male Caucasian faces which ones were the fighting type. A surprise: not only could the judges in her study accurately detect which men were aggressive based on a headshot alone, they could do so after seeing the face for only 39 milliseconds. These men had the highest WHRs. Unknown to the judges, they also scored highest on an aggression test that measures how often a player steals points from an opponent, to no benefit for the player himself. (Women with high WHRs are perceived as more aggressive than average but aren’t, presumably because the skull-shaping pubescent testosterone surge affects males only.)

Male face width is also associated with a propensity to deceive. A business school professor now at UC Riverside, Michael Haselhuhn, and his colleague Elaine Wong discovered that men with high WHRs were three times likelier than their narrower-faced peers to lie to increase their financial gain in hypothetical scenarios: a lottery drawing and a buy-sell negotiation conducted over email. While most men with high WHRs—60 percent—did not break the rules, this finding is still startling.

Don’t dismiss your instincts, especially if your safety and well-being are at risk. Guesses about trustworthiness based on headshots tend to correlate with those individuals’ self-reports and judgments by their acquaintances. In one experiment, people who were perceived as dishonest were likelier to mislead their peers than were those whose faces were thought to look honest. In studies involving the prisoner’s dilemma game, participants, going by facial photos alone, could accurately identify people who were likely to deceive, and also remembered the faces of prospective cheaters better than those of cooperators. Consciously or not, you may be more cautious and attentive around potential cheaters and other offenders.

At Cornell University, psychologist Jeffrey Valla and his colleagues set out to test just how readily people can spot criminals based on facial appearance alone. They prepared close-cropped, expressionless facial photos of clean-shaven Caucasian men in their twenties and asked volunteers to identify the murderers, rapists, thieves, forgers, drug dealers, and so on. Men and women alike could distinguish convicts from noncriminals with consistent above-chance accuracy, but, interestingly, not violent offenders from nonviolent ones.
While one might expect violent criminals to be more discernible, the fact is that people who commit one type of crime are more likely to commit others. The boundary between
violent and nonviolent offenders can be delineated in the lab, but not as readily in the real world. It is adaptive to be sensitive to any indices of potential criminality. Criminal identification (“facial profiling”) isn’t hugely reliable, in part because there are too many false alarms. And yet, one criminal type consistently cropped up.

The Faces of Dominance

High testosterone has two faces. One is the devious, aggressive cheater. The other is a strong and capable leader. Both of these stereotypes play out in research. Among business leaders, powerful-looking features predominate. When Haselhuhn and Wong analyzed Fortune 500 company executives, they found that firms led by men with high WHRs had superior financial results compared with firms led by men with lower WHRs. Similarly, the undergrads in Ambady and Rule’s experiments guessed with above-chance accuracy which faces belonged to Fortune 1000 CEOs and managing partners of the most profitable law firms, based solely on how dominant those faces looked.

While masculine-faced leaders may appear more competent, it’s unclear whether they really are. The most profitable firms may simply hire dominant-looking people to be their public face. Equally possible is the self-fulfilling prophecy: parents and teachers may groom these men from an early age to be leaders, so they see themselves as such, as do others. Nevertheless, it’s a statement about corporate dynamics that an appearance of dominance, not warmth, also predicted which faces belonged to the most successful female CEOs. Will Yahoo!’s sweet-looking Marissa Mayer, dubbed “The Hottest CEO Ever,” crack the high-WHR ceiling?

For women, competence can also be conflated with comeliness. Shawn Rosenberg, a political scientist at UC Irvine, presented photos of the same woman appearing in two faux campaign photos. In one, she’s professionally made up, and in the other she looks dowdy.
Whether she ran as a Democrat or Republican, she won about 56 percent of the vote—a serious
Proportional Profiling: Measurements of proportion and symmetry determine beauty, as well as other factors. For example, the ratio between the distance from cheekbone to cheekbone and the distance from the top of the lip to midbrow is indicative of aggression in men.
margin—when portrayed by the flattering photo. In a separate mock election study, Rosenberg asked judges to rate headshots of candidates in terms of how competent they appeared. The traits associated with an edge: a strong curve of the upper eyelid, thinner lips, thinner eyebrows, a broad face, and, oddly, a widow’s peak. “This is politics. Perception is reality,” goes the old adage, and so it goes: a recent Princeton University study found that inferences of competence based solely on facial appearance predicted about 70 percent—a staggering number—of the outcomes of U.S. Senate races in 2004. Asking the judges to first think about their decision reduced their accuracy. We vote with our gut.

That said, the faces of elected leaders may depend on the situation. In a study at the University of Aberdeen, people who were primed to choose a president in a wartime context preferred a morphed George W. Bush face, judging it to be more masculine; in a peacetime context they preferred a morphed John Kerry face, judging it to be more intelligent. In another experiment, volunteers guessed—with about 60 percent accuracy—which faces belonged to Republicans, perceived as more dominant, and which were Democrats, seen as warmer. Generally speaking, our biases about what competent leaders should look like appear to be hard-wired, or at least learned early. In a Swiss experiment, schoolchildren were presented with pairs of headshots of actual political candidates and asked which of the two they would choose to be the “captain of their boat.” Over 70 percent of the kids chose the candidates who would later win the election. They looked like leaders.

The Dangers of Stereotyping

What does all this mean for those who don’t look authoritative, such as baby-faced men? Their pudgy, youthful faces are the yin to the masculine-faced yang. They’re perceived as soft, suggestible, incompetent, and honest to a fault. Leslie Zebrowitz, a social psychologist at Brandeis, and her colleagues had a
theory that baby-faced boys would strive to confound these expectations. Drawing on archival data that included photos of men over time and information about their socioeconomic status, race, grades, and IQ scores, she discovered that the hunch is true—yet manifests in complex ways. A baby-faced boy who was white, middle-class, or had a high IQ was apt to compensate for the perception of a childlike intellect by getting good grades. But if he came from a poor family and scored poorly, he was likelier to compensate with assertive, hostile—even criminal—behavior, especially if he was also short. Would history be the same if George “Baby Face” Nelson, the toughest of Depression-era mobsters, hadn’t been born in the slums of Southside Chicago? We’ll never know for sure. Nature and nurture—the chicken and the egg—cannot be unscrambled. The face we’re born with reflects personality and is molded by experience, which in turn reflects how we are perceived.

Another pitfall of face perception presents itself in Zebrowitz’s research on deception. She and her colleagues asked volunteers to rate people’s trustworthiness based on headshots taken throughout their lives and compared the ratings of each face with its owner’s scores on personality tests. While they found that men’s trustworthiness could be predicted from an early age, women’s could not. Women who were less honest in their youth were judged as more honest-looking in adulthood, even if they weren’t actually more trustworthy. These ladies could improve their appearance with cosmetics and hairstyle, which—thanks to the halo effect—made them appear more honest. “Dishonest women may be more likely to look honest than dishonest men because [women] have less power to achieve their goals through other means,” the researchers suggest.

Our biases can come back to bite us. By overvaluing attractiveness, we may elect bad leaders and hire the wrong employees, believing them more honest, capable, and intelligent than they are. By conflating competence with dominance, we may discourage nurturing or egalitarian leadership styles. By pigeonholing all gender-atypical appearances as gay, we limit diversity of expression. To categorize all wide-faced men as aggressive is not only inaccurate, it distorts our perception of other facets of their personalities. We slip into superficiality.

Snap judgments about faces arise in an ancient region of the brain that specializes in self-protection. Apply them too broadly, and we risk turning a survival mechanism into knee-jerk prejudice. Compassion, fairness, and rational decision-making happen only when the slower, more recently evolved prefrontal cortex weighs in.

A lesson comes from one of our first American idols, Abraham Lincoln. The Great Emancipator’s face is a battleground for our biases. With a WHR of 1.9, he had the biomarker of dominance and authority yet still had a reputation for integrity. But Honest Abe enjoyed no halo effect. From an early age, his facial skin was yellowish and creased, revealing his poor health. •