A Nation of Strangers

by Gerald Zaltman, Harvard Business School and Olson Zaltman Associates
Today’s world of information consumption is puzzling. We are taught from a young age to take actions and base opinions on facts, and the internet makes it easier than ever to evaluate truth claims on most topics. At the same time, we see a steady decay in the use of facts. People seem increasingly inclined to let opinion determine what constitutes fact rather than the reverse. When opinion collides with fact, especially on social policies, opinion tends to prevail. More and more people see one another as ignorant of the truth on topics like climate change, immigration, educational policies, free speech, evolution, gun control, inequities in the distribution of wealth, the use of vaccines, trade policy, and much, much more. How others think has become foreign to us, and as a result, we are becoming a nation of strangers, and not just strangers, but opponents. Each side sees the other as embracing opinion over facts—a division I call the “fake news social divide.” This divide, which runs through family, friends, and coworkers, is worrisome to me.

This essay is a call for constructive sharing of conflicting thoughts. It is meant for a very broad audience. It seeks to restore and even enhance the pivotal role that legitimate differences in opinion and interpretations of fact play in a healthy democracy. The essay is not about changing minds, resolving conflicts, or negotiating compromises. There are many other helpful guides for those activities. However, none of those activities can occur without first having constructive, respectful sharing of thoughts with those with whom we disagree.

As a nation, we need to find ways of handling legitimate, conflicting judgments about facts to spur productive discussion on social-policy divisions rather than using conflicting judgments as goads to deepen and widen those divisions, as we seem to be doing now.
The metaphor of the American melting pot doesn’t—or shouldn’t—mean erasure of differences or subordination of all into an undifferentiated whole but rather the creation of a unique national stew in which each contributor adds a flavor or texture that is irreplaceable and without which
the stew would be that much the poorer. We are at risk of forgetting the virtue of many flavors and textures as it applies to diversity in ideas.

Part I of this essay provides background about how minds handle ideas, regardless of the position taken. The goal is to create more openness to hearing alternative positions. This is not the same thing as trying to persuade people to adopt a different position. Gaining a better understanding of and respect for what people of opposed views think is very difficult and requires patience. Resistance to this process is natural and is to be anticipated and respected. Our unexamined, preconceived notions and our resistance to hearing ideas we disagree with make discussion hard, but the results can be worth the effort.

Part II presents sample approaches to facilitate constructive interactions among conflicting parties. These approaches, called prompts, are simply suggestions for making a conversation easier. They may be adapted according to the issue, participants, and settings involved. A major assumption is that goodwill exists among the partisans involved. I know that assumption holds among the friends, family, and coworkers with whom I differ on one or another social issue. I believe this is also true for the great majority of our nation’s citizens.

Part I: How Minds Operate

The need to know, to understand the world around us, is a core human need. Our happiness and survival depend on it. This need drives learning, imagination, problem solving, and adapting to change. It determines how we allocate attention, time, and other scarce information-processing resources among competing demands. The need to know operates unconsciously, though it occasionally surfaces as a conscious feeling. For example, we become aware of feeling certain or uncertain, or confident or uneasy about something. We say we get cold feet or have second thoughts about a decision.
Or we experience an aha moment or a gut feeling in which we suddenly believe beyond doubt that an idea or action is correct. Such experiences may be intense, and yet our awareness of our need to know remains slight.
Our sense of what we know, on the other hand, is outsized. Our minds play up our sense of our own knowledge while downplaying what we don’t know. This leads to a failure to critically question our own opinions and beliefs to see if they have factual support. I’ve called this “the fallacy of expertise.” It usually partners with confirmation bias—the tendency to seek supporting evidence for a favored position while avoiding evidence against it.

There is benefit, of course, to having the need to know operate unconsciously. By operating below awareness, it saves us from “paralysis by analysis” in everyday matters. Imagine if you had to stop and think about which news channel to watch every time you turned on the TV, and further, if you had to check each statement made on the news program, and then the sources you consulted when checking those statements, and then the sources you used to vet the sources you consulted, and so on. Our conscious mind doesn’t have sufficient bandwidth to always be asking and answering the question, “How certain am I that I know what I need to know?” Instead, this question is addressed, if at all, in the unconscious mind, where it escapes scrutiny.

We can see how this plays out in today’s world through the following example. Consider an outlandish, impromptu claim made by a candidate at a rally. Imagine the claim receives immediate, rousing applause. Let’s say the audience believes from the outset that the candidate is a truthful person with their best interests at heart. These beliefs automatically confer validity upon a questionable claim. For supporters, a preexisting halo of faith and approval surrounds the candidate, so they give him or her the benefit of any doubt. Pausing, even briefly, to examine the validity of the new claim is unnecessary. Furthermore, others in the crowd are signaling their validation of the claim with cheers. People don’t stop to wonder if this manifestation of the wisdom of crowds is fallible.
The Trickster Mind and Neural Networks

When it comes to the need to know, the mind likes to play tricks. One distinguished psychologist describes the human brain as “a master of deception. It creates experiences and directs actions with a magician’s skill, never revealing
how it does so, all the while giving us a false sense of confidence that its products—our day-to-day experiences—reveal its inner workings.”1

One such deception has just been noted: the belief that we know more than we actually do. This fallacy has the knock-on effect of hiding from us the incorrect judgments we make based on our assumption of being knowledgeable. Let’s explore this further. Suppose we listen to a political debate and as a consequence decide to support one of the candidates. Describing events, we say to friends that this candidate’s arguments were superior and that won our allegiance. Maybe. However, it is quite possible the reverse occurred: we found the candidate’s arguments to be superior because of an unconscious allegiance we felt prior to the debate. Our notion that we carefully considered each candidate’s statements in the debate and only then developed a reasoned preference for one candidate over the other may be an illusion. We want to believe we made a reasoned choice after careful consideration of each candidate’s merits—that is what we are taught to do. It is what we believe a reasonable person does, and we consider ourselves reasonable. But in fact, any number of unconscious preferences relating to gender, ethnicity, voice, facial characteristics, etc. may have had more influence over our decision.

To understand how the trickster element of our mind is so powerful, it helps to understand how the brain’s neural networks operate. These networks are formed by connections among neurons. The more frequently the connections are activated, the stronger or more firmly “wired” together they become. Every idea is represented by a set of neurons. Each set, in turn, is wired to other sets relating to the same topic.
For example, the set of neurons involved in liking a candidate may connect to the neurons involved in the willingness to make get-out-the-vote phone calls on his or her behalf, and both ideas may link to the set of neurons involved in being willing to donate to the candidate’s campaign. Operating together, these sets form a more comprehensive network—or what social scientists often call a framework or frame.
1. Lisa Feldman Barrett, How Emotions Are Made: The Secret Life of the Brain (New York: Houghton Mifflin Harcourt, 2017), 278.
We have an incalculably large number of neural networks. They mediate everything we think about, regardless of how right or wrong that thinking might be. Different frameworks can also be connected to one another. Activating one may trigger activation of another. For instance, activating thoughts about international labor policy may bring to mind thoughts about immigration policy (or vice versa) since policies in the two areas may affect one another. Or, thoughts about a candidate’s ethnicity may activate thoughts about his or her position on immigration, which in turn may activate thoughts about the candidate’s trustworthiness.

As the neural connections involved in our image of a candidate—say, the belief that the candidate is caring, is of the “right” gender, and is trustworthy and intelligent—are reinforced by various sources of information, they become increasingly difficult to change. Opinions we encounter in sources that reinforce our image of the candidate are likely to be treated as “facts” and further strengthen these neural connections. Opinions inconsistent with our image, on the other hand, will be ignored or interpreted as inaccuracies or lies, especially if those opinions come from a disliked news channel or disliked candidate. Indeed, when negative statements about our preferred candidate come from disliked sources, those statements reinforce our belief in our chosen candidate’s integrity. If we believe our candidate is an honest person, then we feel that sources saying that (for example) our candidate is dishonest must be mistaken at best or lying at worst. We feel no need to fact-check claims that suggest our candidate has told a lie.

This is also why media habits are hard to alter. Viewers of Fox News won’t watch MSNBC news and vice versa. The neural processes supporting these news consumption habits are difficult to change.
People prefer the comfort of being told they are right to the discomfort of being told they are wrong and need to reassess their position. A powerful jolt is required to rewire media consumption habits. That jolt is unlikely to originate with our wired-in news channel preferences.
Strength in Connections

Because a given idea lives within a network of related ideas, any effort to introduce a new idea or modify an existing one must consider all linked ideas, not just the single idea to be changed. The entire system of ideas needs to be addressed. Suppose you support relatively unfettered admission of asylum seekers to the country, and you are arguing with those who fear that admitting asylum seekers will result in higher crime rates. In this situation, it is not enough simply to assert that allowing asylum seekers into the country will not increase crime rates. Your argument needs to consider other, related ideas in the framework, such as contributions immigrants make to (or burdens they place on) the economy, their participation in our military, the complexity of border security, the humanitarian value of a welcoming community, etc.

And ideas exist in more than one network, just as people do. Consider the multiple networks you belong to: networks of family, friends, online communities, coworkers, and so forth, each having some influence on you and through you on other people and groups you belong to. When trying to make a change, these related frames must also be considered.

A health-related example may be helpful. Some time ago, my firm, Olson Zaltman, conducted a smoking cessation study for a state department of health. This study revealed that many active smokers accept evidence that cigarette smoking is harmful to them. They continue smoking not simply because of the power of nicotine addiction but because of their strong attachments to other people who happen to be heavy smokers and because of their attachments to the settings—at work, church events, and other social situations—where their valued interactions with those people occur. Constantly being with smokers in these settings adds an impossible hurdle to changing their own behavior. To give up smoking would require giving up the valued social contacts and settings.
Many smokers choose not to pay so high a price. The framework of ideas about proper health behaviors loses when it clashes with frameworks about the value of a specific job, a church community, or feelings about the importance of close friendships. Once that fact was realized, the health department’s personnel developed a program of “quitting buddies” to create alternative rich social connections for those seeking to quit—an effort that proved successful.
Back to Feelings

How do thoughts encountered in the news, at political events, in personal conversations, and elsewhere make their way into our unconscious thinking as “facts”? A few factors involved in this process are introduced below. Note the importance of feelings:

1. Frequent exposure to an idea helps embed it in our thinking. A single tweet by the president about a “witch hunt” can be rebroadcast multiple times a day by various news networks. This repetition gives the idea greater salience and power. Repeated information becomes more influential when we are told it originates with an anonymous “insider” source, when we forget the idea’s original source, or when we are unaware it is an attempt to influence us.

2. The more distinctive an idea is, the more likely it is to stick with us and exert a silent influence. A novel twist in how an idea is conveyed—for example, calling Central American migrants traveling north a “caravan,” a term with its own unique associations—can make a cautionary message more impactful.

3. An idea or message that is personally relevant is more apt to be accepted. A claim that immigrants cause crime is more believable to people who have been victimized in the past (regardless by whom) than to those who haven’t. Activation of a past incident of victimhood raises thoughts of future vulnerability.

4. The capacity of a message to trigger stories increases its influence. Suggesting immigrants are gang members and generally desperate can be effective even when one has never been a victim but simply knows of someone who has been victimized—even if one does not know that person personally. The victim’s experience, relayed by a newscast or by a mutual acquaintance, becomes a borrowed story, one that gets replayed in our mind and shared with others. And, as the triggered story finds an active home in our memory, it becomes more personally relevant.

5. If an idea is enhanced by a mood, its impact will be greater.
Cues in a message can induce certain moods. A tense movie scene becomes more frightening with anxiety-raising background music. A somber mood is established when a political ad has glimpses of coyotes skulking around at night as a crowd of poorly clad people sneak under a border fence.
6. Finally, the more powerful the emotional content of a message, the greater its persuasive impact. Political attack ads often create feelings of physical vulnerability and an imminent sense of losing something highly valued, such as medical insurance or a retirement nest egg.

Part I Summary

Well-entrenched patterns of thought are hard to change. It usually takes an unavoidable reason to question core thinking before we will engage in the arduous task of reassessment. In a world that puts many demands on our time and energy, it is easier to automatically conclude that a preferred position is correct than to fact-check its supporting claims. We cling to the illusion of being consciously rational when we are not. We imagine that we examine facts first, then develop an informed preference or feeling, and then act, when the reverse is more common—we tend first to feel an alleged fact is or is not correct, and then assess it, using those feelings as a filter. Then we act. Or, we use feeling as a basis for acting and then rationalize our action as the last step. Either way, feelings often override rational thinking. Rational thinking is a follow-on process whose veto power over our feelings and actions is not strong.

In short, the factuality of information is not always, or even usually, central to how minds operate. Many factors, most of which operate below awareness, affect the processing of information. Very deliberate efforts are required to bring these to consciousness, where they can be approached in a systematic way. This is the focus of Part II.

Part II: Opening Minds

A few conversation prompts are introduced here to facilitate discussing a topic on which you and others in a discussion disagree. The intent is not to resolve a conflict, negotiate an acceptable compromise, or convince someone you are right and they are not. There are other, established procedures for those actions.
What is aimed for here is a preliminary step, which remains a worthy end in itself: the respectful sharing of positions. In the end, all of you involved may remain far apart. However, it is quite likely each other’s thinking will seem less foreign and your feelings of “strangerhood” less pronounced.
To set the stage for your conversations, let’s review a few conditions and expectations.

• Civility. Conversations require civility—courtesy and tolerance. Like the aquarium in a dentist’s office, civility reduces anxiety. It facilitates appraising and reappraising what is true and false, which may in turn modify how and what we think. However, the intent is simply to make it easier for everyone to learn more about each other’s thoughts and lessen feelings of strangerhood.

• Reciprocity. Everyone should be afforded the opportunity to express their thinking and have it heard in a respectful, open manner.

• Openness. Although it’s not always in evidence, I believe the capacity for openness is widespread. It is rooted in two basic human qualities: curiosity and the need to know. Building on these two qualities requires effort, and waiting for the results takes patience.

• Acknowledgment of Feelings. Expect strong feelings during discussions. After all, thinking can be deeply personal. Our thinking defines who we are. When we are challenged, it is natural and even healthy to have defensive and/or combative feelings. Experiencing these feelings can provide valuable self-insight. However, they can also be contagious. So, guard against an argument-is-war mentality in which you see another’s ideas as positions to be obliterated and arguments to shoot down.

• Intransigents. Finally, there are people who are not curious and who don’t care about civility or having constructive discussions. There are no magic wands to wave to remedy this. I don’t suggest persisting for very long with such folks. Fortunately, they are not nearly as numerous as opinion polls, video clips of political rallies, and coffee shop interviews by newscasters might suggest.

Guardrails

Guardrails are restrictions to consider when forming judgments, making decisions, and taking action. They arise from laws, custom, and personal codes of
conduct. They help keep us from going too far or not far enough. Lack of guardrails can lead to chaos, while too many can stifle freedom. Some guardrails are formal and explicit, like budgets, which are guardrails for spending. With some guardrails, there is a degree of flexibility. For instance, test scores are a guardrail in college admissions, but the admissions committee has the freedom to determine what weight should be given to the scores—introducing flexibility. Similarly, a judge has discretion within a set of guidelines (guardrails) when sentencing a convicted felon.

Guardrails may be openly debated—and are, whether in setting budgets, establishing what should be taught in schools, or determining at what level social welfare nets should be set. Guardrails may also be unconscious. For example, we often use automatic and, when asked, hard-to-explain criteria when judging if someone is trustworthy. Because so much thinking occurs below awareness, an approaching or actual violation of a guardrail may be required to make us aware it exists—something that makes us say, “When I heard about that, an alarm bell suddenly went off in my head.”

Being Different Yet Similar

Two guardrails should be kept top of mind when discussing differences in opinions and beliefs. A very simple example will illustrate them. Like the nakedness of the emperor in the story of the emperor’s new clothes, these guardrails are very obvious when pointed out, but are often pointedly ignored.

Guardrail 1. Acknowledge that people differ. Some people love strawberry ice cream, others hate it—that’s just a fact of life. No amount of argument or intervention will change it. People’s opinions on social issues—our concern here—may be more malleable, but change comes slowly. You are likely to continue to differ during your discussions.
Guardrail 2. Acknowledge that people share similarities. When people do differ, it is with respect to a deeper, underlying common denominator, which is the yardstick against which we measure differences. It is important to seek out this common denominator. Going beyond conflicting thoughts about strawberry ice cream is likely to reveal an underlying agreement about the importance in life of a tasty treat. Knowing this common ground exists—tasty treats have an important role in life—makes it easier to share disagreements about a specific treat.

These two guardrails present a paradox, but not a contradiction. The presence of one does not preclude the other. Differences between people, however, are easy to spot, whereas finding the more important, shared values from which surface-level differences emerge requires effort. Surface-level differences also cause a wariness that makes people unwilling to persevere in seeking out shared beliefs. Guardrail 2 is important because it helps us keep a shared common goal or value in mind while trying to understand different ways of achieving it.

Conversation Prompts

The following prompts can help people who come together for conversation achieve a deeper understanding of their own and one another’s thinking. They are illustrative and not meant to exclude others. Those introduced here should be expressed in your own way. How would you state them to help stimulate fruitful discussion?

Prompt 1. Where Do You Stand?

Imagine you’re talking with friends, and the topic of a nursing strike at the nearby hospital comes up. The conversation is animated—but it’s a subject you haven’t thought much about. You don’t have a firmly established opinion. Should you sit this one out? On the contrary: such conversations are an important source of learning. Your lack of familiarity with an issue may actually give you greater license to ask challenging questions. This is “constructive naivete,” and I encourage using it.
Sometimes, however, unconscious values and forgotten experiences lead us to a specific position on a topic without our being aware of the process. For instance, you’ve undoubtedly experienced a situation where you are asked about a topic you don’t recall thinking about. Your initial response might be, “Hmm. I haven’t really thought about that before.” However, you then proceed to comment anyway and discover you do have strong feelings about the topic after all. In these situations, it is helpful to pause before jumping in and consider the following question: What side am I on, and do I know it?

Unsurprisingly, we possess many facts, opinions, and beliefs that we’re unaware of. This hidden trove of facts and feelings leads us to take sides on an issue without conscious deliberation. This happened to me recently when a friend asked for my thoughts about an idea I had not encountered before. I surprised myself by proceeding to list a number of detailed, one-sided thoughts about it. It would have been more constructive for both of us if I had waited a moment or two and thought:

• Have I already taken a position?

• If so, am I clear on what it is and what it is not?

Again, the absence of a clear stance is a legitimate position to have since it is impossible to be well informed on most topics. This should not keep you from joining a conversation with people who are more informed. Active listening and constructive questioning are worthy roles in a conversation. However, be prepared to discover you do have a stance after all, and if you do, make an effort to gain clarity about it.

Prompt 2. The Best Opposing Argument

A constructive conversation starter for people who are in disagreement is this question: What is the soundest argument those who disagree with you have for their position?
This way of beginning a discussion increases openness in the ensuing exchanges. It also makes it easier to follow Guardrail 2, which encourages finding shared beliefs that lie below conflicting ideas. When someone genuinely feels there is no sound opposing argument, questions like these can help:

What evidence, if it did exist, could lend at least some support to the arguments you disagree with?

Or

What information would you have to see, as unlikely as it may be that such information exists, that would cause you to say, “You do have a point”?

Infrequently, a situation arises when someone cannot honestly imagine any such evidence. This may occur when strong religious beliefs, for example, are associated with a position. Persisting with something like this helps:

Okay. Let’s be really creative. As improbable as it may be, what do you imagine suddenly being uncovered that would lend a hint of support to the most frequent argument you hear from them? Don’t be reluctant to speculate wildly.

The beachhead of flexibility such a question establishes will carry over into the subsequent conversation.

Prompt 3. Knowledge Disavowal and Premortems

All of us sometimes feel that we know enough about something to know we don’t want to know more. Hence, we hear phrases like, “Let sleeping dogs lie,” “Ignorance is bliss,” and “What you don’t know won’t hurt you.” These folk sayings help us justify not investigating something further. Knowing more might require an unpleasant action or one that may not matter in the end. “So, why bother?” we say to ourselves.
Knowledge disavowal also takes the form of “plausible deniability.” In this case a leader is informed about a matter, but only up to a certain point. That way, he or she can deny having sufficient information to be held accountable. “Avoidance of disconfirming evidence” is another form of knowledge disavowal. Here, our existing frameworks and mental models mentioned in Part I keep us away from information that might contradict our thinking. We don’t like to be wrong, so we avoid exposure to information suggesting we are in error.

Knowledge disavowal is a challenge. It works against Guardrail 1, because if we avoid information that troubles us, we’re less likely to come in contact with people who think differently from us. This is a problem in the other direction, too: if we avoid people who think differently and the information they use, we avoid the discomfort of having our opinions and beliefs questioned. I call this “information dodgeball.” We avoid being “hit” by evidence against our position. (https://www.geraldzaltman.com/readers-corner/) We silently think, “Who needs it? Not me. I’ve got what I need.”

This raises the idea of a premortem. If a postmortem is an analysis of an event after it happens to determine what went wrong, a premortem is a preemptive analysis before the event. It involves asking how decisions or actions might be ill-advised before they are taken. Let’s say you’re thinking of buying your neighbor’s car, which she’s offering to sell you very cheaply. In a premortem, you imagine what might possibly go wrong if you do, assess the risks, and correct for them if possible. When we’re dealing with ideas, a premortem makes the possibility of disconfirming evidence more salient and creates a greater openness to hearing such information later in a discussion. In a premortem, each participant describes their view of the most likely positive and negative outcomes if (1) their position prevails and (2) the opposing position prevails.
The prompts might be like this: Let’s say your position is correct and is acted upon. What is the most beneficial result that would occur? What is the most harmful thing that could happen?
Now, let’s say your position is put into effect but turns out to be in error. What consequences are most likely to arise, or what problems would you most likely have to fix? Does the information you are using address these questions? Is that information available?

To take an example, what are the consequences of acting on the judgment that humans are a major cause of climate change? What are the consequences of acting on that judgment if it is wrong? What are the consequences of acting on a judgment that humans do not contribute to climate change, both if that judgment is correct and if it is wrong? Do the consequences in each case have their own unique effects? Are the consequences of being wrong easier or harder to fix in one case than the other?

Prompt 4. Assumptions

When using conversation prompts, be attentive to assumptions. Assumptions are automatic, unspoken judgments or positions. Since they generally live below our conscious awareness, we tend to accept them as true with little if any critical examination. They are also numerous. Assumptions often carry some reasonable likelihood of being wrong. As the saying goes, “It is not what you know that gets you in trouble, it’s what you know that ain’t so.”

When identifying and then challenging assumptions, either your own or someone else’s, the goal is not to trip someone up or to avoid being tripped up by others. Instead, in the spirit of openness and satisfying your need to know or understand, the goal is to learn more about your own and others’ thinking. Think of it as a process of scouting out a terrain, looking for interesting, important, and previously unseen features. It is not a process in which you are setting traps. Assumptions are found in many corners of a disputed issue. Try asking yourself the following:

• Who is most affected by your position on the issue? By the opposing position being argued?
• Who stands to gain and lose depending on which position prevails?

• Whose position is given the most attention?

• What sources of information are most/least reliable regarding the issue?

• How clearly defined is the problem being solved? Whose problem is it?

• How does the way a solution is implemented impact its effectiveness?

Exploring these questions is enlightening. You may identify some assumptions that appear obvious, while others may seem absurd. Still others may spur you to think more deeply. These can be likened to the answer to a riddle or the punchline of a joke: they are attention-getting surprises that redirect our attention in unanticipated ways.

Part II Summary

Part II began with conditions that encourage the exchange of ideas: civility, reciprocity, openness, and acknowledgment of uncomfortable feelings when challenged. It was also noted that some people are just not curious and have little need to understand how they and others think. Conversations with them are unlikely to be productive.

Two guardrails were introduced to keep in mind during conversations. The first is an acknowledgment that people differ. We are not clones of one another in our thinking. People’s differences on important issues are a source of learning and a basis for improving social policies. The second guardrail is an acknowledgment that even when people differ, they usually do so with respect to important shared values and beliefs. For instance, different state governments have different laws on child adoption procedures, voting registration, gambling, etc., and the residents of the various states may have wildly differing opinions on those laws. But all states and nearly all their residents agree that it is important to have and follow laws and to have procedures for changing them. Without this fundamental agreement on respect for the law, life would be chaotic.
The guardrails should be kept top of mind when having conversations with family, friends, coworkers, and strangers. Several prompts for more constructive conversations were introduced to facilitate sharing and understanding each other’s thinking about a contentious matter. The first prompt is intended as a self-check, asking if you have taken a position on an issue without realizing it. The second asks you and a conversation partner to describe what each of you feels is the other’s best argument. The third explores the tendency to avoid disconfirming evidence and encourages us to imagine the consequences of both the success and failure of one’s own and an opposing position. The last prompt addresses the importance of identifying hidden assumptions. Such assumptions may reveal further differences. They may also reveal deep, shared commonalities.

Conclusion

This essay is intended for anyone concerned about a growing estrangement in thinking between themselves and others they respect and care about on issues of great importance to society. It is not an essay about converting others to our positions. It simply suggests that better conversations are possible and offers some guidelines for increasing mutual understanding. The mutual benefit that will arise from that understanding includes the establishment of necessary groundwork for resolving conflicts, negotiating compromise solutions, or even gaining converts.
Note to Readers: This document draws from many published sources and critical reading by colleagues. Readers are encouraged to consult Gerald Zaltman, Unlocked: Keys to Improve Your Thinking (Amazon, 2018). Available at https://amzn.to/2JVnCpk The author may be contacted at firstname.lastname@example.org.