Book: Focus / Chapter 7

English text of the chapter

7

SEEING OURSELVES AS OTHERS SEE US

“We have a ‘No jerks allowed’ rule, but our chief tech officer is one,” an executive at a California tech incubator tells me. “He executes very well, but he’s a huge bully, freezes people out who he doesn’t like, plays favorites.”

“He’s got zero self-awareness,” she adds. “He just does not realize when he’s being a bully. If you point out to him he’s just done it again, he shifts the blame, gets angry, or thinks you’re the problem.”

The company’s CEO later told me, “We worked with him for another three months or so, and then finally had to let him go. He couldn’t change—he was a bully, and didn’t even see it.”

All too often when we “lose it” and fall back on a less desirable way of acting, we’re oblivious to what we do. And if no one tells us, we stay that way.

One surefire test for self-awareness is a “360-degree” evaluation, where you’re asked to rate yourself on a range of specific behaviors or traits. Those self-ratings are checked against evaluations by a dozen or so people whom you have asked to rate you on the same scale. You pick them because they know you well and you respect their judgment—and their ratings are anonymous, so they can feel free to be frank. The gap between how you see yourself and how the others rate you offers one of the best evaluations you can get anywhere of your own self-awareness.

There’s an intriguing relationship between self-awareness and power: There are relatively few gaps between one’s own and others’ ratings among lower-level employees. But the higher someone’s position in an organization, the bigger the gap.1 Self-awareness seems to diminish with promotions up the organization’s ladder.

One theory: That gap widens because as people rise in power within an organization the circle shrinks of others willing or courageous enough to speak to them honestly about their quirks. Then there are those who simply deny their deficits, or can’t see them in the first place. Whatever the reason, tuned-out leaders see themselves as being far more effective than do those they are guiding. A lack of self-awareness leaves you clueless. Think The Office.

A 360-degree evaluation applies the power of seeing ourselves through the eyes of others, which offers another pathway to self-awareness. Robert Burns, the Scottish poet, praised this pathway in verse:

Oh that the gods
The gift would gi’e us
To see ourselves
As others see us.

A more sardonic view was offered by W. H. Auden, who observed that, so “I may love myself,” we each create a positive self-image in our minds by selective forgetting of what’s unflattering to us and recalling what’s admirable about us. And, he added, we do something similar with the image we try to create “in the minds of others in order that they may love me.” And philosopher George Santayana took this full circle, by noting that what other people think of us would matter little—except that once we know it, it “so deeply tinges what we think of ourselves.”

Social philosophers have called this mirroring effect the “looking glass self,” how we imagine others see us. Our sense of self, in this view, dawns in our social interactions; others are our mirrors, reflecting us back to ourselves. The idea has been summed up this way: “I am what I think you think I am.”

THROUGH OTHERS’ EYES—AND EARS

Life affords us little chance to see how others really see us. That may be why the course Bill George teaches at Harvard Business School, called Authentic Leadership Development, is among the most popular, overenrolled every time it is offered (the same goes for a similar course at Stanford’s business school). As George told me, “We don’t know who we are until we hear ourselves speaking the story of our lives to someone we trust.”

To expedite that heightening of self-awareness, George has created what he calls “True North Groups,” with “True North” referring to finding one’s inner compass and core values. His course gives students the chance to be in such a group. A precept of the groups: self-knowledge begins with self-revelation. These groups (which anyone can form) are as open and intimate as—or even more so than—twelve-step meetings or therapy groups, according to George, providing “a safe place where members can discuss personal issues they do not feel they can raise elsewhere—often not even with their closest family members.”2

It’s not just seeing ourselves as others see us. There’s also hearing ourselves as others hear us. We don’t.

The journal Surgery reports a study where surgeons’ tone of voice was evaluated, based on ten-second snippets recorded during sessions with their patients.3 Half the surgeons whose voices were rated had been sued for malpractice; half had not. The voices of those who had been sued were far more often rated as domineering and uncaring.

Surgeons spend more time than most other physicians explaining technical details to their patients, as well as disclosing the worst risks of surgery. It’s a difficult conversation, one that can put patients into a state of high anxiety and a heightened vigilance to emotional cues. When it comes to the patient listening to the surgeon explain the technical details—and the frightening potential risks—the brain’s radar for danger goes into high alert, searching for cues and clues to how safe all this really might be. That heightened sensitivity may be one reason the empathy or concern—or rather, the lack of either—conveyed in a surgeon’s tone of voice tends to predict whether he will be sued if something goes wrong.

The acoustics of our skull case render our voice as it sounds to us very different from what others hear. But our tone of voice matters immensely to the impact of what we say: research has found that when people receive negative performance feedback in a warm, supportive tone of voice, they leave feeling positive—despite the negative feedback. But when they get positive performance reviews in a cold and distant tone of voice, they end up feeling bad despite the good news.4

One remedy proposed in the Surgery article: give surgeons an audio replay of their voice as they talked to patients, so they can hear how they sound and get coaching on ways to make their voice communicate empathy and caring—to hear themselves as others hear them.

GROUPTHINK: SHARED BLIND SPOTS

In the wake of the economic meltdown of investment vehicles based on subprime derivatives, a financial type whose job had been creating those very derivative instruments was interviewed. He explained how in his job he would routinely take huge lots of subprime mortgages and divide them into three tranches: the best of the worst, the not-as-good, and the worst of the worst. Then he would take each of the tranches and again divide it into thirds—and create derivatives for investments based on each.

He was asked, “Who would want to buy these?” His reply: “Idiots.”

Of course, seemingly very smart people did invest in those derivatives, ignoring signals that they were not worth the risk, and emphasizing whatever might support their decision. When this tendency to ignore evidence to the contrary spreads into a shared self-deception, it becomes groupthink. The unstated need to protect a treasured opinion (by discounting crucial disconfirming data) drives shared blind spots that lead to bad decisions.

President George W. Bush’s inner circle and their decision to invade Iraq based on imaginary “weapons of mass destruction” offers a classic example. So do the circles of financial players who fostered the mortgage derivatives meltdown. Both instances of catastrophic groupthink entailed insulated groups of decision-makers who failed to ask the right questions or ignored disconfirming data in a self-affirming downward spiral.

Cognition is distributed among members of a group or network: some people are specialists in one area, while others have complementary strengths of expertise. When information flows most freely among the group and into it, the best decisions will be made. But groupthink begins with the unstated assumption We know everything we need to.

A firm that manages investments for very wealthy people gave Daniel Kahneman a treasure trove: eight years of investment results for twenty-five of its financial advisers. Analyzing the data, Kahneman found that there were no relationships between any given adviser’s results from year to year—in other words, none of the advisers was consistently any better than the others at managing the clients’ money. The results were no better than chance. Yet everyone behaved as though there were a special skill involved—and the top performers each year got big bonuses.

His results in hand, Kahneman had dinner with the top brass at the firm and informed them that they were “rewarding luck as if it were skill.” That should have been shocking news. But the executives calmly went on with their dinner and, Kahneman says, “I have no doubt that the implications were quickly swept under the rug and that life in the firm went on just as before.”5 The illusion of skill, deeply embedded in the culture of that industry, was under attack. But “facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed,” he adds.

Back in the 1960s, as the civil rights movement was boiling in the South, I joined a picket line at a local grocery store in my California hometown that did not then hire African-Americans.
But it was not until years later, when I heard about the work of John Ogbu, a Nigerian anthropologist then at the University of California, Berkeley—who came to my nearby town to study what he called its “caste system”—that I realized there was one, a kind of de facto segregation.6 My high school was all-white, with a sprinkling of Asians and Hispanics; another high school was mostly black, with some Hispanics; the third was a mix. I had just never thought about it.

When it came to the grocery store, I could readily see their part in discrimination—but I was blind to the larger pattern I was enmeshed within, the overall social ladder inherent in where people lived, and so where they went to school (in those days). Inequity in a society fades into the background, something we habituate to rather than orient toward. It takes effort to shift it back into our collective focus.

Such self-deception seems a universal twist of attention. For instance, when drivers rated their abilities behind the wheel, about three-quarters thought they were better than average. Strangely, those who had been in an auto accident were more likely to rate themselves as better drivers than did those whose driving record was accident-free. Even stranger: In general, most people rate themselves as being less likely than others to overrate their abilities. These inflated self-ratings reflect the “better-than-average” effect, which has been found for just about any positive trait, from competence and creativity to friendliness and honesty.

I read Kahneman’s account in his fascinating book Thinking, Fast and Slow while on a Boston-to-London flight. As the plane landed I chatted with the fellow across the aisle, who had been eyeing the cover. He told me he planned to read the book—and happened to mention that he invested the assets of wealthy individuals. As our plane taxied down the long runway and found its way to our gate at Heathrow, I summarized the main points for him, including this tale about the financial firm—adding that it seemed to imply his industry rewarded luck as though it were skill.

“I guess,” he replied with a shrug, “I don’t have to read the book now.”

When Kahneman had reported his results to the money managers themselves, they responded with a similar indifference. As he says of such disconcerting data, “The mind does not digest them.”

It takes meta-cognition—in this case, awareness of our lack of awareness—to bring to light what the group has buried in a grave of indifference or suppression. Clarity begins with realizing what we do not notice—and don’t notice that we don’t notice.

Smart risks are based on wide and voracious data-gathering checked against a gut sense; dumb decisions are built from too narrow a base of inputs. Candid feedback from those you trust and respect creates a source of self-awareness, one that can help guard against skewed information inputs or questionable assumptions. Another antidote to groupthink: expand your circle of connection beyond your comfort zone and inoculate against in-group isolation by building an ample circle of no-BS confidants who keep you honest. A smart diversification goes beyond gender and ethnic group balance to include a wide range of ages, clients, or customers, and any others who might offer a fresh perspective.

“Early on in our operation, our servers failed,” an executive at a cloud computing company says. “Our competitors were monitoring us, and soon we got a flood of calls from reporters asking what was going on. We didn’t answer the calls, because we didn’t know what to say.

“Then one employee, a former journalist, came up with a creative solution: a website called ‘Trust Cloud’ where we were completely open about what was happening with our server—what the problem was, how we were trying to fix it, everything.”

That was a foreign idea to most executives there; they had come from tech companies where heightened secrecy was routine. The unquestioned assumption that they should keep the problem to themselves was a potential seed of groupthink.

“But once we became transparent,” the executive says, “the problem went away. Our customers were reassured they could know what was happening, and reporters stopped calling.”

“Sunlight,” as Supreme Court justice Louis Brandeis once said, “is the best disinfectant.”
