At first glance, I thought that Alan Deutschman's new book, Walk the Walk: The #1 Rule for Real Leaders, was an exercise in belaboring the obvious. Just as Malcolm Gladwell's book Blink can be reduced to "trust your gut," I thought Deutschman's premise that top CEOs and leaders need to "walk the walk, not just talk the talk," was too obvious to warrant repeating, let alone 176 pages of discussion. But after reading both the book and the business section pages over the past few days, I've decided I was wrong. On two fronts.
First, it appears that Deutschman's premise about the importance of management being authentic, honest, and not asking anyone beneath them to meet any standard or make any sacrifice they're not prepared to meet or make themselves is clearly not as obvious or widely understood as I once might have thought. Take yesterday's column by David Carr of the New York Times about the management at the Tribune Company arguing to a bankruptcy court--after leading the company into bankruptcy (in no small part because of a badly conceived, heavily leveraged purchase that left the company saddled with debt) and depriving more than 2,000 employees of jobs--that the managers should be awarded between $45 million and $60 million in performance bonuses. The bonuses are necessary, the company's lawyers argued, because getting a company out of bankruptcy is hard work, and "not being rewarded for hard work and hard effort is demotivating."
No kidding. As Carr says, tell that to the 2,000 journalists and other Tribune personnel whose reward for hard work and hard effort was the elimination of their jobs.
The stunning obliviousness of the Tribune management reminds me of a definition I heard a long time ago for "chutzpah": someone who kills his or her parents and then pleads mercy from the court because he or she is an orphan. Run a company into bankruptcy, and then plead with the court that running a bankrupt company is hard, so you need extra money to do it. That takes ... well, chutzpah. Among other things. Not to mention the fact that $60 million (if all the management performance numbers were met) would give every laid-off staff person $30,000 apiece. Think of the products the Tribune could actually produce for that amount of money.
Compare that, for a moment, to some of the military and business leaders Deutschman uses as examples--from Alexander the Great, who took more hits on the front line than any of his soldiers, to Norman Schwarzkopf, who insisted that officers in his command eat the same food and meet the same fitness standards as the troops they commanded. Or Bill Hewlett of Hewlett-Packard, who Deutschman says made every employee, including himself and his entire top management team, take every 10th day off without pay, rather than laying off any employees in the recession of 1970.
Another point Deutschman makes is that a great leader has, in the words of Urban Meyer, head football coach at the University of Florida (where Tim Tebow plays), "the ability to make the level of play of everyone else around him better." Again, a seeming statement of the ridiculously obvious. But consider this piece on Bank of America's outgoing CEO (and former chairman) Ken Lewis, who announced last week that he was retiring--although he said he'd stay on through December because a successor wasn't waiting in the wings. And why wasn't a successor waiting in the wings? Because, according to the article's author, Joe Nocera, Lewis "brutally fired many of the firm's most talented executives, seemingly afraid to be surrounded by potential successors."
So, Lewis wasn't well liked, or good at nurturing or inspiring good performers around him. But not every leader has to be liked to be successful, right? Possibly. But they have to be respected, at least. And ... oh yeah, successful. Yet during his tenure, Lewis also made a series of less-than-profitable business decisions and purchases, including the purchase of the notorious mortgage disaster known as Countrywide Financial, not to mention the Merrill Lynch mess, that caused the stock to return negative 13 percent while he was in charge.
And yet, Nocera reported, Lewis has taken home $60 million in compensation over the past three years. Clearly, the idea that a good leader--one worth compensating obscenely well--should be someone who not only exceeds expectations but also inspires better performance in those around him and sacrifices with the troops, is not a patently obvious or well-understood idea at the top levels of Bank of America. Or among executives at any number of other financial institutions and corporations who have spent the last year boggling many people's minds with their tone-deaf and enduring sense of entitlement. So much so that the entitlement-laden gestures and complaints aren't even eyebrow-raising to many people at this point.
So maybe the more interesting question is: Are these executives beyond hope? Are really great leaders born, and these executives simply don't have what it takes? Or, even if great leadership traits can be learned, are they traits we have to learn in childhood, not at age 55? Or can these executives still be rehabilitated into better behavior and leadership?
Deutschman doesn't get into whether leadership traits are innate or acquired. But he does sketch out, at the end of his book, some traits that he believes are essential in a great "leadership personality": focus; empathy; relentless authenticity; belief not only in themselves, but also in others and in change itself; resilience; and dogged persistence.
Another person's list might differ. But I found the list interesting food for thought. For one thing, "empathetic" isn't generally the first word we hear when Wall Street and corporate titans are described. Brilliant, focused, ruthless, sharply analytic, and relentless, yes. But authentic and empathetic ... not so much. That might explain a lot. (Also ironic to see empathy given such big play in a business book, after all the argument about it in Sonia Sotomayor's confirmation hearings.)
But just for argument's sake, let's say Deutschman is right, and the traits he lists really are the essential prerequisites for a great executive or leader. Can they be taught in business school, or in a business setting? Or do we simply have to start looking for a different kind of leader in the first place?
Evolutionary psychologists are only beginning to look at how individual personality traits may evolve (as opposed to more basic domains of survival, sexuality, parenting, community, cooperation, and aggression). But a recent paper on the subject by David Buss, professor of psychology at the University of Texas at Austin, noted that "virtually all personality characteristics ... show heritabilities in the range of 50% and substantial cross-time stability, even over spans of decades."
Which still leaves 50 percent, of course. And education and training can certainly help strengthen or mitigate someone's natural tendencies. After all, belief in a person's ability to change is, itself, one of the traits Deutschman says great leaders possess.
On the other hand, if the traits Deutschman lists as important really do have a significant genetic component, and personality traits have substantial stability over time, then it might not do troubled executives much good to read Deutschman's book. But even if that's the case, it could still prove useful to the rest of us ... if only in underscoring the seemingly obvious fact that we might want to give a little more attention to the personality traits of who we hire to run things. Walking the walk, it turns out, is a lot harder, and rarer, than one might imagine.
She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was.
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
The condition has long been considered untreatable. Experts can spot it in a child as young as 3 or 4. But a new clinical approach offers hope.
This is a good day, Samantha tells me: 10 on a scale of 10. We’re sitting in a conference room at the San Marcos Treatment Center, just south of Austin, Texas, a space that has witnessed countless difficult conversations between troubled children, their worried parents, and clinical therapists. But today promises unalloyed joy. Samantha’s mother is visiting from Idaho, as she does every six weeks, which means lunch off campus and an excursion to Target. The girl needs supplies: new jeans, yoga pants, nail polish.
At 11, Samantha is just over 5 feet tall and has wavy black hair and a steady gaze. She flashes a smile when I ask about her favorite subject (history), and grimaces when I ask about her least favorite (math). She seems poised and cheerful, a normal preteen. But when we steer into uncomfortable territory—the events that led her to this juvenile-treatment facility nearly 2,000 miles from her family—Samantha hesitates and looks down at her hands. “I wanted the whole world to myself,” she says. “So I made a whole entire book about how to hurt people.”
U.K. police said at least 22 people are dead and 59 injured following the incident at Manchester Arena.
Here’s what we know:
—Greater Manchester Police said 22 people are dead and 59 injured following reports of an explosion at the Manchester Arena.
—Authorities are treating the explosion as a terrorist attack, believing the incident to be carried out by a lone male. The attacker, who reportedly detonated an explosive device, is said to have died at the arena.
—The venue was the scene of an Ariana Grande concert. British Transport Police said there were “reports of an explosion within the foyer area of the stadium” at 10.35 p.m. local time, but Manchester Arena said the incident occurred “outside the venue in a public place.”
—This is a developing story and we’ll be following it here. All updates are in Eastern Daylight Time (GMT -4).
Isabel Caliva and her husband, Frank, had already “kicked the can down the road.” The can, in their case, was the kid conversation; the road was Caliva’s fertile years. Frank had always said he wanted lots of kids. Caliva, who was in her early 30s, thought maybe one or two would be nice, but she was mostly undecided. They had a nice life, with plenty of free time that allowed for trips to Portugal, Paris, and Hawaii.
“I wasn’t feeling the pull the same way my friends were describing,” she told me recently. “I thought, maybe this isn’t gonna be the thing for me. Maybe it’s just going to be the two of us.”
At times, she wondered if her lack of baby fever should be cause for concern. She took her worries to the Internet, where she came across a post on the Rumpus’ “Dear Sugar” advice column titled, “The Ghost Ship that Didn’t Carry Us.” The letter was from a 41-year-old man who was also on the fence about kids: “Things like quiet, free time, spontaneous travel, pockets of non-obligation,” he wrote. “I really value them.”
For 15 years, the animation studio was the best on the planet. Then Disney bought it.
A well-regarded Hollywood insider recently suggested that sequels can represent “a sort of creative bankruptcy.” He was discussing Pixar, the legendary animation studio, and its avowed distaste for cheap spin-offs. More pointedly, he argued that if Pixar were only to make sequels, it would “wither and die.” Now, all kinds of industry experts say all kinds of things. But it is surely relevant that these observations were made by Ed Catmull, the president of Pixar, in his best-selling 2014 business-leadership book.
Yet here comes Cars 3, rolling into a theater near you this month. You may recall that the original Cars, released back in 2006, was widely judged to be the studio’s worst film to date. Cars 2, which followed five years later, was panned as even worse. And if Cars 3 isn’t disheartening enough, two of the three Pixar films in line after it are also sequels: The Incredibles 2 and (say it isn’t so!) Toy Story 4.
The office was, until a few decades ago, the last stronghold of fashion formality. Silicon Valley changed that.
Americans began the 20th century in bustles and bowler hats and ended it in velour sweatsuits and flannel shirts—the most radical shift in dress standards in human history. At the center of this sartorial revolution was business casual, a genre of dress that broke the last bastion of formality—office attire—to redefine the American wardrobe.
Born in Silicon Valley in the early 1980s, business casual consists of khaki pants, sensible shoes, and button-down collared shirts. By the time it was mainstream, in the 1990s, it flummoxed HR managers and employees alike. “Welcome to the confusing world of business casual,” declared a fashion writer for the Chicago Tribune in 1995. With time and some coaching, people caught on. Today, though, the term “business casual” is nearly obsolete for describing the clothing of a workforce that includes many who work from home in yoga pants, put on a clean T-shirt for a Skype meeting, and don’t always go into the office.
“Having a slave gave me grave doubts about what kind of people we were, what kind of place we came from,” Alex Tizon wrote in his Atlantic essay “My Family’s Slave.”
A thousand objections can be leveled against that piece, and in the few days since it was published, those objections have materialized from all quarters. It’s a powerful story, and its flaws and omissions have their own eloquence. For me, the most important failure is that Tizon seems to attribute Lola’s abuse entirely to another culture—specifically, to a system of servitude in the Philippines—as though he believes, This doesn’t happen in America. But that system is not only in America, it’s everywhere. It ensnares not only immigrants, but everyone.
New Orleans Mayor Mitch Landrieu explains to his city why four monuments commemorating the Lost Cause and the Confederacy had to come down.
Last week, the City of New Orleans finished removing four monuments—to Confederate President Jefferson Davis, Generals P.G.T. Beauregard and Robert E. Lee, and the postwar battle of Liberty Place. The removals occasioned threats, protests, and celebrations. On Friday, Mayor Mitch Landrieu explained to his city why he had concluded that the monuments needed to come down.
The soul of our beloved City is deeply rooted in a history that has evolved over thousands of years; rooted in a diverse people who have been here together every step of the way—for both good and for ill.
An anthropologist discusses some common misconceptions about female genital cutting, including the idea that men force women to undergo the procedure.
I recently had a conversation that challenged what I thought I knew about the controversial ritual known as “female genital cutting,” or, more commonly, “female genital mutilation.”
FGC, as it is abbreviated, involves an elder or other community member slicing off all or part of a woman’s clitoris and labia as part of a ceremony that is often conducted around the time that the woman reaches puberty. Many international groups are concerned about FGC, which is practiced extensively in parts of Africa and the Middle East and is linked to infections, infertility, and childbirth complications.
Organizations such as the United Nations have campaigned against the practice, calling for its abolition as a matter of global health and human rights. But despite a decades-old movement against it, FGC rates in some countries haven't budged. While younger women are increasingly going uncut in countries such as Nigeria and the Central African Republic, according to a survey by the Population Reference Bureau, in Egypt more than 80 percent of teenagers still undergo the procedure.