How on earth did Oprah Winfrey--an unlikely media mogul if there ever was one--get so popular, powerful and rich, all at the same time? Ever since she announced at the end of last week that she was walking away from her legendarily popular syndicated talk show in 2011, the airwaves have been abuzz with discussion about what it means, what impact it will have ... and how she managed to get this successful in the first place.
In yesterday's New York Times, columnist David Carr argued that Oprah Winfrey should be studied in every business school in America--not only for the smart moves she made, but also for the mistakes she didn't make. She didn't go public with her company, so she retained control. She didn't lend her name to other people's products. When she decided to add a magazine to her stable, she created her own, with such a clear sense of branding that she put herself on each and every cover. She didn't use her wealth to invest in fields she knew nothing about. Oprah did extend her brand into new shows, from Dr. Phil to Rachael Ray, but her offshoots all had the same feel and market as the mothership. And she didn't try to cash in on every possible profit opportunity, including the success of the books she turned into overnight bestsellers.
Oprah's branding success, according to Vogue editor Anna Wintour, came from steering her business through "personal choices," like a woman who has an enviably clear and innate sense of what looks good on her. Which is undoubtedly true. But that complicates the matter of how one would teach or replicate Oprah's success in b-school.
Oprah Winfrey, after all, gives a whole new meaning to the "Chicago School" of economics. A meaning that would make Milton Friedman, the father of the adage "the purpose of business is to make as much money as possible for shareholders," turn over in his grave. Oprah never took on shareholders, of course, which simplified the matter. But, still. In an era when the primacy of the bottom line went unquestioned, Oprah gave away cars, eschewed commissions on products she made popular, and turned down the short-term money that going public or selling the company could have brought. And made $2.3 billion as a result.
Scholars could parse all her decisions for wisdom about brand management, risk, leadership, growth strategies, marketing, and internal R&D investment. They might even find places where her success seemed to illustrate well-known models or schools of thought. Someone is probably working on it right now, as a matter of fact. Which is all well and good, because there's certainly a lot of wisdom that can be gleaned from the story of Oprah's successful climb from a local Chicago talk show host to the CEO of her own production company and network, while becoming a seismic cultural force and, arguably, the most powerful and wealthy woman in America.
The trouble is, Oprah's success isn't just the sum of her strategies. The engine that not only drove those particular strategies, but also made them successful, was a deep sense of identity, authenticity, and purpose that can't be imitated or crafted through method. If Oprah has a deep and guiding understanding of her audience, it's not because she's methodically observed them. It's because she's lived their struggles, hopes, joys and sorrows. And those struggles gave her first a connection, and then a purpose, from which all other decisions organically flowed.
In the world of Silicon Valley, it's said there are two types of entrepreneurs: missionaries and mercenaries. Mercenaries can make a lot of money if they're smart and have good strategies. But missionary entrepreneurs are the ones who change industries and the world--not only because they continue on no matter how hard the going gets, but because they bring to bear an irresistible combination of passion, authenticity, and a sense of purpose bigger than mere profit or themselves. Success, for them, is as much about impact as it is about profit. Which is, ironically, how many of them become incredibly profitable.
Clearly, Oprah is a missionary entrepreneur. But how do you teach someone to be a successful missionary? Even Polonius' advice to Laertes, "to thine own self be true" is insufficient. If asked, I suspect Oprah would say that first you have to learn who you are, where you came from, how that affects and informs you, and what matters in the world. You also have to care about something bigger than yourself, and imagine a way in which your particular skills could allow you to make a difference in that area. And whether you seek that path out, or stumble upon it along the way, you have to care about making that difference enough that the vision of it keeps you going through the dark, and can act as a compass to steer your decisions along the way.
Add to that some smarts, savvy, and sharp thinking about content, brand management, marketing, and growth, and you have a legend in the making. But those last bits are the only pieces that can be taught. Honest self-knowledge, authenticity, passion and purpose are harder to acquire. Most often, they emerge from battles fought in the midnights of our solitude, if we manage to scrounge up the courage to face what we find there.
But if you can't teach the intuition that emerges from those internal journeys, you can at least teach its importance. Asking "what would Oprah do?" might not be a bad exercise when contemplating tough or tempting business options. It's not a quantifiable model, of course, and the results can't be proven. But it wouldn't be a bad placeholder while encouraging students to explore enough about themselves and the world to develop a true-steering compass and passionate purpose of their own.
Note: I will be offline for the next week, returning Friday, December 4th.
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for “dude,” or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ’aughts, became known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but the series’s creator, J.K. Rowling, dutifully took to Twitter to announce what amount to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
An image of a small child evokes an unfathomably huge tragedy.
I had just dropped my son off at daycare when I opened Twitter and came across a photo that over the next 24 hours would become a totem of the refugee crisis in Europe and the Middle East, and the blight that is the Syrian civil war. The picture would quickly reappear, this time as an earnest social-media meme, at a meeting of dithering UN officials and a gathering of unfeeling Arab leaders: a small Syrian boy in a red shirt, blue shorts, and worn shoes, lying face down in wet sand, his head cocked to one side along a gray, glistening shoreline, his lifeless hands cupped upwards, his knees slightly bent.
My first reaction was despair. My second was: My son sleeps just like that.
The attention this photo has received has generated discomfort as well as indignation—for understandable reasons. There are important ethical questions surrounding the taking or sharing of photos of children, dead or alive, in the media, including questions about the intent of the sharers and the consent of the subject. The scale of the Syrian tragedy is orders of magnitude greater, and infinitely more variegated, than this one picture, or this one victim’s story, can possibly convey. Over the last four and a half years, an estimated 240,000 people have died in the grinding violence, including nearly 12,000 children. More than half of Syria’s pre-war population—half, the proportional equivalent of nearly 170 million Americans—have been forced to flee their homes, spawning the largest exodus of refugees in a generation. Seven hundred and fifty thousand Syrian children won’t be going back to school this fall.
Encouraging a focus on white identity is a dangerous approach for a country in which white supremacy has been a toxic force.
Donald Trump and the disaffected white people who make up his base of support have got me thinking about race in America. “Trump presents a choice for the Republican Party about which path to follow—” Ben Domenech writes in an insightful piece at The Federalist, “a path toward a coalition that is broad, classically liberal, and consistent with the party’s history, or a path toward a coalition that is reduced to the narrow interests of identity politics for white people.”
When I was growing up in Republican Orange County during the Reagan and Bush Administrations, lots of white parents sat their kids in front of The Cosby Show, explained that black people are just like white people, and inveighed against judging anyone by the color of their skin rather than the content of their character. The approach didn’t convey the full reality of race as minorities experience it. But it represented a significant generational improvement in race relations.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
After a lackluster summer, the famous neurosurgeon is finally surging—but his reliance on the conservative grassroots might be a burden as much as a boon.
The Ben Carson surge that everyone was waiting for is finally here.
The conservative neurosurgeon has been a source of fascination for both the Republican grassroots and the media ever since he critiqued President Obama, who was seated only a few feet away, at the National Prayer Breakfast in 2013. He’s been a steady, if middling, presence in GOP primary polls for most of the year—always earning at least 5 percent, but rarely more than 10. Yet over the last two weeks, Carson has secured a second-place spot after Donald Trump, both nationally and in the crucial opening battleground of Iowa, where he is a favorite of the state’s sizable evangelical community. A Monmouth University poll released this week even showed him tied with Trump for the lead in Iowa, at 23 percent.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure, but rather than cultivate the fine arts of indolence, he declared retirement “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
By reorienting the GOP’s foreign-policy debate away from the Middle East, the flamboyant frontrunner took the pact off the front page.
Next week, Donald Trump will join Ted Cruz, Glenn Beck, and others at a rally denouncing the Iran deal. Which is ironic, because Trump is one reason the deal will pass.
Before Trump entered the campaign, foreign policy dominated the Republican presidential race. With Democrats less vulnerable on the economy, and the public growing more progressive on cultural issues like gay marriage, drugs, and crime, the GOP candidates refocused on America’s supposedly collapsing position in the world. As The New York Times reported in February, “Gruesome killings by the Islamic State, terrorist attacks in Europe and tensions with President Vladimir V. Putin of Russia are reshaping the early Republican presidential race, creating anxiety among party voters and sending potential candidates scrambling to outmuscle one another on foreign policy.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.